Program Interactivity
Whichever format you choose, VRML for the web or YG for the CAVE, interactivity will need to be programmed. Interactivity in VRML is written in the Virtual Reality Modeling Language itself, while Ygdrasil (YG), a scripting language, is used to program for the CAVE.

Types of Interactivity

Both VRML and YG allow for the same types of interactivity: the user can navigate through the space and be tracked throughout the scene, and geometry can be swapped with switches, triggered by proximity sensors, and moved. In an urban planning simulation, the user might want to swap out buildings, view options for trees and street furniture, show traffic flow, and scale buildings.
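A building swap of this kind can be sketched in VRML with a Switch node, a TouchSensor, and a small script (a sketch only; the file names buildingA.wrl and buildingB.wrl and all node names are hypothetical):

#VRML V2.0 utf8
Group {
  children [
    # only one building is visible at a time
    DEF BUILDING_SWITCH Switch {
      whichChoice 0
      choice [
        Inline { url "buildingA.wrl" }
        Inline { url "buildingB.wrl" }
      ]
    }
    # senses clicks on whichever sibling building is visible
    DEF SWAP_SENSOR TouchSensor { }
  ]
}
DEF SWAP_SCRIPT Script {
  eventIn SFTime touched
  eventOut SFInt32 choiceOut
  field SFInt32 current 0
  url "vrmlscript:
    function touched(value, time) {
      current = 1 - current;   // toggle between 0 and 1
      choiceOut = current;
    }"
}
ROUTE SWAP_SENSOR.touchTime TO SWAP_SCRIPT.touched
ROUTE SWAP_SCRIPT.choiceOut TO BUILDING_SWITCH.set_whichChoice

Each click on the visible building toggles the Switch between the two models.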

YG For the CAVE

Documentation for YG scripting was prepared by Alex Hill, a PhD student in the Computer Science department at UIC. YG is constantly being updated for ease of use and complex event handling. To create interactivity, a scene graph is constructed, models are imported, scripts are written to control the models and events, then the program is "Run" in a Linux or Unix environment.

A scene graph for YG might resemble the following:

//teleport (reposition) the user
#ifndef _HAZEL_INTERFACE_
node(User0.teleport(1523 610 0)+1)
#endif

#include "hazelInterface.scene"

//set up wand trigger to toggle icons on and off
wandTrigger(when(buttonUp1,opp17mySlider.drop),
    when(button2,opp17iconSwitchA.toggle,opp17iconSwitchB.toggle))

//create a value node to increment the object orientation
value opp17objectRotate(integer,delta(90),
    when(changed,opp17movedObjectB.orientation(0 0 $value)))

transform(position(1523 626 3))
{
    //create a primary model
    switch opp17objectSwitchA(on)
    {
        object (file(32and31.pfb))
        switch opp17iconSwitchA(off)
        {
            pointAtTrigger(when(start,opp17icon1OnSwitchA.off,opp17icon1OffSwitchA.on),
                when(stop,opp17icon1OffSwitchA.off,opp17icon1OnSwitchA.on),
                when(button1,opp17objectSwitchA.off+0.1,opp17objectSwitchB.on+0.1))
            {
                transform (position(20 -30 2))
                {
                    group opp17icon1GroupA()
                    {
                        switch opp17icon1OffSwitchA(on)
                        {
                            object (file(initmodelsselected.pfb))
                        }
                        switch opp17icon1OnSwitchA(off)
                        {
                            object (file(initmodels.pfb))
                        }
                    }
                }
                transform (position(45 30 35),size(10 10 1))
                {
                    reference(node(opp17icon1GroupA))
                }
            }
        }
    }
}

In the above example, I am:

  • teleporting (repositioning) the User,
  • loading the interface scene graph, and
  • creating icons and activating them with the PointAt Trigger. (I created a set of three icons for each parcel: rotation, a switch between models, and a slider that allowed me to move the building.) The icons are similar to "rollovers" on web pages: when an icon is activated via the PointAt Trigger, it lights up. When you press button1 on the CAVE Wand (the CAVE's mouse), the building can be swapped, moved, or rotated, depending on which icon you activated.

Here,

switch opp17objectSwitchA(on)
{
object (file(32and31.pfb))

I am loading the building model 32and31.pfb under switch A; when switch A is on, the model 32and31.pfb is visible.
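The swap itself works because the pointAtTrigger's button1 event turns opp17objectSwitchA off and opp17objectSwitchB on. The secondary building would sit under its own switch, initially off, so only one model is visible at a time. As a sketch (the alternate model's file name is a hypothetical stand-in, since the excerpt does not show it):

//create the alternate model, hidden until button1 swaps the switches
switch opp17objectSwitchB(off)
{
object (file(alternateBuilding.pfb))
}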

VRML For the Web

In VRML, events are routed to one another to control interactivity. The easiest way to program interactivity in VRML is to use Cosmo Worlds. In this program, an Outline Editor enables you to visually route objects to scripts, and a toolbar lets you construct interactivity. Proximity, touch, cylinder, sphere, plane, time, and visibility sensors can be routed to geometry. Additionally, interpolators for points, colors, orientations, and coordinates are built into Cosmo, and a scene graph of the constructed geometry and interactivity can be viewed there.

Below, a Level of Detail (LOD) node has been applied to a building. The LOD switches between different models of the building depending on the user's distance within the world: at 150 feet away, the most detailed model is switched out for a secondary model with less detail and a lower-resolution texture map; at 250 feet away, the secondary model is switched off and no building can be seen.

General Rules for Routing:

Events are of two types: outgoing events send values, and incoming events receive values. Many fields have both an outgoing event (called an eventOut) and an incoming event (called an eventIn) associated with them. The Outline Editor uses a graphical shorthand arrow to show these events.

The left-pointing arrow indicates the eventIn, and the right-pointing arrow indicates the eventOut.
When you connect an eventOut to an eventIn (or vice versa), the connection between them is called a route (also sometimes referred to as a wire).

The Outline Editor does not let you make "illegal" routes.

Here's a list of rules for routing events:

  • Incoming events are always routed to outgoing events, and vice versa. You can't route two incoming events to each other, or two outgoing events to each other.
  • The types of the events at both ends of the route must be identical. For example, you cannot route a single-valued color event (SFColor) to a multi-valued color event (MFColor).
  • You can't route an outgoing event to an incoming event for the same field.
  • You can’t route one SFNode event to another if the two are not allowed to use the same kind of node. For instance, you can’t route a material event to or from a texture event even though they’re both SFNode events, because the material field requires a Material node and the texture field requires a texturing node.
  • Detailed type-checking is not done on SFNode events in prototypes and scripts; so the Outline Editor allows you to create this last sort of illegal route to or from events of a prototype or a script. Check your prototypes and scripts carefully to make sure that their SFNode events send and receive legal nodes.
  • You can't route from the 'children' field of one grouping node to the children of one of its descendants because this would cause a cycle.
The Cosmo Worlds tutorial for the Outline Editor is very helpful for learning these rules.
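To illustrate the first two rules, here is a sketch of legal routes (all node names are hypothetical): a TouchSensor's touchTime eventOut and a TimeSensor's startTime eventIn are both SFTime, so they may be wired together, and each subsequent route likewise pairs matching types.

DEF DOOR Transform {
  children [
    Shape { geometry Box { } }
    # senses clicks on the sibling Box
    DEF DOOR_SENSOR TouchSensor { }
  ]
}
DEF DOOR_TIMER TimeSensor { cycleInterval 2 }
DEF DOOR_MOVE PositionInterpolator {
  key [ 0, 1 ]
  keyValue [ 0 0 0, 0 3 0 ]
}

# SFTime eventOut to SFTime eventIn: types match, so the route is legal
ROUTE DOOR_SENSOR.touchTime TO DOOR_TIMER.startTime
# SFFloat to SFFloat
ROUTE DOOR_TIMER.fraction_changed TO DOOR_MOVE.set_fraction
# SFVec3f to SFVec3f; routing touchTime (SFTime) here would be illegal
ROUTE DOOR_MOVE.value_changed TO DOOR.set_translation

Clicking the box starts the timer, which drives the interpolator, which moves the Transform.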

A portion of a scene graph for VRML might resemble the following:

#VRML V2.0 utf8

#Cosmo Worlds V2.0

LOD {
  center 338.82 4.05152 61.669
  range [ 150, 250 ]
  level [
    Transform {
      children [
        Transform {
          children Shape {
            appearance Appearance {
              material DEF _0 Material {
              }
              texture ImageTexture {
                repeatS TRUE
                repeatT TRUE
                url "images/506_56th_1.png"
              }
              textureTransform NULL
            }
            geometry DEF _1 IndexedFaceSet {
              coord Coordinate {
                point [ -1 1 1,
                        -1 -1 1,
                        1 -1 1,
                        1 1 1 ]
              }
              coordIndex [ 0, 1, 2, 3, -1 ]
              texCoord TextureCoordinate {
                point [ 0 1,
                        0 0,
                        1 0,
                        1 1 ]
              }
              creaseAngle 0.5
              normalIndex [ ]
              texCoordIndex [ 0, 1, 2, 3, -1 ]
            }
          }
        }
      ]
    }
    # the lower-detail model for the second range would follow here
  ]
}
