The most interesting feature of VRML 2.0 (besides the ability to use animated objects) is that the world can respond to the actions of the viewer. In this chapter you will see how to create animations, and how the world can "sense the viewer" in order to respond to her actions.
Most of the nodes in VRML 2.0 have exposed fields (you are already familiar with them: remember, for example, the Transform node's translation field), and/or input events and/or output events. The "living objects" of VRML are based on a simple system: sensors generate events, and in response to them "interpolator objects" change other objects' exposed fields, resulting in a change in their appearance or position. In the following example a very simple keyframe-animation starts when a "touch event" is registered: if you click on the blue cone, it rotates around the Y and Z axes.
#VRML V2.0 utf8
# if you click on the cone, it will rotate around the Y and Z axes
DEF movingball Transform {
  children [
    Shape {                                # a "default" cone in blue appearance
      appearance Appearance {
        material Material { diffuseColor 0.0 0.0 1.0 }
      }
      geometry Cone {}
    }
    # the sensors are children of movingball: other objects can have different ones
    DEF toucher TouchSensor {}             # to sense the click
    DEF timer TimeSensor {                 # to have time-changed events
      cycleInterval 5.0
    }
    DEF rotator OrientationInterpolator {  # this will rotate the cone
      key [ 0.0, 0.5, 1.0 ]                # 3 key-positions to interpolate between
      keyValue [ 0.0 1.0 0.0 0.0,
                 0.0 1.0 1.0 3.14,
                 0.0 1.0 1.0 6.28 ]
    }
  ]
}
ROUTE toucher.touchTime TO timer.startTime              # toucher starts the timer
ROUTE timer.fraction_changed TO rotator.set_fraction    # timer passes fractions to rotator
ROUTE rotator.value_changed TO movingball.set_rotation  # rotator changes movingball's rotation
The example is very simple, but it illustrates the events-interpolators-objects structure. There are four keywords you have to become familiar with: DEF, USE, ROUTE and TO.
Let's recall the previous example and study it step by step:
First we have to name the object (group) so that we can refer to it in routes:
DEF movingball Transform {
  children [
    Shape {                                # a "default" cone in blue appearance
      appearance Appearance {
        material Material { diffuseColor 0.0 0.0 1.0 }
      }
      geometry Cone {}
    }
Then we have to set up the sensors. They have to be children of "movingball" if there are other objects in the scene with their own sensors; here there aren't any, so it does not really matter where the sensors are, but let there be order:
    # the sensors are children of movingball: other objects can have different ones
    DEF toucher TouchSensor {}             # to sense the click
    DEF timer TimeSensor {                 # to have time-changed events
      cycleInterval 5.0
    }
We need an Interpolator to perform the animation:
    DEF rotator OrientationInterpolator {  # this will rotate the cone
      key [ 0.0, 0.5, 1.0 ]                # 3 key-positions to interpolate between
      keyValue [ 0.0 1.0 0.0 0.0,
                 0.0 1.0 1.0 3.14,
                 0.0 1.0 1.0 6.28 ]
    }
  ]                                        # closes the children list
}                                          # closes the movingball Transform
And finally we need ROUTEs to "transport" the events: we want to start the animation when the user clicks the object. First the toucher (a TouchSensor) sends an event to the timer (a TimeSensor) to start counting (startTime) when the user clicks on the cone:
ROUTE toucher.touchTime TO timer.startTime #toucher starts the timer
When the timer is running, it sends a fraction_changed event each time a fraction of the cycle passes, and the rotator (an OrientationInterpolator) computes a new interpolation (according to the fraction) between the key positions:
ROUTE timer.fraction_changed TO rotator.set_fraction #timer passes fractions to rotator
We need a route to change the object's orientation according to the computed interpolation value:
ROUTE rotator.value_changed TO movingball.set_rotation  # rotator changes movingball's rotation
After the detailed explanation you might want to see it all together again:
#VRML V2.0 utf8
# if you click on the cone, it will rotate around the Y and Z axes
DEF movingball Transform {
  children [
    Shape {                                # a "default" cone in blue appearance
      appearance Appearance {
        material Material { diffuseColor 0.0 0.0 1.0 }
      }
      geometry Cone {}
    }
    # the sensors are children of movingball: other objects can have different ones
    DEF toucher TouchSensor {}             # to sense the click
    DEF timer TimeSensor {                 # to have time-changed events
      cycleInterval 5.0
    }
    DEF rotator OrientationInterpolator {  # this will rotate the cone
      key [ 0.0, 0.5, 1.0 ]                # 3 key-positions to interpolate between
      keyValue [ 0.0 1.0 0.0 0.0,
                 0.0 1.0 1.0 3.14,
                 0.0 1.0 1.0 6.28 ]
    }
  ]
}
ROUTE toucher.touchTime TO timer.startTime              # toucher starts the timer
ROUTE timer.fraction_changed TO rotator.set_fraction    # timer passes fractions to rotator
ROUTE rotator.value_changed TO movingball.set_rotation  # rotator changes movingball's rotation
If you understood everything, you are ready to study the various interpolators and sensors. Let's start with the interpolators to see what kind of behavior you can change; then you can try them with the various sensors.
You can use six types of interpolators:
There is a lot that is common to all of them. The common set of fields and events is:
set_fraction    # eventIn (SFFloat): drives the interpolation
key [...]       # exposedField (MFFloat): the key fractions
keyValue [...]  # exposedField: the values belonging to the keys (type depends on the interpolator)
value_changed   # eventOut: the interpolated value (type depends on the interpolator)
The set_fraction input event sets the current fraction (usually 0.0 to 1.0) along the key path, and the value_changed output event outputs the computed interpolation value (for example, a PositionInterpolator outputs a position) along the path each time the fraction is set.
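For example, assuming a simple PositionInterpolator with just two keys, the value is interpolated linearly between the two key values:

PositionInterpolator {
  key      [ 0.0, 1.0 ]
  keyValue [ 0 0 0,  10 0 0 ]
}
# set_fraction 0.25 -> value_changed sends 2.5 0 0
# set_fraction 0.5  -> value_changed sends 5 0 0
# set_fraction 1.0  -> value_changed sends 10 0 0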
Let's examine the various interpolators and their data types (later, in the sensor examples, you will see some of them working):
ColorInterpolator {                 # keyValue: MFColor, outputs SFColor
  key      [ 0.0, . . . ]
  keyValue [ 1.0 1.0 0.0, . . . ]
}

CoordinateInterpolator {            # keyValue: MFVec3f, outputs MFVec3f
  key      [ 0.0, . . . ]
  keyValue [ 0.0 1.0 0.0, . . . ]
}

NormalInterpolator {                # keyValue: MFVec3f (normalized vectors), outputs MFVec3f
  key      [ 0.0, . . . ]
  keyValue [ 0.0 1.0 0.0, . . . ]
}

OrientationInterpolator {           # keyValue: MFRotation, outputs SFRotation
  key      [ 0.0, . . . ]
  keyValue [ 0.0 1.0 0.0 0.0, . . . ]
}

PositionInterpolator {              # keyValue: MFVec3f, outputs SFVec3f
  key      [ 0.0, . . . ]
  keyValue [ 0.0 0.0 0.0, . . . ]
}

ScalarInterpolator {                # keyValue: MFFloat, outputs SFFloat
  key      [ 0.0, . . . ]
  keyValue [ 4.5, . . . ]
}
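To see how the data types line up in practice, here is a minimal sketch (the names fadingball, fadetimer and fader are made up for this example) that routes the SFFloat output of a ScalarInterpolator into a Material's transparency field:

#VRML V2.0 utf8
DEF fadingball Transform {
  children [
    Shape {
      appearance Appearance {
        material DEF spherematerial Material { diffuseColor 1 0 0 }
      }
      geometry Sphere {}
    }
    DEF fadetimer TimeSensor { cycleInterval 4.0 loop TRUE startTime 0 }
    DEF fader ScalarInterpolator {
      key      [ 0.0, 0.5, 1.0 ]
      keyValue [ 0.0, 1.0, 0.0 ]   # transparency: opaque -> invisible -> opaque
    }
  ]
}
ROUTE fadetimer.fraction_changed TO fader.set_fraction
ROUTE fader.value_changed TO spherematerial.set_transparency

The same pattern works for the other interpolators; only the type of the routed value and the target field change.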
There are several different kinds of sensor nodes: the TimeSensor measures the passing of time; one group of sensors (ProximitySensor and VisibilitySensor) "sense the viewer" (they detect the user's navigation); the others are "pointing-device sensors" (they detect mouse actions: Anchor, CylinderSensor, PlaneSensor, SphereSensor and TouchSensor). Sensors are children nodes in the hierarchy, so they can be parented by grouping nodes.
The TimeSensor (a clock with no geometry or location associated with it) is perhaps the most often used sensor, since you need time-fraction data to create any kind of keyframe-animation, so we will study it first. Next we will discuss the possibility to "sense the viewer" using the ProximitySensor (detects when the user navigates into a specified region of the world) and the VisibilitySensor (detects when a specific part of the world becomes visible to the viewer). Then we will study the pointing-device sensors: the TouchSensor (detects when the user clicks on an object), and the three "drag sensors": CylinderSensor, PlaneSensor, and SphereSensor (the user can rotate and move descendant geometry). The Anchor node is also a "pointing-device sensor", but since we discussed it earlier, we won't pay attention to it here.
8.2.1 The TimeSensor
TimeSensor nodes generate events as time passes: you can use them to drive continuous simulations and animations, or to control periodic activities.
TimeSensor {
  exposedField SFTime  cycleInterval 1
  exposedField SFBool  enabled       TRUE
  exposedField SFBool  loop          FALSE
  exposedField SFTime  startTime     0
  exposedField SFTime  stopTime      0
  eventOut     SFTime  cycleTime
  eventOut     SFFloat fraction_changed
  eventOut     SFBool  isActive
  eventOut     SFTime  time
}
The TimeSensor node contains two discrete eventOuts: isActive and cycleTime. The isActive eventOut sends TRUE when the TimeSensor node begins running, and FALSE when it stops running. The cycleTime eventOut sends a time event at startTime and at the beginning of each new cycle (useful for synchronization with other time-based objects). The remaining eventOuts generate continuous events. The fraction_changed eventOut, an SFFloat in the closed interval [0,1], sends the completed fraction of the current cycle. The time eventOut sends the absolute time for a given simulation tick.
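The cycleTime eventOut can be used for the synchronization mentioned above. A minimal sketch (the timer names are hypothetical): each new cycle of a long, looping timer restarts a short one.

DEF mastertimer TimeSensor { cycleInterval 10.0 loop TRUE startTime 0 }
DEF slavetimer  TimeSensor { cycleInterval 2.0 }
# each new 10-second cycle of mastertimer restarts slavetimer
ROUTE mastertimer.cycleTime TO slavetimer.startTime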
Creating a keyframe-animation requires control over time. You have to specify when to start and stop the animation, and how fast the transition should be: the speed of the sequence. The TimeSensor node is similar to a stopwatch: you control the start time, stop time, and cycle length, and the sensor generates events while it is running. But to create the animation, you have to ROUTE the events to change node fields (as we discussed in the first example).
You can specify start and stop times for the TimeSensor in the startTime and stopTime fields. Looping (whether or not to repeat cycles) can be specified in the loop field. You can also start and stop the timer using the input events of the TimeSensor:
#VRML V2.0 utf8
# TimeSensor example
DEF changingball Transform {
  children [
    Shape {                            # a "default" sphere in blue appearance
      appearance Appearance {
        material DEF ballmaterial Material { diffuseColor 0.0 0.0 1.0 }
      }
      geometry Sphere {}
    }
    # the sensors are children of changingball: other objects can have different ones
    DEF timer TimeSensor {             # to have time-changed events
      cycleInterval 5.0                # the length of one loop
      loop TRUE                        # the sensor will loop
      startTime 0                      # the timer will start immediately
    }
    DEF colrotator ColorInterpolator { # this will change the color
      key [ 0.0, 0.5, 1.0 ]            # 3 key-positions to interpolate between
      keyValue [ 0 0 1,  1 0 0,  0 0 1 ]  # blue -> red -> blue
    }
  ]
}
ROUTE timer.fraction_changed TO colrotator.set_fraction          # timer passes fractions to colrotator
ROUTE colrotator.value_changed TO ballmaterial.set_diffuseColor  # colrotator changes the ball's color
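The startTime and stopTime exposed fields can also be driven through routes. A minimal sketch (the two boxes and the names starter and stopper are assumptions for this example) that starts a looping timer with one click and stops it with another:

Transform {
  translation -2 0 0
  children [
    Shape {
      appearance Appearance { material Material { diffuseColor 0 1 0 } }
      geometry Box {}
    }
    DEF starter TouchSensor {}   # click the green box to start
  ]
}
Transform {
  translation 2 0 0
  children [
    Shape {
      appearance Appearance { material Material { diffuseColor 1 0 0 } }
      geometry Box {}
    }
    DEF stopper TouchSensor {}   # click the red box to stop
  ]
}
DEF clicktimer TimeSensor { cycleInterval 5.0 loop TRUE }
ROUTE starter.touchTime TO clicktimer.startTime  # received as set_startTime
ROUTE stopper.touchTime TO clicktimer.stopTime   # received as set_stopTime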
8.2.2 The ProximitySensor

The ProximitySensor node generates events when the viewer enters, exits, and moves within a region in space (defined by a box specified in the center and size fields).
ProximitySensor {
  exposedField SFVec3f    center  0 0 0
  exposedField SFVec3f    size    0 0 0
  exposedField SFBool     enabled TRUE
  eventOut     SFBool     isActive
  eventOut     SFVec3f    position_changed
  eventOut     SFRotation orientation_changed
  eventOut     SFTime     enterTime
  eventOut     SFTime     exitTime
}
A proximity sensor is enabled or disabled by sending it an enabled event with a value of TRUE or FALSE. A disabled sensor does not send events.
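A minimal sketch of switching a sensor through a route (the names switcher and proxsensor are hypothetical): the proximity sensor is only enabled while the pointer is held down over the box.

Transform {
  children [
    Shape {
      appearance Appearance { material Material { } }
      geometry Box {}
    }
    DEF switcher TouchSensor {}
  ]
}
DEF proxsensor ProximitySensor { size 10 10 10 }
# switcher.isActive sends TRUE on press and FALSE on release,
# enabling and disabling the proximity sensor accordingly
ROUTE switcher.isActive TO proxsensor.set_enabled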
A ProximitySensor node generates isActive TRUE/FALSE events as the viewer enters and exits the rectangular box. The center field defines the center point of the proximity region in object space. The size field specifies a vector which defines the width (x), height (y), and depth (z) of the box bounding the region. ProximitySensor nodes are affected by the hierarchical transformations of their parents.
The enterTime event is generated whenever the isActive TRUE event is generated (user enters the box), and exitTime events are generated whenever an isActive FALSE event is generated (user exits the box).
The position_changed and orientation_changed eventOuts send events whenever the user is inside the proximity region and the position or orientation of the viewer changes with respect to the ProximitySensor node's coordinate system; this includes the moments of entering and exiting. The viewer's movement may result from a variety of circumstances: browser navigation, changes to the ProximitySensor node's coordinate system, or changes to the bound Viewpoint node's position or orientation.
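These two eventOuts make it possible to attach geometry to the viewer. A minimal sketch (the names region and follower are made up): a small marker box follows the viewer's position and orientation while she is inside the region (in a real world you would offset the visible geometry so it does not sit exactly at the viewpoint).

DEF region ProximitySensor { size 100 100 100 }
DEF follower Transform {
  children Shape {
    appearance Appearance { material Material { diffuseColor 1 0 0 } }
    geometry Box { size 0.1 0.1 0.1 }
  }
}
ROUTE region.position_changed TO follower.set_translation
ROUTE region.orientation_changed TO follower.set_rotation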
Each ProximitySensor node behaves independently of all other ProximitySensor nodes. Every enabled ProximitySensor node that is affected by the viewer's movement receives and sends events, possibly resulting in multiple ProximitySensor nodes receiving and sending events simultaneously. A multiply instanced ProximitySensor node will detect enter and exit for all instances of the box and send a set of enter/exit events appropriately.
A ProximitySensor node that surrounds the entire world has an enterTime equal to the time that the world was entered and can be used to start up animations or behaviours as soon as a world is loaded.
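A minimal sketch of this "world sensor" trick (the sizes and names are assumptions): the sensor's box encloses the whole world, so enterTime fires as soon as the world is loaded and starts the timer.

DEF worldsensor ProximitySensor { size 1000 1000 1000 }
DEF worldtimer  TimeSensor { cycleInterval 10.0 loop TRUE }
ROUTE worldsensor.enterTime TO worldtimer.startTime
# route worldtimer.fraction_changed into an interpolator as usual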
In the following example an "elevator" is constructed: to go up, you just have to step onto the elevator platform. A ProximitySensor (upelevator) fires and starts the elevator up automatically.
#VRML V2.0 utf8
Transform {
  translation -1.5 0 0
  children [
    Transform {
      translation 0 0 -2
      children Shape {
        appearance Appearance { material Material { diffuseColor 0 1 0 } }
        geometry Text { string "1st floor" }
      }
    }
    Transform {
      translation 0 4 -2
      children Shape {
        appearance Appearance { material Material { diffuseColor 1 0 0 } }
        geometry Text { string "2nd floor" }
      }
    }
  ]
}
Group {
  children [
    DEF Elevator Transform {
      children [
        DEF upelevatorViewpoint Viewpoint { jump FALSE }
        DEF upelevator ProximitySensor { size 2 5 5 }  # the sensor is travelling with its parent
        Transform {
          translation 0 -1 0
          children Shape {
            appearance Appearance { material Material { } }
            geometry Box { size 2 0.2 5 }              # the elevator platform
          }
        }
      ]
    }
  ]
}
DEF Elevatorup PositionInterpolator {
  key [ 0, 1 ]
  keyValue [ 0 0 0,  0 4 0 ]                           # a floor is 4 meters high
}
DEF timer TimeSensor { cycleInterval 5 }               # 5 seconds travel time
ROUTE upelevator.enterTime TO timer.startTime          # the ProximitySensor starts the keyframe-animation
ROUTE timer.isActive TO upelevatorViewpoint.set_bind   # while the timer is active, the user's viewpoint is bound to the elevator
ROUTE timer.fraction_changed TO Elevatorup.set_fraction
ROUTE Elevatorup.value_changed TO Elevator.set_translation
8.2.3 The VisibilitySensor

The VisibilitySensor detects visibility changes of a rectangular box as the user navigates the world. VisibilitySensor is typically used to detect when the user can see a specific object or region in the scene in order to activate or deactivate some behaviour or animation. The purpose is often to attract the attention of the user or to improve performance.
VisibilitySensor {
  exposedField SFVec3f center  0 0 0
  exposedField SFBool  enabled TRUE
  exposedField SFVec3f size    0 0 0
  eventOut     SFTime  enterTime
  eventOut     SFTime  exitTime
  eventOut     SFBool  isActive
}
The center field defines the location of the box-shaped (but of course invisible) sensor, while the size field specifies the size of the rectangular box. The enabled field enables and disables the VisibilitySensor node. If enabled is TRUE, the VisibilitySensor node detects changes to the visibility status of the specified box and sends events through the isActive eventOut: a TRUE event is output to isActive when any portion of the box impacts the rendered view, and a FALSE event is sent when the box has no effect on the view (e.g. the user turned away from the sensor). The enterTime event is generated whenever the isActive TRUE event is generated, and exitTime events are generated whenever isActive FALSE events are generated.
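This is also the usual way to improve performance: run an animation only while its region is actually on screen. A minimal sketch (the names watchbox and spintimer are hypothetical):

DEF watchbox VisibilitySensor { size 5 5 5 }
DEF spintimer TimeSensor { cycleInterval 3.0 loop TRUE }
ROUTE watchbox.enterTime TO spintimer.startTime  # start when the box becomes visible
ROUTE watchbox.exitTime  TO spintimer.stopTime   # stop when it leaves the view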
In the following example a sound is played until the user turns toward a box (which marks both the sound source and the VisibilitySensor). Note that the sensor is subject to the parent transform.
#VRML V2.0 utf8
Transform {
  translation -10 0 0
  children [
    Sound {
      source DEF fruit1 AudioClip {
        url "../wav/fruit1.wav"
        description "fruit1.wav"
        loop TRUE
        startTime 1
        stopTime 0
      }
      location 0 0 0
      minFront 10  minBack 10
      maxFront 40  maxBack 40
      spatialize TRUE
    }
    DEF leftboxsensor VisibilitySensor {  # since the sensor is a child of the Transform,
      center 0 0 0                        # its center is actually at -10 0 0
      size 1 1 1                          # the size of the sensor is the same as the box
    }
    Shape {
      appearance Appearance { material Material { } }
      geometry Box { }
    }
    DEF Clicked TouchSensor {}
  ]
}
ROUTE leftboxsensor.enterTime TO fruit1.stopTime
ROUTE leftboxsensor.exitTime TO fruit1.startTime
Shape {
  geometry Text {
    string [ "if you turn left", "to look at the cube", "the sound will stop" ]
    fontStyle FontStyle { justify "MIDDLE" }
  }
}
8.2.4 The drag sensors

By using the "drag sensors" you can let the user move objects in the world by dragging them with the mouse. I will write a detailed description soon; for now, see an example illustrating the use of the "drag sensors".
#VRML V2.0 utf8
#------------------ Cylinder Sensor -----------------
DEF CShape Transform {
  translation -5 2.5 25
  children [
    Shape {
      geometry Box { size 2 5 2 }
      appearance Appearance { material Material { diffuseColor .5 .3 .3 } }
    }
    DEF CSensor CylinderSensor { }
  ]
  ROUTE CSensor.rotation_changed TO CShape.rotation
}
#----------------- Plane Sensor ---------------------
DEF PShape Transform {
  children [
    Transform {
      translation 0 2.5 25
      children [
        Shape {
          geometry Box { size 2 5 2 }
          appearance Appearance { material Material { diffuseColor .3 .5 .3 } }
        }
        DEF PSensor PlaneSensor { }
      ]
    }
  ]
  ROUTE PSensor.translation_changed TO PShape.translation
}
#----------------- Sphere Sensor ---------------------
DEF SShape Transform {
  translation 5 2.5 25
  children [
    Shape {
      geometry Box { size 2 5 2 }
      appearance Appearance { material Material { diffuseColor .3 .3 .5 } }
    }
    DEF SSensor SphereSensor { }
  ]
  ROUTE SSensor.rotation_changed TO SShape.rotation
}
Viewpoint {
  position 0 2 50
  orientation 0 0 1 0
  description "First View"
}
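One more thing worth knowing about drag sensors: the PlaneSensor can constrain the drag with its minPosition and maxPosition fields. A minimal sketch (the slider names are made up) of a box that can only be dragged between 0 and 4 meters along the X axis:

#VRML V2.0 utf8
DEF slider Transform {
  children [
    Shape {
      appearance Appearance { material Material { diffuseColor .8 .8 .2 } }
      geometry Box { size 1 1 1 }
    }
    DEF sliderSensor PlaneSensor {
      minPosition 0 0   # clamp the drag to 0..4 on X
      maxPosition 4 0   # and to exactly 0 on Y
    }
  ]
  ROUTE sliderSensor.translation_changed TO slider.translation
}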