You can create more realistic worlds by enhancing the environment with background, atmosphere and acoustic elements.
7.1 Adding Backgrounds
Backgrounds always lie beyond the range of the world's geometry, no matter how big your world gets. Your viewpoint is always at the center of the Background node: you cannot translate towards it, but you can rotate your viewpoint to look up or down at it. There are two types of backgrounds, which you can combine: backdrop colors and background images.
Background {
  eventIn      SFBool   set_bind
  exposedField MFFloat  groundAngle []
  exposedField MFColor  groundColor []
  exposedField MFString backUrl     []
  exposedField MFString bottomUrl   []
  exposedField MFString frontUrl    []
  exposedField MFString leftUrl     []
  exposedField MFString rightUrl    []
  exposedField MFString topUrl      []
  exposedField MFFloat  skyAngle    []
  exposedField MFColor  skyColor    [ 0 0 0 ]
  eventOut     SFBool   isBound
}
See the description of the fields below, except for set_bind and isBound: you can use those events to change the background in response to an event (see the next chapter).
The ground and sky spheres are colored using two fields each. The skyColor field is an array of RGB values representing the colors of the sky at various angles down from the top (the browser should interpolate between the colors); the first color is always at the top, and the last color fills the sphere from the last angle down to the bottom. The skyAngle field lists the angles at which the colors of the skyColor field are placed (the top angle, 0.0, is always assumed to be the first angle, so you should not include it).
The groundColor and groundAngle fields work the same way, but "upside-down": the angles start at the bottom.
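To see how the ground fields work in isolation, a minimal backdrop could color only the lower hemisphere (the angle and color values below are arbitrary):

  # Ground only: dark green at the very bottom, fading to
  # brown at 1.2 radians up from the bottom; the sky stays black
  Background {
    groundAngle [ 1.2 ]
    groundColor [ 0 0.3 0, 0.4 0.3 0.1 ]
  }

Since groundAngle lists one angle, groundColor needs two values: the color at the bottom and the color at the listed angle.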
In the following example you can see a simple cube at the center of a colored backdrop:
#VRML V2.0 utf8
# Colored background
Background {
  skyAngle    [ .05, .1, 2 ]
  skyColor    [ 1 1 0, 1 1 0.5, 0 0 0.5, 0.2 1 1 ]
  groundAngle [ 1.57 ]
  groundColor [ 0.14 0.28 0, 0.09 0.11 0 ]
}
Shape {
  geometry Box {}
}
Background images are always mapped onto an imaginary cube inside the spheres of the backdrop colors, so if you use images with an alpha channel (GIF, PNG), they can overlay the backdrop (the backdrop shows through transparent regions). The images and their mapping are specified by the frontUrl, rightUrl, backUrl, leftUrl, bottomUrl and topUrl fields. The side images (frontUrl, rightUrl, backUrl, leftUrl) are all oriented with the top of the image in the positive y direction; the bottom image (bottomUrl) has its top in the negative z direction, while the top image (topUrl) is positioned with its top in the positive z direction. You don't have to use all six images: you can, for example, use a single image as a backdrop.
#VRML V2.0 utf8
# Background node with images
Background {
  groundAngle 1.5
  groundColor [ 0.05 0.1 0.05, 0.05 0.1 0.05 ]
  skyAngle    [ 1.047, 2.094 ]
  skyColor    [ 0.1 0.1 0.5, 0.2 0.2 0.6, 0.6 0.6 0.7 ]
  backUrl  "../images/bg-back.gif"
  leftUrl  "../images/bg-left.gif"
  frontUrl "../images/bg-front.gif"
  rightUrl "../images/bg-right.gif"
}
Shape {
  geometry Box {}
}
7.2 Adding Fog
Fog is a "special effect" that simulates atmosphere by fading colors towards a specified value over distance; its exact appearance is browser-dependent.
Fog {
  exposedField SFColor  color           1 1 1
  exposedField SFString fogType         "LINEAR"
  exposedField SFFloat  visibilityRange 0
  eventIn      SFBool   set_bind
  eventOut     SFBool   isBound
}
The Fog node provides a way to blend objects with the color specified by the color field, based on each object's distance from the viewer. Distances are calculated in the coordinate space of the Fog node. The visibilityRange field specifies the distance (in the local coordinate system) at which objects are totally obscured by the fog (a value of 0 disables the node). Objects located visibilityRange meters or more from the viewer are drawn in a constant color of color, while those very close to the viewer are blended very little with the fog color. The visibilityRange is affected by the scaling transformations of the Fog node's parents; translations and rotations have no effect on it.
The fogType field controls how much of the fog color is blended with an object as a function of distance. If fogType is "LINEAR" (the default), the amount of blending is a linear function of the distance, resulting in a depth-cueing effect. If fogType is "EXPONENTIAL", an exponential increase in blending is used, resulting in a more natural fog appearance.
Note that by using a "black fog" you can create night scenes. The following example shows a red box in a gray exponential fog:
#VRML V2.0 utf8
# Fog example
Fog {
  color .5 .5 .5
  fogType "EXPONENTIAL"
  visibilityRange 10
}
Shape {
  appearance Appearance {
    material Material { diffuseColor 1 0 0 }
  }
  geometry Box {}
}
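For comparison, the same scene could use a linear black fog for a night-like effect; the visibilityRange below is an arbitrary choice:

  # Black linear fog: colors fade linearly to black,
  # fully obscuring objects beyond 20 m
  Fog {
    color 0 0 0
    fogType "LINEAR"
    visibilityRange 20
  }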
7.3 Level of Detail (LOD)
The LOD (Level Of Detail) node specifies various levels of detail or complexity for a given object, and provides hints that allow browsers to automatically choose the appropriate version of the object based on its distance from the user. This way you can reduce "upfront" download time (simpler models load first) and improve the frame rate by using simpler models for distant objects. Note that the implementation of the LOD node is browser-dependent: everything you specify is treated as a "hint" for the browser.
LOD {
  exposedField MFNode  level  []
  field        SFVec3f center 0 0 0
  field        MFFloat range  []
}
The level field contains a list of nodes that represent the same object or objects at varying levels of detail, ordered from the highest level of detail to the lowest. The range field specifies the ideal distances at which to switch between the levels. The center field is a translation offset in the local coordinate system that specifies the center of the LOD node for distance calculations. You can consider the LOD node a grouping node: only one of its children is displayed at a time. Not surprisingly, you have to specify N+1 children for N ranges, since if you specify
range [ 20, 40 ]

then the following objects are used at the different distances:

  distance            object used
  ------------------  -----------
  < 20 m              1st child
  >= 20 m and < 40 m  2nd child
  >= 40 m             3rd child
#VRML V2.0 utf8
# Controlling detail
LOD {
  range [ 3, 9 ]
  level [
    Shape {   # if you are close, show the sphere with a texture
      appearance Appearance {
        texture PixelTexture {
          image 2 2 3   # image size is 2x2, depth is RGB (3)
            0xFF0000 0x00FF00
            0xFFFFFF 0xFFFF00
        }
      }
      geometry Sphere {}
    }
    Shape {   # if > 3 m, show a plain sphere
      appearance Appearance { material Material {} }
      geometry Sphere {}
    }
    Shape {   # if > 9 m, show a box instead
      appearance Appearance {}
      geometry Box {}
    }
  ]
}
7.4 Navigation Information
The NavigationInfo node contains information describing the physical characteristics of the viewer's avatar and the viewing model (e.g. walk, fly, examine). The current NavigationInfo node is considered to be a child of the current Viewpoint node, regardless of where it is initially located in the file; whenever the current Viewpoint node changes, the browser must re-parent the current NavigationInfo node to it. Note that anything you specify in NavigationInfo is only a suggestion for the browser.
NavigationInfo {
  eventIn      SFBool   set_bind
  exposedField MFFloat  avatarSize      [ 0.25, 1.6, 0.75 ]
  exposedField SFBool   headlight       TRUE
  exposedField SFFloat  speed           1.0
  exposedField MFString type            [ "WALK", "ANY" ]
  exposedField SFFloat  visibilityLimit 0.0
  eventOut     SFBool   isBound
}

You can control the size of the user's avatar (important for collision detection), whether the headlight is on or off, the traveling speed, and the navigation type:
NavigationInfo {
  avatarSize [ 0.25, 1.6, 0.75 ]
  headlight TRUE
  speed 1.0
  type "WALK"
}

The NavigationInfo above specifies the avatar's collision distance (0.25 m), the height of the viewpoint above the ground (1.6 m) and the tallest step the avatar can climb (0.75 m); the headlight is on, the speed is 1.0 (normal) and the navigation type is "WALK".
#VRML V2.0 utf8
# NavigationInfo example
NavigationInfo {
  avatarSize [ 0.25, 1.6, 0.75 ]
  headlight FALSE
  speed 2.0
  type "EXAMINE"
}
Background {
  skyAngle    [ .05, .1, 2 ]
  skyColor    [ 1 1 0, 1 1 0.5, 0 0 0.5, 0.2 1 1 ]
  groundAngle [ 1.57 ]
  groundColor [ 0.14 0.28 0, 0.09 0.11 0 ]
}
Shape {
  appearance Appearance {}
  geometry Box {}
}
7.5 Adding Sound
Sound is among the most popular features of VRML: spatialized sound effects are unique to it, so you can find a lot of audio applications on the net that use VRML. With the sound features of VRML you can create the background mood and ambience of your scene, add auditory cues about what is happening, and of course mark the presence of noisy objects. You will find that in some cases an object or event may not be visible to the viewer, so a sound cue is the only notification the user gets; from the authoring side, such problems are the most interesting ones for me personally.
Sound {
  exposedField SFVec3f direction  0 0 1
  exposedField SFFloat intensity  1
  exposedField SFVec3f location   0 0 0
  exposedField SFFloat maxBack    10
  exposedField SFFloat maxFront   10
  exposedField SFFloat minBack    1
  exposedField SFFloat minFront   1
  exposedField SFFloat priority   0
  exposedField SFNode  source     NULL
  field        SFBool  spatialize TRUE
}
The VRML Sound node is an object that creates sound in a VRML scene: it is like a single powered speaker that you might have attached to your computer. Typically the source of the audio is an AudioClip node (you can use a MovieTexture as well), and the most important data is the spatial information, specified in the Sound node: where the sound is, what "shape" it takes, how far it reaches, and what direction it points. In other words, the AudioClip node specifies a sound signal, and the Sound node gives it a place.
The sound is located at a point in the local coordinate system and emits sound in an elliptical pattern defined by two ellipsoids: the clip plays at full volume (scaled by the intensity field, 0.0 to 1.0) while the user is inside the inner ellipsoid, defined by minBack and minFront; the volume drops to zero between the inner and the outer ellipsoid, defined by maxBack and maxFront; and no sound can be heard outside the outer ellipsoid. The orientation of the ellipsoids is specified by the direction field. The priority field provides a hint for the browser to choose which sounds to play when there are more active Sound nodes than can be played at once, due to limited system resources or system load. The spatialize field determines whether the sound is perceived as being directionally located relative to the viewer: if spatialize is TRUE and the viewer is located between the transformed inner and outer ellipsoids, the viewer's direction and the relative location of the Sound node should be taken into account during playback.
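As a sketch of how the ellipsoid fields interact (all values and the file name below are made up), a sound pointing down the +Z axis could reach much farther in front of its location than behind it:

  # Hypothetical directional sound: full volume within 10 m in
  # front of the source, audible up to 50 m in front but only
  # 5 m behind it
  Sound {
    direction 0 0 1
    location 0 0 0
    minFront 10
    minBack 1
    maxFront 50
    maxBack 5
    source AudioClip { url "engine.wav" loop TRUE }
  }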
The AudioClip node may contain the following fields (their default values are shown):
AudioClip {
  exposedField SFString description ""
  exposedField SFBool   loop        FALSE
  exposedField SFFloat  pitch       1.0
  exposedField SFTime   startTime   0
  exposedField SFTime   stopTime    0
  exposedField MFString url         []
  eventOut     SFTime   duration_changed
  eventOut     SFBool   isActive
}
An AudioClip node specifies audio data that can be referenced by other nodes that require an audio source.
The description field specifies a textual description of the audio source. A browser is not required to display the description field, but may choose to do so in addition to playing the sound. The url field specifies the URL from which the sound is loaded. Browsers shall support at least uncompressed PCM wave files (Windows WAV), so I recommend always using this sound format, although several other standard formats are in common use on the Internet.
AudioClip is a time-dependent node: you can specify whether the sound is looped, and when the clip should start and stop playing (in seconds). (For the complete description refer to the VRML/ISO Draft for International Standard (DIS) Node Reference.)
The pitch field specifies a multiplier for the rate at which sampled sound is played. Only positive values are valid for pitch; a value of zero or less produces undefined results. Changing the pitch field affects both the pitch and the playback speed of a sound.
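For example, a clip could be played at double speed, and therefore one octave higher, by doubling the pitch (the file name here is only a placeholder):

  AudioClip {
    url "voice.wav"   # hypothetical sound file
    pitch 2.0         # doubles playback speed, raising the pitch one octave
  }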
In the following simple example you can't hear any sound when you enter the world: you have to approach the cylinder at the center of the world to hear the effect of the spatialized sound:
#VRML V2.0 utf8
# Sound example
# Sound volume should be zero where you enter the scene.
# Sound volume should increase as you approach the cylinder.
Sound {
  source AudioClip {
    url "../wav/fruit1.wav"
    description "Background music"
    loop TRUE
    startTime 1
    stopTime 0
  }
  location 0 0 0
  minFront 10
  minBack 10
  maxFront 40
  maxBack 40
  spatialize TRUE
}
Viewpoint {
  position 0 0 50
  orientation 0 0 1 0
  description "front - 50 m from the sound source"
}
Shape {
  appearance Appearance { material Material {} }
  geometry Cylinder {
    height 2.0
    radius 1.5
  }
}
Transform {
  translation 0 -5 0
  children [
    Shape {
      appearance Appearance {
        texture ImageTexture { url "../images/floor.gif" }
      }
      geometry Box { size 100 0.1 100 }
    }
  ]
}