FreeVR Library Programming Tutorial (w/ OpenGL)
The FreeVR Library will be used to exemplify many VR interface and
programming techniques.
FreeVR will work in most local VR facilities (e.g. CAVE™ or head-based displays),
as well as on the available desktop machines in a simulated-VR mode.
It can be used on PCs with Linux and OpenGL installed, or
in a (currently very) limited way on PCs with Cygwin installed.
The FreeVR webpage is www.freevr.org.
This tutorial takes the approach of starting with an extremely simple initial
example and then advancing in very small increments (baby steps) until we can
perform some interesting tasks in virtual reality.
Another point to make about this tutorial is that it is not intended as a
means to learn advanced OpenGL programming.
The computer graphics in these examples are all very simple.
The goal of the tutorial is to highlight programming tasks that are unique
to programming for a virtual reality system.
On the other hand, some basic OpenGL techniques can be learned by following
this tutorial, as it will show how to handle texture maps, billboards,
cutting planes, and moveable lights.
Other tutorials are under development for interfacing FreeVR to additional
rendering and world simulation systems.
Presently there is a tutorial available for the SGI Performer scene-graph
library, but as Performer has greatly decreased in usage, new tutorials
will cover similar but more popular libraries.
This tutorial has been most recently tested with FreeVR version 0.6a.
Part 4: Examples 16 - 21
Now that we have had some experience with the beginnings of interesting
interaction, we will continue the tutorial by expanding the use of
OpenGL rendering capabilities, and then look at the world-in-miniature.
Each example links to a copy of the source code
(and the differences from the previous example).
The examples can be compiled with one or more of the following additional
files:
Example 16: 3D text in the world (ex16)
- By using the GLUT library, we can render (flat) 3D text in the world
(a small sketch of the technique appears at the end of this example).
- Now text can be located in the virtual world (not just on a screen's
surface), and it will appear at the correct size and orientation.

- Example 16 (ex16_3dtext.c)
adds GLUT-based 3D text to the virtual world.
- There are minor
differences between ex13 and ex16.
NOTE that we are skipping back to example 13 for the code differences, since
this (and future) examples do not include the virtual widgets.
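As a rough illustration of the technique (not this example's exact code), the
following sketch uses the GLUT stroke font to draw a string as flat 3D geometry.
The position, scale factor, and string are arbitrary placeholder values.

    /* sketch: draw a string as flat 3D stroke text with GLUT */
    #include <GL/glut.h>

    void draw_3d_text(const char *str, float x, float y, float z)
    {
        const char *cp;

        glPushMatrix();
        glTranslatef(x, y, z);
        glScalef(0.005, 0.005, 0.005);  /* stroke glyphs are roughly 100-120 units tall */
        for (cp = str; *cp != '\0'; cp++)
            glutStrokeCharacter(GLUT_STROKE_ROMAN, *cp);
        glPopMatrix();
    }

Because the text is ordinary geometry, it is positioned, scaled, and lit just
like any other object in the virtual world.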
Example 17: Objects with textured surfaces (ex17)
Objects with textured surfaces make for a more interesting-looking virtual world.

Texture images can be rather large.
Therefore it becomes important to ensure that sufficient shared memory
is allocated to contain the texture images.
Standard OpenGL texture mapping techniques are used to generate
objects with textures.
In this example there is only a single texture, so it is declared once,
and then texturing is enabled/disabled as required.
Often textures are read from image files.
For this example a simple checkerboard texture is created algorithmically.
The function to create the texture (make_texture_checker_pattern())
will either create a white/gray pattern or a white/transparent pattern
depending on the third argument.
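The following is a minimal sketch of the idea, not the tutorial's actual
make_texture_checker_pattern(): build an RGBA checkerboard in memory and hand
it to OpenGL. The texture size, square size, and filter settings here are
arbitrary choices.

    /* sketch: algorithmically generate a checkerboard and load it as a texture */
    #define TEX_SIZE 64

    static GLubyte checker_image[TEX_SIZE][TEX_SIZE][4];

    void make_checker_texture(int transparent)
    {
        int  i, j, on;

        for (i = 0; i < TEX_SIZE; i++) {
            for (j = 0; j < TEX_SIZE; j++) {
                on = ((i / 8) + (j / 8)) % 2;              /* 8x8-texel squares */
                checker_image[i][j][0] = on ? 255 : 128;   /* white or gray */
                checker_image[i][j][1] = on ? 255 : 128;
                checker_image[i][j][2] = on ? 255 : 128;
                /* in the transparent variant the gray squares vanish */
                checker_image[i][j][3] = (transparent && !on) ? 0 : 255;
            }
        }

        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, TEX_SIZE, TEX_SIZE, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, checker_image);
    }

At render time the texture is simply switched on around the objects that should
carry it: glEnable(GL_TEXTURE_2D) before drawing them, glDisable(GL_TEXTURE_2D)
afterward.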
Example 17 (ex17_texture.c)
puts textures on particular objects in the world.
There are minor
differences between ex16 and ex17.
Example 18: Objects with billboarded textured surfaces (ex18)
A common technique in computer graphics to simplify rendering
(and thus reduce rendering time) is to use a textured plane that
rotates to continually face the viewer, known as a "billboard".
Common examples include trees and spheres.

In a virtual reality system, the location of the user is
affected by their physical movements.
Thus, additional calculations must be performed to properly
orient billboards toward the user.
FreeVR has a special function to perform these calculations:
vrRenderGetBillboardAngles3d().
Again we use the vrRenderInfo conduit to pass rendering
state information into the function.
The second argument (after the conduit) is a vrEuler.
This is one of the few places where the use of Euler angles
is not discouraged.
The final argument(s) provide the location of the billboarded object.
This information can be provided in one of two ways:
- vrRenderGetBillboardAngles3d() — three double precision values
- vrRenderGetBillboardAnglesAd() — an array of three double precision values
Depending on the symmetry of the billboarded object, there are two
common ways to perform the billboarding rotations:
- cylindrical — symmetry about the Y (up) axis; and
- spherical — symmetry about the X (lateral) and Y (up) axes.
Cylindrical billboarding requires one rotation to properly face the
user, while spherical requires two rotations.
An example of each is shown in the code.
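The snippet below is a hedged sketch of the cylindrical case, written as it
might appear inside the render callback (whose vrRenderInfo pointer is called
rendinfo here). The vrEuler field name and the exact prototype of
vrRenderGetBillboardAngles3d() should be checked against the FreeVR headers,
and draw_textured_quad() is a hypothetical helper.

    /* sketch: cylindrical billboarding of a textured quad */
    vrEuler bb_angles;
    double  bb_x = 0.0, bb_y = 5.0, bb_z = -5.0;    /* billboard location */

    vrRenderGetBillboardAngles3d(rendinfo, &bb_angles, bb_x, bb_y, bb_z);

    glPushMatrix();
    glTranslated(bb_x, bb_y, bb_z);
    glRotated(bb_angles.azim, 0.0, 1.0, 0.0);   /* rotate about Y to face the user;
                                                   the "azim" field name is an assumption */
    /* for spherical billboarding, also rotate about X by the elevation angle */
    draw_textured_quad();                       /* hypothetical helper */
    glPopMatrix();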
Example 18 (ex18_texture.c)
shows a cylindrical and a spherical billboard.
There are minor
differences between ex17 and ex18.
Example 19: The world in miniature (ex19)
The World-in-Miniature (WIM) technique is a common virtual world
interface tool.

By separating out the rendering of just the virtual world (as opposed
to the user interface objects), we can reuse the new virtual-world rendering
function to render the world twice: once at full scale, and again
as the miniature view.
In this example, a representation of the user's location is included
(a sphere).
In order to properly place the sphere representation, the travel
matrix must be removed from the matrix stack.
This is accomplished by multiplying by the inverse of the
travel matrix using the function vrRenderTransformUserTravelInv().
For this example, the method of travel has been altered to walk-through
(rather than fly-through).
This is to better demonstrate the usefulness of the WIM as a means
of navigating through the simple maze.
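A hedged sketch of how the reuse might look is given below, assuming the travel
transform is applied with vrRenderTransformUserTravel() as in the earlier travel
examples of this tutorial; draw_world_only(), draw_user_marker(), and wim_active
are hypothetical names, and the miniature's placement and scale are arbitrary.

    /* sketch: render the world at full scale, then again as a miniature */
    void draw_world(vrRenderInfo *rendinfo)
    {
        /* full-scale world, placed via the usual travel transformation */
        glPushMatrix();
        vrRenderTransformUserTravel(rendinfo);
        draw_world_only(rendinfo);
        glPopMatrix();

        if (wim_active) {
            glPushMatrix();
            glTranslatef(0.0, 4.0, -3.0);   /* float the miniature in front of the user */
            glScalef(0.05, 0.05, 0.05);     /* shrink the entire world */
            draw_world_only(rendinfo);

            /* remove the travel matrix so the marker lands at the user's
               current location within the miniature world */
            vrRenderTransformUserTravelInv(rendinfo);
            draw_user_marker();             /* e.g. a small sphere */
            glPopMatrix();
        }
    }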
Example 19 (ex19_wim.c)
lets the user summon a WIM view.
There are minor
differences between ex18 and ex19.
Example 20: Clipping away part of the world (ex20)
Using standard OpenGL clipping routines, we can provide a means for
the user to see through part of the world.

By clipping away part of the virtual world, we can peer inside and
through objects.
Through judicious use of enabling and disabling the clipping plane
operation, we can control which parts of the world are subject to
clipping.
In this example, the user interface objects cannot be clipped, but
the virtual world can be.
A simple rectangle shape is added to the user interface to represent
the orientation of the clipping plane.
Of course, planes are infinite in scope, so the rectangle is only
a plane-segment.
NOTE that the specification of the OpenGL clipping plane parameters
is done in real-world coordinates.
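A minimal sketch of the enable/disable pattern follows; the plane equation is
an arbitrary choice, and draw_world_only() and draw_interface_objects() are
hypothetical helpers standing in for the two halves of the rendering.

    /* sketch: clip the virtual world but not the user-interface objects */
    GLdouble plane_eqn[4] = {1.0, 0.0, 0.0, 0.0};   /* keep the half-space where x >= 0 */

    /* the plane equation is interpreted in the coordinate system of the
       modelview matrix in effect at this call; in this example that is
       real-world coordinates */
    glClipPlane(GL_CLIP_PLANE0, plane_eqn);

    glEnable(GL_CLIP_PLANE0);
    draw_world_only(rendinfo);          /* subject to clipping */
    glDisable(GL_CLIP_PLANE0);

    draw_interface_objects(rendinfo);   /* never clipped */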
Example 20 (ex20_clipplane.c)
lets the user remove half the virtual world.
There are minor
differences between ex19 and ex20.
Example 21: Using OpenGL lighting for interactive lights (ex21)
In addition to manipulating objects in the world, it is sometimes valuable to
manipulate other features of the world, such as the lighting parameters.
In this case, that gives us the effect of holding a flashlight.
(A sketch of the OpenGL lighting calls involved appears after the list below.)
- Again, we make use of standard OpenGL capabilities to affect the
lighting of the world.
- The standard fixed light (the "scene light") can now be toggled on and off.
- In addition to the scene light, a light can be held in the hand.
The type of hand-held light can be:
- none,
- a positional light,
- a directional light, and
- a directional spot light.
- Unlike all of the previous examples, this one makes use of two additional
button inputs (4 & 5).
Therefore, to allow this example to work on systems with fewer than five
buttons, two of the previous operations were moved to buttons 4 & 5,
allowing the control of the lighting parameters to be performed with
buttons 1 & 2.
- Example 21 (ex21_flashlight.c)
lets the user control the lighting of the world, including a hand-held light.
- There are minor
differences between ex20 and ex21.
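Below is a hedged sketch of how the three hand-held light variants might be set
with standard OpenGL lighting calls; hand_light_type, the LIGHT_* values, and
the hand position/direction variables are hypothetical stand-ins for the wand data.

    /* sketch: positional, directional, and spot variants of a hand-held light */
    GLfloat pos[4]      = {hand_x, hand_y, hand_z, 1.0f};     /* w=1: positional */
    GLfloat dir[4]      = {hand_dx, hand_dy, hand_dz, 0.0f};  /* w=0: directional */
    GLfloat spot_dir[3] = {hand_dx, hand_dy, hand_dz};

    if (hand_light_type == LIGHT_NONE) {
        glDisable(GL_LIGHT1);
    } else {
        glEnable(GL_LIGHT1);
        glLightf(GL_LIGHT1, GL_SPOT_CUTOFF, 180.0f);    /* 180 degrees = not a spotlight */

        switch (hand_light_type) {
        case LIGHT_POSITIONAL:
            glLightfv(GL_LIGHT1, GL_POSITION, pos);
            break;
        case LIGHT_DIRECTIONAL:
            glLightfv(GL_LIGHT1, GL_POSITION, dir);
            break;
        case LIGHT_SPOT:
            glLightfv(GL_LIGHT1, GL_POSITION, pos);
            glLightfv(GL_LIGHT1, GL_SPOT_DIRECTION, spot_dir);
            glLightf(GL_LIGHT1, GL_SPOT_CUTOFF, 30.0f); /* narrow, flashlight-like cone */
            break;
        }
    }

Note that GL_POSITION (and GL_SPOT_DIRECTION) are transformed by the current
modelview matrix, so setting them while the hand transformation is on the
matrix stack keeps the light attached to the wand.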
Tutorial Summary:
This tutorial demonstrates most of the important and unique features of
the FreeVR virtual reality integration library.
Each function is introduced as part of a small progression toward
increasingly more capable demonstration applications.
The functions not presented here tend to be very similar
to ones that are, or are among the many mathematical operations
that do not need to be fully enumerated.
The FreeVR webpage includes a
functions reference document
that fully lists all the functions needed for robust application
development.
Other tutorials are also under development to demonstrate the use of
FreeVR with scene-graph and physics libraries.
These will become available on the
FreeVR tutorials webpage as
they reach sufficiently documented states.
There is also a FreeVR-Performer tutorial available on the webpage,
though that is of limited value since the Performer library is not
as widely used as it has been in the past.
Programming Caveats to Remember:
As a reminder, there are a handful of things to watch out for when
writing virtual reality applications:
- Applications not tested with multi-process rendering may not
work with multi-process rendering.
- For example, applications developed in conjunction with an ImmersaDesk
or other single-screen display system and/or simulator displays
may not work in a CAVE.
- This can be avoided by testing the application in a
multi-process rendering situation.
- A special configuration file will allow for multi-process rendering
even for a simulated VR system on multi-CPU systems.
Last modified 10 January 2010.
Bill Sherman, shermanw@indiana.edu
© Copyright William R. Sherman, 2010.
All rights reserved.
In particular, republishing any files associated with this tutorial
in whole or in part, in any form (including electronic forms)
is prohibited without written consent of the copyright holder.
Porting to other rendering systems is also prohibited without
written consent of the copyright holder.