FreeVR Library Programming Tutorial (w/ OpenGL)
The FreeVR Library will be used to exemplify many VR interface and
programming techniques.
FreeVR will work in most local VR facilities (e.g. CAVE™ or a head-based display),
as well as on the available desktop machines in a simulated-VR mode.
It can be used on PCs with Linux and OpenGL installed, or
in a (currently very) limited way on PCs with Cygwin installed.
The FreeVR webpage is www.freevr.org.
This tutorial takes the approach of starting with an extremely simple initial
example and then advancing in very small increments (baby steps) until we can
perform some interesting tasks in virtual reality.
Another point to make about this tutorial is that it is not intended as a
means to learn advanced OpenGL programming.
The computer graphics in these examples are all very simple.
The goal of the tutorial is to highlight programming tasks that are unique
to programming for a virtual reality system.
On the other hand, some basic OpenGL techniques can be learned by following
this tutorial, as it will show how to handle texture maps, billboards,
cutting planes, and moveable lights.
Other tutorials are under development for interfacing FreeVR to additional
rendering and world simulation systems.
Presently there is a tutorial available for the SGI Performer scene-graph
library, but as Performer has greatly decreased in usage, new tutorials
will cover similar, but more popular, libraries.
This tutorial has been most recently tested with FreeVR version 0.6a.
Part 1: Examples 0 - 5
In Part 1 of this tutorial we will cover five example programs (with some
variants of those).
These five examples will bring us to the point where we can retrieve
some very basic input information from the user of the VR system
and make basic alterations of the virtual world.
We will also discuss:
- The basics of a VR system,
- The basics of a VR integration library,
- OpenGL lighting issues,
- Simulation Time,
- Basics of using the simulator view,
- FreeVR naming conventions,
- FreeVR complex-types, and
- The FreeVR input mechanism.
Components of a VR system
The FreeVR integration library — a crucial component of a VR system
- The FreeVR Library began development in 1998 to provide an open-source
alternative to other VR libraries that were then in widespread use, but
were not open systems.
- FreeVR was designed from the outset to handle both projection-style VR
systems and head-based VR systems.
- FreeVR has been designed to work on a variety of different operating
systems (though at present most are Unix-based).
- FreeVR works with basic OpenGL, OpenSceneGraph and Performer rendering
libraries.
- FreeVR has an API designed for simplified porting from specific other
non open-source VR integration libraries.
- FreeVR has an extensive tutorial demonstrating many common programming
interface features available to the VR programmer.
Features of a typical VR-integration library
- It handles the I/O necessary for creating a VR experience.
- User tracking
- User input controls
- Visual rendering process
- It handles the graphics windowing interface by:
- doing the window placement (and deletion)
- calculating the proper projection matrix for each window
- calling the render routine each frame for each viewpoint
- It is configurable.
- Can be configured to run almost any type of VR display.
- Can be interfaced to a wide variety of tracker and other input systems.
- Can be adjusted for any placement of screens and trackers.
- Many other features can be controlled.
- How to render the scene is left to the application developer.
- It has a built in simulator display allowing any VR application
to be run on any compatible workstation desktop screen.
This is generally used for development and testing, but also
allows for additional people to join in a multiperson VR
experience.
(NOTE: many libraries work under most types of UN*X
operating systems, and occasionally non-UN*X OSes.)
- It allows VR users and developers to share applications and code.
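Of the features above, the per-window projection-matrix calculation is the one that differs most from ordinary desktop graphics. A minimal sketch of the underlying math (a hypothetical helper, not FreeVR's actual code): for a flat screen of known extent and a tracked eye position, the glFrustum-style bounds follow from similar triangles.

```c
#include <assert.h>
#include <math.h>

/* Hypothetical helper (not part of FreeVR): compute off-axis glFrustum
 * bounds for a vertical screen lying in the plane z = screen_z, given a
 * tracked eye position.  The screen spans [sl,sr] in X and [sb,st] in Y. */
typedef struct { double left, right, bottom, top; } FrustumBounds;

FrustumBounds offaxis_frustum(double sl, double sr, double sb, double st,
                              double screen_z,
                              double ex, double ey, double ez,
                              double near_dist)
{
    FrustumBounds f;
    double dist  = ez - screen_z;      /* eye-to-screen distance */
    double scale = near_dist / dist;   /* similar-triangle scaling */
    f.left   = (sl - ex) * scale;
    f.right  = (sr - ex) * scale;
    f.bottom = (sb - ey) * scale;
    f.top    = (st - ey) * scale;
    return f;
}
```

With the eye centered, the frustum is symmetric; as the eye moves, the frustum skews in the opposite direction. Recomputing this skew every frame, for every screen and every eye, is exactly the bookkeeping a VR integration library automates.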
Example 0: A simple virtual world — in Xwindows (ex0)
- The common method of rendering real-time visual scenes of a
virtual world is to represent the world as a collection of
polygons, specify a view ("camera") position, pass this
information to the rendering hardware, update the world based
on any pending input events, and do it again.
- A typical (e.g. SGI or Linux) desktop graphical application uses OpenGL
in combination with some Xwindows API.
- Example 0 (ex0_xwin.c) is such
an application.
- This lecture does not cover how to program with OpenGL,
but we need to have something in our virtual world to look at.
A simple shape rendering API is used by all the examples
(shapes.c).
- This lecture also does not cover the use of Makefiles, but
a Makefile is provided to compile
this and all the example programs.

- Perhaps a fairer comparison for a VR integration library is not a raw
OpenGL/Xwindows program, but one built with the GLUT or SDL desktop
integration libraries.
Basic structure of FreeVR Library application
The FreeVR Library runs many tasks in parallel.
This helps reduce tracker lag, as well as maintain better frame rates.
- Simulation process
- Rendering processes (1 - N)
- Input processes (1 - N)
- Telnet communication process
Function callbacks are used to inform the library how to render, etc.
Shared memory is used by the library to maintain consistency across
processes, and by the application to pass data from the simulation to
the rendering.
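FreeVR provides its own shared-memory allocator (vrShmemAlloc(), seen in a later example). The mechanism itself can be illustrated with a minimal POSIX sketch (this is not FreeVR's implementation): memory mapped MAP_SHARED before a fork() is visible to both the "simulation" and the "rendering" process.

```c
#define _DEFAULT_SOURCE
#include <assert.h>
#include <sys/mman.h>
#include <sys/wait.h>
#include <unistd.h>

/* Illustrative only -- not FreeVR's implementation.  A struct placed in
 * a MAP_SHARED | MAP_ANONYMOUS mapping survives fork(), so parent and
 * child see the same bytes, just as FreeVR's simulation and rendering
 * processes share one world database. */
typedef struct { double obj_y; } WorldState;

double shared_world_demo(void)
{
    WorldState *wd = mmap(NULL, sizeof *wd, PROT_READ | PROT_WRITE,
                          MAP_SHARED | MAP_ANONYMOUS, -1, 0);
    wd->obj_y = 0.0;
    if (fork() == 0) {            /* child: the "simulation" process */
        wd->obj_y = 5.0;          /* update the world state */
        _exit(0);
    }
    wait(NULL);                   /* parent: the "rendering" process */
    double seen = wd->obj_y;      /* reads the child's update */
    munmap(wd, sizeof *wd);
    return seen;
}
```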
Example 1: The bare essentials FreeVR application (ex1)
- The FreeVR Library handles all the viewpoint calculations, so these
can (must) be left out of the FreeVR application.
- The FreeVR Library monitors many types of input, and allows the
application to consume this information.
- The basic functions needed for a FreeVR application are:
- vrStart() — forks the processes, etc.
- vrFunctionSetCallback() — sets the render function
- vrExit() — cleans house
- vrCallbackCreate() — make the render function a callback
- Example 1 (ex1_bare.c) is such
an application using the same simple world as example 0.
- There are some significant
differences between ex0 and ex1.
- This example is the rudimentary way to program a FreeVR application.
- Keystrokes (on the numeric keypad) can be used to alter the
simulated view into the world.

Example 2: A more civilized FreeVR application (ex2)
- Same world as before, but this time a little more civilized programming.
- The additional FreeVR functions in this application are:
- vrConfigure() — pass arguments to override the default configuration
- VRFUNC_DISPLAY_INIT — render process initializations
- vrFrame() — VR system upkeep
- vrGet2switchValue() — read a keyboard input
- Self documentation — a collection of functions that allow users a quick reference to using the application:
- vrSystemSetName() — specify the name of the application
- vrSystemSetAuthors() — specify the name(s) of application's authors
- vrSystemSetExtraInfo() — specify some extra information about the application
- vrSystemSetStatusDescription() — provide a status string about the state of the application
- vrInputSet2switchDescription() — specify the action generated by a particular physical button input
- The application programmer provides three functions to create the
virtual world (in addition to the simple main()):
- init_gfx() — one-time graphics initialization
- draw_world() — render the world once
- update_world() — make any changes to the world
("World Simulation")
- Example 2 also adds some signal handling routines to ensure that an
interrupt signal received during the VR system initialization will not
interfere with that operation — which could result in only a
partial termination of the program, leaving some orphaned processes behind.
- Example 2 (ex2_static.c) makes use of
these more advanced features, while using the same simple world as before.
- There are some minor
differences between ex1 and ex2.
- For the user the only difference is how the application terminates
(pressing the 'Escape' key vs. waiting 10 seconds).
Example 3: A world with some action (ex3)
The two objects of the simple virtual world are now dynamic.
They move over a simple path.
One new (static) object has been added to the world.
The user still cannot interact with the world other than moving to view
it from another perspective.
- World dynamics are done in the Simulation (main) process.
(Because there may be many rendering processes.)
- Data is passed from the world-simulation process to the rendering
processes via shared memory.
- Collecting shared data into a class or structure is wise.
- New function in the main process to initialize the world — init_world().
- The rendering callback is now passed an argument pointing to the data describing the virtual world.
- The additional FreeVR functions in this application are:
- vrShmemAlloc() — allocates some shared memory
- vrCallbackCreate() (revisited) — passing arguments
- vrCurrentSimTime() — how long have we been doing this?
By incorporating the time within the simulation into the circular path
calculation, we ensure that the rate of movement is constant regardless
of the speed of the rendering computer.
It is good programming practice to make use of the passage of real time
when calculating movements in a virtual world to guarantee consistency
across systems.
- Now all our application-programmer supplied operations (init_world(),
update_world(), and draw_world()) are passed the wd
pointer to the world database.
This is how the state of the virtual world is initialized/simulated and then
used for rendering.
- All changes to the state of the virtual world should happen in the simulation
routine (here called update_world()). The display (aka rendering) process
just reads information about the state of the virtual world and converts it to
graphics primitives.
- By their nature, many virtual reality displays render drastically
different perspectives on different screens.
As a result, the OpenGL lighting mechanism can produce mismatched
colorization if the proper rendering order is not followed.
The solution is to set the light location/direction value after the
perspective matrix has been calculated and placed on the rendering stack.
In other words, the location/direction of the light must be set at the
beginning of the rendering ("draw_world()") routine.
- Example 3 (ex3_dynamic.c) is
such an application — with a slightly more complex world.
- There are minor
differences between ex2 and ex3.
Aside: Simulating VR on the Desktop
Frequently, the availability of a virtual reality facility might be limited.
This might be the result of high usage of the facility, or for some
developers, the facility might be located at a distance that precludes
frequent visits.
In any case, there is a general need for developers of VR applications
to be able to do some amount of testing at their desktop, on systems
that do not have the input or output capabilities of a true VR display.
Therefore most VR integration libraries include the ability to simulate
running VR applications with more mundane interfaces.
- Simulator mode enables developers to work at their desktop for much
of the programming effort required to create a new virtual reality
application.
- It is important to note however that the user interface and other
aspects of a good VR application must be tested on actual VR hardware,
so the simulator cannot fully replace a virtual reality system.
- Applications running on a computer not configured as a VR display
will run in "simulator mode."
- Complete instructions on using the simulator mode are in the
FreeVR User's Guide.
- FreeVR Simulator mode translates keystrokes and mouse movements into
VR inputs:
- Controller Inputs
- mouse BUTTONS — controller (aka wand) buttons
- SPACE + mouse — controller (aka wand) joystick
(hold down the spacebar and move the mouse in X & Y)
- Head and Wand Tracking
- ARROW keys — move active 6-dof tracker left, right, forward, back
- SHIFT + ARROW keys — move active 6-dof tracker left, right, up, down
- ALT + ARROW keys — rotate active 6-dof tracker left, right, up, down
- 'w' key — select the wand as the currently active 6-dof tracker
- 's' key — select the head (skull) as the currently active 6-dof tracker
- 'n' key — select the next 6-dof tracker to be the active tracker
- 'r' key — reset the active 6-dof tracker to its initial position
- '*' key — reset all 6-dof trackers to their initial position
- Simulator View Controls
- Keypad ARROW keys ('2','4','6','8') — rotate the simulated view
- Keypad '-' & '+' keys — zoom simulated view in and out
- Keypad '5' key — reset the simulated view
- Keypad 'Enter' key — set a new home view
- Keypad 'Home' key — jump to the home view
- Simulator Rendering Controls
- Keypad '*' key — toggle the FPS display (not limited to simulator views)
- Keypad '/' key — toggle the simulator representations
- Keypad 'PgUp' key — toggle the self help info (not limited to simulator views)
- Keypad 'PgDn' key — toggle input tracking
- Keypad 'Del' key — toggle the timing statistics (not limited to simulator views)
- Keypad 'End' key — toggle world rendering
- Misc
- '?' — print simulator usage information to the tty
Aside: FreeVR naming conventions
To facilitate the writing and interpretation of FreeVR application code,
the FreeVR library adheres to a set of naming conventions.
The conventions are:
- All functions & types begin with "vr";
- All Performer related functions begin with "vrPf";
- All features of the function are contained within the function's name;
- Each word that identifies a feature begins with an upper case letter;
- Acronyms are upper case throughout (e.g. "RW");
- Function groups begin with the group name followed by the operation.
E.g.:
- vrRender...()
- vrUserTravel...()
- vrGet...()
- vrShmem...()
- Functions that return a complex-type value begin with the type name.
[NOTE: the result is also written into the location pointed to by the
first argument of such functions, and the return value is a pointer to that result.]
E.g.:
- vrMatrixGet...()
- vrVectorGet...()
- vrMatrixMult...()
- vrVectorSubtract()
- Function groups with similar operations but taking different arguments
have the argument list as part of the name.
E.g.:
- vrMatrixSetTranslation3d() — takes 3 doubles
- vrMatrixSetTranslationAd() — takes an array of doubles
Aside: FreeVR complex types
FreeVR provides a small set of type definitions for operations on
mathematical types that contain more than a single scalar value.
In most cases, the internal data representation is a single dimensional
array of values.
The types are:
- vrMatrix — internally represented as 16 doubles stored as array field "v"
- vrPoint — internally represented as 3 doubles stored as array field "v"
- vrVector — internally represented as 3 doubles stored as array field "v"
- vrEuler — internally represented as 2 arrays of 3 doubles stored as fields "t" & "r"
The types vrPoint and vrVector have identical internal representations.
Therefore they can be easily converted through a simple type-cast.
However, mathematically points and vectors are different, therefore it is strongly advised
that the appropriate type be used for any given circumstance.
NOTE: one can think of points and vectors as actually being 4-element arrays, with the
fourth element of a point being '1' and the fourth element of a vector being '0'.
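That homogeneous-coordinate view can be demonstrated directly (with an illustrative type, not FreeVR's own): under a translation, a point (fourth element 1) moves, while a vector (fourth element 0) is unchanged.

```c
#include <assert.h>

/* Illustrative type (not FreeVR's): a 4-element homogeneous tuple.
 * Applying a translation scales the offset by the fourth element, so
 * a point (w = 1) is moved while a vector (w = 0) is left alone. */
typedef struct { double v[4]; } Homog4;

Homog4 translate_xyz(Homog4 in, double tx, double ty, double tz)
{
    Homog4 out = in;
    out.v[0] += tx * in.v[3];   /* translation scales with w */
    out.v[1] += ty * in.v[3];
    out.v[2] += tz * in.v[3];
    return out;
}
```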
The use of the vrEuler type is strongly discouraged.
There are two occasions in which the use of vrEuler is acceptable:
- when porting code of an existing VR application that was written with a VR integration library in which the use of Euler angles was encouraged,
- when using the billboard orientation calculation routines provided by FreeVR.
For functions returning values as a complex type, the first argument is always a
pointer to the location into which the result will be copied.
The function then returns a copy of that pointer as the return value.
The reason for doing this is to eliminate the need for the library to
continually allocate and deallocate memory when performing these operations.
By returning a copy of the pointer as the return value, it is possible to
nest operations.
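The convention can be sketched with a hypothetical type and function in the same style (FreeVR's real signatures may differ in detail):

```c
#include <assert.h>

/* Sketch of the result-pointer convention described above, using a
 * hypothetical type and function: the result is written into the
 * location given by the first argument, and that same pointer is
 * returned, so calls can be nested without the library ever
 * allocating or deallocating memory. */
typedef struct { double v[3]; } MyVector;

MyVector *myVectorAdd(MyVector *result, const MyVector *a, const MyVector *b)
{
    for (int i = 0; i < 3; i++)
        result->v[i] = a->v[i] + b->v[i];
    return result;          /* a copy of the first argument */
}
```

Because the result pointer is returned, calls nest naturally; for example, myVectorAdd(&sum, myVectorAdd(&tmp, &a, &b), &c) computes a + b + c with no hidden allocation.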
Aside: FreeVR inputs
Having the ability to render worlds that provide the proper perspective
for a VR user is a good start to creating a virtual reality experience.
But a world in which all the user can do is move their head about to
see from different perspectives can become boring rather quickly.
Therefore, the next two examples (ex4 & ex5) begin to demonstrate
the basic means by which physical inputs from a hand-held controller
can affect the virtual world.
Later examples will introduce more complex means of interaction,
including direct input interactions, virtual input interactions,
as well as a means by which agent input interactions can be
accomplished.
There are three types of FreeVR inputs that are most commonly used:
- 2switch (i.e. a button)
- returns a single integer value
- valuator
- returns a double precision floating point value in [-1.0, 1.0]
- a joystick is implemented as a pair of valuators
- 6sensor (i.e. a 6-degree-of-freedom position)
- returns a vrMatrix value
- is used for returning the positions of 6-dof tracker devices (e.g. the position of the head or a controller)
- there are many functions for extracting particular aspects of a 6sensor rather than the complete matrix
(e.g. the vector pointing forward from the hand controller)
Each type of input can be queried in one of two ways:
- vrGet...Value() returns the current value
- vrGet...Delta() returns the difference in the value between now and the last time a query was made
Using the "Delta" form of the input queries can be especially useful when determining
whether a button was just pressed or just released.
NOTE: In general, input queries should take place only in the simulation routine,
and not in any of the rendering routines.
The effects of the inputs should then be calculated as part of the simulation,
and passed to the rendering routines as part of the state of the world.
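The semantics of the two query styles can be mocked up in a few lines (illustrative only; FreeVR's actual input handling is more involved):

```c
#include <assert.h>

/* Mock of the two query styles (not FreeVR internals).  The "Value"
 * query simply reports the current state; the "Delta" query reports
 * the change since the previous Delta query, so +1 appears exactly
 * once per press and -1 exactly once per release. */
static int button_state = 0;     /* set by a pretend input process */
static int button_last  = 0;     /* state at the previous Delta query */

int get2switchValue(void)
{
    return button_state;
}

int get2switchDelta(void)
{
    int delta = button_state - button_last;
    button_last = button_state;
    return delta;
}
```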
Example 4: Very simple inputs — buttons (ex4)
The use of buttons to affect the virtual world is the most basic form of input
possible.
In this example, buttons 1 and 3 (which frequently correspond to
left and right buttons on a VR controller) move the green pyramid up and down
by one unit.
Button 2 resets the green pyramid back to the center of the working volume.

This type of interaction where the manipulation of a physical input device
causes an immediate response to an object in the virtual world is known
as the "physical control" input method.
Here is a sample of the new code:
update_world(WorldDataType *wd)
{
    ...
    if (vrGet2switchDelta(1) == 1)
        wd->obj3_y += 1.0;
    if (vrGet2switchDelta(2) == 1)
        wd->obj3_y = 5.0;
    if (vrGet2switchDelta(3) == 1)
        wd->obj3_y -= 1.0;
    ...
}
NOTE that by using the "Delta" version of the input query we can easily specify
that the world database should be altered precisely when a particular button has
just been pressed (vs. all the time that it is pressed).
There are minor
differences between ex3 and ex4.
Example 5: Very simple inputs — valuators/joysticks (ex5)
Another typical controller input for virtual reality (as well as other
hardware such as game controllers) is the joystick.
A joystick is really just two valuators connected to allow one to
easily move in both the X and Y directions at the same time.

- Here is a code snippet:
#define JS_EPSILON  0.125
#define MOVE_FACTOR 0.25

update_world(WorldDataType *wd)
{
    double joy_x, joy_y;
    ...
    joy_x = vrGetValuatorValue(0);
    joy_y = vrGetValuatorValue(1);
    if (fabs(joy_x) > JS_EPSILON)
        wd->obj3_x += joy_x * delta_time * MOVE_FACTOR;
    if (fabs(joy_y) > JS_EPSILON)
        wd->obj3_z += (joy_y - copysign(JS_EPSILON, joy_y))
                      * delta_time * MOVE_FACTOR;
    ...
}
NOTES:
It is generally a good idea to have a "dead zone" around the zero value
due to the nature of physical inputs — often there is no precise zero
value so a small positive or negative number will be returned instead.
Also, light presses on sensitive joysticks may cause unwanted movement.
The JS_EPSILON factor represents the size of the dead zone.
This example shows two ways of handling the "dead zone".
In the case of the joy_x value, as soon as the X dimension of
the joystick moves past the epsilon value, the movement will immediately
jump to that value, causing a discontinuity in the movement.
For the joy_y value, the epsilon value is subtracted from
the input (in a sign-neutral way) such that when the input moves slightly
outside the epsilon range, it will respond with slight movements
reflecting the small delta between the actual value and the epsilon.
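The two treatments can be written as pure functions for comparison (hypothetical helper names, assuming the same JS_EPSILON of 0.125):

```c
#include <assert.h>
#include <math.h>

#define JS_EPSILON 0.125

/* The two dead-zone treatments described above, as pure functions
 * (helper names are invented for this sketch).  deadzone_step() is
 * the joy_x style: the output jumps from 0 straight to the raw value
 * once |v| passes the epsilon.  deadzone_smooth() is the joy_y style:
 * the epsilon is subtracted in a sign-neutral way, so the output rises
 * continuously from 0 at the edge of the dead zone. */
double deadzone_step(double v)
{
    return (fabs(v) > JS_EPSILON) ? v : 0.0;
}

double deadzone_smooth(double v)
{
    return (fabs(v) > JS_EPSILON) ? v - copysign(JS_EPSILON, v) : 0.0;
}
```

Just past the edge of the dead zone, deadzone_step(0.126) returns 0.126, a visible jump, while deadzone_smooth(0.126) returns roughly 0.001.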
The previous button inputs (from example 4) are still active, so they
too can control the green pyramid, and in particular button 2
will reset the location of the pyramid — though the code changed
a bit to reset all three location parameters.
Last modified 10 January 2010.
Bill Sherman, shermanw@indiana.edu
© Copyright William R. Sherman, 2010.
All rights reserved.
In particular, republishing any files associated with this tutorial
in whole or in part, in any form (including electronic forms)
is prohibited without written consent of the copyright holder.
Porting to other rendering systems is also prohibited without
written consent of the copyright holder.