FreeVR: Virtual Reality Integration Library
User Guide






All materials
Copyright © 2015
William R. Sherman

FreeVR: Application Development (i.e. Programmer's) Guide

March 20, 2015, for FreeVR Version 0.6f
Written by Bill Sherman


This guide assists in the development of applications designed to be compiled with the FreeVR library. It explains the basic organization of a VR application written with FreeVR. A FreeVR Function Reference is also available, listing all the functions one might use in programming a FreeVR application.

One major caveat: this guide is not yet complete. It will certainly help to get one started down the path, and combined with the function reference and the tutorials, there should be sufficient documentation to write interesting virtual reality applications.

This document does not explain how to compile the library itself, administer the library, or use applications developed with FreeVR. For these subjects, see the companion guides, such as the FreeVR Administrator Guide.

What the FreeVR library does for the application programmer

The primary purpose of the FreeVR library is to handle all of the hardware interfacing, allowing any virtual reality application to run on any hardware configuration. Other benefits provided by the library are the automatic use of multi-processing so that applications run as effectively as possible, and the ability to map actual hardware devices to logical devices, so the user can interact with the application in the most convenient way the available hardware allows.

A FreeVR application is configured at run-time, so the user (or VR lab administrator) can choose how to run a particular application by modifying a configuration file, without changing or recompiling the application itself. Configuration information is discussed in the Administrator Guide.

What a FreeVR application does

The FreeVR library handles all of the interfacing with VR hardware input and output. This allows the VR application programmer to focus on how to render the virtual world, and to affect the simulation based on user inputs.

What the application does is divided into world rendering and world simulation. These two sides are handled independently, so the world can be rendered as rapidly as possible even when the world-simulation calculations become slow. In FreeVR, the rendering is assigned as a callback function that the FreeVR system calls for each viewpoint. The world simulation is handled in the main flow of the application, typically in a loop that scans for termination conditions. Other processes, such as input handling, are managed exclusively by the FreeVR system, with functions that can be called to query the state of the VR system.

Types of Input

FreeVR currently handles five different types of input:

  • Binary switches,
  • Valuators,
  • 6-DOF sensors,
  • N-ary switches, and
  • N-DOF sensors.
For most operations the first three of these types are used.

Binary switches are simply buttons and switches that can be on or off. Examples from a typical desktop include the buttons of a mouse and the keys on a keyboard. Common examples in a VR facility are the buttons on a wand.

Binary inputs are typically used to signal a discrete event, such as grabbing an object or indicating a menu selection. FreeVR binary inputs give an integer value of zero or one (of course).

Valuators are inputs that provide a value from a continuous range of numbers. On a desktop computer system, a valuator might be the percentage of the cursor's vertical height on the screen, a dial that can be twisted, or an axis of an analog joystick. A common VR valuator is a joystick held by the user. Both axes of a desktop or wand joystick are simply treated as two separate valuators.

Valuators are used to indicate the amount of something, such as how fast to travel forward or how quickly to rotate the world. In FreeVR, most valuators are scaled to fill the range [-1.0, 1.0].

Six-DOF sensors ("6-sensors") are inputs used to track an object or part of the participant in a VR display. DOF is short for "degree of freedom." In a VR system, the six degrees of freedom are the X, Y and Z translational movements from the origin (the location), and the X, Y and Z rotational offsets from the origin's defined coordinate system (the orientation). The combined location and orientation can be referred to as the overall position. NOTE: in other circumstances the term "position" is sometimes used to mean just the location or just the orientation; in this documentation, "position" always refers to the combined location and orientation of a sensor.

Position sensing devices are typically hardware devices specifically designed for VR systems. Common devices that the FreeVR library interfaces with are the ART Dtrack systems, InterSense IS-900 units, and Ascension Flock of Birds. There are many other 6-DOF hardware input devices that FreeVR has been programmed to interface with. Such 6-DOF input hardware is often referred to as "position trackers" or sometimes just "trackers". There are also desktop devices that can provide 6-DOF inputs. Two examples are the Magellan SpaceMouse and the Spaceball. These devices give relative values, so interface code in the library can be used to "fly virtual sensors around the simulated world."

Six-DOF sensors are used to indicate the position of an object in the real world: for example, the location of the CAVE wand and the direction it is pointing. FreeVR uses 4x4 homogeneous matrices to store and manipulate 6-sensors. There are functions that can extract useful values from the matrix.

Documenting the usage of an input

FreeVR provides the means for the application programmer to "self-document" the user-interface of their application. In fact, the result is more akin to a quick-reference guide to using the application, but it can be quite useful. This "self-documentation" works by providing a short string that describes each of the inputs used by the application. When the program is running, the user can press the "9/PageUp" key on the numeric keypad to toggle the display of the quick-reference guide.
[Image: a keypad with the "9"/"PageUp" key highlighted]

The five functions for describing particular inputs are:

  • (int)vrInputSet2switchDescription(<input number>, <string description of input>)
  • (int)vrInputSetNswitchDescription(<input number>, <string description of input>)
  • (int)vrInputSetValuatorDescription(<input number>, <string description of input>)
  • (int)vrInputSet6sensorDescription(<input number>, <string description of input>)
  • (int)vrInputSetNsensorDescription(<input number>, <string description of input>)

There are also two functions for adding information about the application that will be included in the quick-reference display (which are not directly related to inputs, but are worth noting here). These two functions allow the application programmer(s) to specify a title/name of their application, and also to give themselves and their collaborators authorial credit for the application.

These two functions are simply:

  • vrSystemSetName(<string with name of application>)
  • vrSystemSetAuthors(<string with authors of application>)
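
As an illustration, these calls might be grouped near the application's start-up code. In the following sketch the application name, author string, input numbers, and description strings are all invented for the example; only the function names come from the FreeVR API described above:

```c
/* Sketch only: all strings and input numbers below are invented examples. */
vrSystemSetName("Wand Demo");                         /* application title    */
vrSystemSetAuthors("A. Programmer & B. Collaborator");/* authorial credit     */

vrInputSet2switchDescription(1, "grab the object nearest the wand");
vrInputSetValuatorDescription(0, "speed of travel in the pointed direction");
vrInputSet6sensorDescription(1, "location and pointing direction of the wand");
```

When the user toggles the quick-reference display with the "9"/"PageUp" keypad key, these are the strings that will be shown.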

Requesting the value of an input

There are many options to choose from when retrieving an input value. One can request the current value, or the difference in the value since the last request. One can specify the input numerically (i.e. "button-1", "button-2", etc.) or directly (with a pointer to the input structure). Six-sensors also allow one to choose which coordinate system the values should be reported in.

The three most common functions for obtaining input values are:

  • (int)vrGet2switchValue(<2switch number>)
  • (double)vrGetValuatorValue(<valuator number>)
  • (vrMatrix *)vrMatrixGet6sensorValues(<matrix pointer>, <6sensor number>)
There are also functions to get specific information about a 6-sensor device, such as its location (a vrPoint), a vector pointing from it in one of the six cardinal directions (a vrVector), or an Euler-angle representation of the 6-sensor position. The Euler option is strongly discouraged, however, and is intended only for people porting code from a library that used Euler angles and who don't want to spend too much time eliminating them.

These functions can return results in one of two coordinate systems (see the next section): the real world's or the virtual world's. When requesting values relative to the virtual world, it is necessary to specify a user number, because if there are multiple users, they are each allowed to have their own virtual world coordinate system. If the application is designed for only one user, then it is reasonable to use 0 (zero) as the user number.

  • (vrPoint *)vrPointGetRWFrom6sensor(<point pointer>, <6sensor number>)
  • (vrPoint *)vrPointGetVWFromUser6sensor(<point pointer>, <user num>, <6sensor number>)

  • (vrVector *)vrVectorGetRWFrom6sensorDir(<vector *>, <6sensor number>, <direction>)
  • (vrVector *)vrVectorGetVWFromUser6sensorDir(<vector *>, <user num>, <6sensor num>, <direction>)

  • (vrEuler *)vrEulerGetRWFrom6sensor(<euler pointer>, <6sensor number>)
  • (vrEuler *)vrEulerGetVWFromUser6sensor(<euler pointer>, <user number>, <6sensor number>)
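
A sketch of how these query functions might appear in a simulation loop follows. The loop structure, the "done" flag, and the grab routine are invented for illustration; the input numbers are arbitrary, and user 0 is assumed for the virtual-world query:

```c
vrMatrix wand_mat;
vrPoint  wand_loc;
double   speed;
int      done = 0;

while (!done) {
	if (vrGet2switchValue(1))           /* button-1 signals a grab     */
		grab_nearest_object();      /* hypothetical app routine    */

	speed = vrGetValuatorValue(0);      /* in the range [-1.0, 1.0]    */

	vrMatrixGet6sensorValues(&wand_mat, 1);       /* full 6-DOF position  */
	vrPointGetVWFromUser6sensor(&wand_loc, 0, 1); /* wand loc in VW of user 0 */

	/* ... world-simulation calculations and termination checks ... */
}
```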

Coordinate Systems

A VR setup will have an arbitrarily assigned origin that is used to specify the offset of the output displays (e.g. screens) and the input position trackers such that they can be referenced in a single coordinate system. Because this origin is assigned a position in the real world, we refer to this as the real-world coordinate system (RW). In FreeVR, when an application begins, the origin of the virtual world will match that of the real world. However, there are functions available to offset the virtual world from the real world, thus allowing the user to move to another place in the virtual world without physically moving.

The standard FreeVR coordinate system is based on the OpenGL standard. As the user looks into the world in some arbitrarily defined primary viewing direction (e.g. toward the front wall of a CAVE), the X dimension is to the right, the Y dimension is up, and the Z dimension points back toward the viewer. The origin is also arbitrary, but the de facto standard in CAVE systems is to place the origin in the center of the floor (i.e. about where the user's feet will be when the application begins).

FreeVR can also be used with graphics libraries other than OpenGL. Some of these libraries use a different coordinate system; Performer is one current example. Performer was designed primarily with the flight-simulation community in mind, and that community treats the ground terrain as the X/Y plane. Thus, Performer's coordinate system is basically a rotation of the OpenGL coordinate system by 90 degrees around the X-axis: the Z dimension is now up, the Y axis points into the front screen, and X remains the same.

The standard FreeVR 6-sensor routines return information in the OpenGL-based coordinate system. When working in Performer, the returned values can be converted to the Performer system with a conversion function.

Virtual World Coordinate System

The virtual-world coordinate system (VW) is the coordinate system relative to which all the objects in the virtual world are defined. FreeVR provides a suite of functions that can offset and reorient the virtual world relative to the real world. Because this offset is a good means of allowing the participant to move through the virtual world, these functions are referred to as the "UserTravel" functions.

The FreeVR input system allows the position of 6-sensors to be reported either in the RW or VW coordinate system. FreeVR provides the option of multiple users each having their own relative VW position, so a user number must be specified when requesting a VW position. Applications can operate on the assumption that all users travel together by using the "VR_ALLUSERS" value when modifying the virtual world coordinate system, and using user-0 when requesting a VW value.

Programming in FreeVR

Several example programs are provided in the Tutorials manual, along with a fairly comprehensive Function Reference Guide.

Compiling a FreeVR application

FreeVR provides a standard header file, freevr.h, that includes all the type definitions and function declarations needed by the FreeVR programmer. An OpenGL application using the FreeVR library requires the OpenGL library (-lGL), the standard and input-extension X11 libraries (-lX11 -lXi), and the standard Unix math library (-lm). So, a typical link line for FreeVR will include:

-lfreevr -lGL -lX11 -lXi -lm
On Linux systems, the location of the X11 libraries must often be specified, and the dynamic-loading routines require the addition of another library (-ldl):
Linux: -L/usr/X11R6/lib -lfreevr -lGL -lX11 -lXi -lm -ldl

Beginning with version 0.5e, FreeVR has been adapted to compile appropriately on both 64bit and 32bit operating systems. When a particular type is specified (since it is possible to compile a 32bit version on 64bit hardware), the name of the library typically has the string "_32" or "_64" appended to the library file name. Thus, on a Linux system (for example), one might now use:

Linux: -L/usr/X11R6/lib -lfreevr_64 -lGL -lX11 -lXi -lm -ldl

Also beginning with version 0.5e, there is a compile-time pthreads option for FreeVR which replaces the forking mode of multi-processing with pthreads. Compiling the FreeVR library with the pthreads option produces a separate (and distinctly named) version of the library — typically ending in "pt". So we might now have:

Linux: -L/usr/X11R6/lib -lfreevr_64pt -lGL -lX11 -lXi -lm -ldl
The biggest difference in using the pthread version versus the traditional forking version is that global variables in the pthread version will be accessible in all of the threads. NOTE: the use of such global variables can encourage poorer programming technique, so use of the pthread-specific library is discouraged except in cases where it is absolutely necessary (i.e. not just the result of programmer laziness).

Example Code

Many examples of FreeVR OpenGL programs can be found in the FreeVR Tutorials package.

Performance analysis (aka "statistics")

FreeVR keeps track of many timing statistics as an application is running. For the typical user, there may be some interest in the overall frame rate of the system (which is available), but for the power-user, and especially for the efficiency-minded application programmer, a detailed statistical display is available.

For the basic user, the frame rate (aka FPS, frames per second) can be displayed for any rendering window by pressing the "*" key on the numeric keypad. The FPS display will appear on whichever FreeVR window currently has the system input focus. The location and color of the display, and whether it initially appears in a window, are all values that can be set within the FreeVR configuration file. Some of these values can also be adjusted while the application is running, through the socket-interface controls.
[Image: a keypad with the "*" key highlighted]

During program development, application programmers can benefit from the more advanced statistics display available through the standard FreeVR library. As with the FPS information, particulars of the statistics can be controlled both by the FreeVR configuration file and by the socket-interface controls. These control options are also described here.

For the full statistics, the "Decimal" / "Delete" key on the numeric keypad toggles the statistics display for the window with keyboard input focus.
[Image: a keypad with the "."/"Del" key highlighted]

Independent statistics are kept for each of the processes of a FreeVR system. These processes typically include one for each of the windows, one for all of the inputs, and one for the world simulation. Thus for each process, the configuration can specify separate controls of how the statistics will be measured and/or displayed. One thing that is not controlled within the process' own configuration is on which window its data will appear. This information is configured on a window by window basis.

By default, each window will be responsible for displaying the rendering statistics associated with its own process. However, there is no direct correlation between process and window for either the input or simulation processes. Therefore, the ability to control this is part of the system configuration process. NOTE: for the default "simulator" system, the input and simulation processes are displayed on the single simulator rendering window.
[Image: a Simulator window with statistics for all three processes displayed]

Each of the three types of process (input, simulation, renderer) has its own set of categories for separating the sub-timings of that process.

For the input process the categories are simply a listing of all the input-devices handled by that process, plus the time spent sleeping between loops.

For the simulation process (aka the "main process"), the categories are:

  • Time spent paused
  • Time spent sleeping
  • Time spent in simulation category-1
  • Time spent in simulation category-2
The differentiation between simulation categories 1 and 2 is specified by the application programmer through the use of the vrSystemSimCategory() function. If no differentiation is specified, then all simulation time is listed as category-1.

Without separate categories for dividing the simulation time, it would be difficult for the application programmer to determine which part of their code is responsible for consuming a disproportionate amount of time. For example, the programmer may want to know how much of the simulation is spent in one particular routine, so they can bracket that part of the code with calls to vrSystemSimCategory() to learn the relative amounts of time spent inside and outside that section. Or, another example might be to bracket all the calls to FreeVR's locking routines to determine how much time is spent blocked within the locks.
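
For instance, bracketing a suspect routine might look like the following sketch, where the routine name is invented and vrSystemSimCategory() is assumed to take the category number (1 or 2) as its argument:

```c
/* Everything before this point accrues to the default category-1. */
vrSystemSimCategory(2);      /* begin attributing time to category-2  */
update_particle_physics();   /* hypothetical suspect routine          */
vrSystemSimCategory(1);      /* resume attributing time to category-1 */
```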

The "paused" time is controlled by the user at runtime: when the "Pause"/"Break" key is pressed, FreeVR will stop the advance of time, and ...

The time spent "sleeping" is the result of a configurable option for each process that allows the VR administrator to set the minimum amount of time spent in each "frame" of the given process. Typically this minimum time is set to 0ms, in which case no time will be spent "sleeping".

For the rendering process, the categories are:

  • Time spent initializing the rendering for this frame
  • Time spent within the application's frame callback
  • Time spent within the application's rendering callback — category-1
  • Time spent within the application's rendering callback — category-2
  • Time spent rendering FreeVR system information (includes the statistics themselves, and the simulator display)
  • Time spent waiting for configuration-specified minimal frame time
  • Time spent synchronizing with the other rendering frames (ie. waiting for the slower rendered frames to catch-up)
  • Time spent in freeze mode (which is especially a barrier around the double-buffering of all input/travel data)
  • Time spent swapping the frame buffer (which generally indicates time waiting for the next vertical refresh)
As with the simulation process, two separate categories are provided to differentiate two (not necessarily contiguous) portions of the rendering code. Again, this provides a mechanism by which the application programmer can isolate segments of their rendering routine to identify portions that may be consuming a large amount of time. In this case, the vrRenderCategory() function is called with the value "1" or "2" to specify the category under which the time spent during the previous operations should be summed.
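
A sketch of such bracketing within a rendering callback follows. The callback and drawing routines are invented, and we assume, per the description above, that each vrRenderCategory() call assigns the time spent since the previous mark to the given category:

```c
void draw_world(void)              /* hypothetical rendering callback    */
{
	draw_static_scenery();     /* hypothetical drawing routine       */
	vrRenderCategory(1);       /* time spent above sums to category-1 */

	draw_animated_objects();   /* hypothetical drawing routine       */
	vrRenderCategory(2);       /* time spent above sums to category-2 */
}
```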

TODO: describe the system configuration options for statistics

	- window object:
		- "statsProcs" -- a list of processes whose stats
			should be shown on this window
			(including a special "self" argument)
		- "showStats" -- boolean to indicate whether stats
			display is on at startup
			[NOTE: currently being set to 1 by default, but really should be 0]
	- process object:
		- "stats" options:
			- "label" (string)
			- "calc" (bool) -- whether to calculate statistics for each frame
						(can be used to sort-of "pause" the stats display)
			- "show" (bool) -- whether to show this particular process' stats
			- "mask" (integer) -- a bitmask of which stats values to show
			- "xloc" (float)
			- "yloc" (float)
			- "width" (float)
			- "top" (float)
			- "interval" (float)
			- "scale" (float)
			- "bg" (4 float-list)

TODO: describe the runtime options for statistics

	- hitting the "DEL"/"." key on the numeric keypad will toggle
		a configured window's stats display

	- telnet controls:
		- "window[] stats {0,1}" -- turn off/on a window's stats display
		- "proc[] stats_calc" -- set the flag of whether to calculate stats
		- "proc[] stats_show" -- set the flag of whether to show these stats
		- "proc[] stats_mask" -- set the mask of which statistics to show
		- "proc[] stats_xloc" -- set the x location of where to put stats
		- "proc[] stats_yloc" -- set the y location of where to put stats
		- "proc[] stats_width" -- set the width of the stats display
		- "proc[] stats_top" -- set the top timeline of the stats display
		- "proc[] stats_interval" -- set the horizontal time intervals of the stats display
		- "proc[] stats_scale" -- set the vertical scale of the stats display
		- "proc[] stats_color" -- set the background color of the stats display
		- "proc[] stats_opac" -- set the background opacity of the stats display

		[the only thing that can't really be done by telnet
		is to set which processes will be shown on a particular
		window]

Interfacing with SGI's Performer Library

This section requires some augmentation, but in the meantime we'll provide a list of functions specific to the Performer interface, and a link to several FreeVR-Performer examples:
  • vrPfPreFrame() — must be called between pfSync() and pfFrame()
  • vrPfPostFrame() — should be called right after pfFrame()
  • vrPfMasterChannel() — provides a handle for setting the scene-graph, etc.
  • vrPfDCSTransform6sensor(<pfDCS *>, <6sensor number>) — sets the DCS to the position of the 6-sensor
  • vrPfDCSTransformUserTravelCB(<pfTraverser *trav>, <NULL>) — a CULL callback for pfDCS nodes to set them to the virtual world coordinate system
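
Putting the first two calls in context, one frame of a typical Performer main loop might be sketched as follows (the loop condition and the simulation routine are invented; pfSync() and pfFrame() are standard Performer calls):

```c
while (!done) {
	update_simulation();  /* hypothetical application routine          */
	pfSync();             /* Performer: synchronize to frame boundary  */
	vrPfPreFrame();       /* FreeVR: between pfSync() and pfFrame()    */
	pfFrame();            /* Performer: trigger the cull/draw stages   */
	vrPfPostFrame();      /* FreeVR: right after pfFrame()             */
}
```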

Example Code

As with OpenGL, many examples of FreeVR Performer programs can be found in the FreeVR Tutorials package.
