University Fernando Pessoa
A Survey of Desktop VR Development Environments
IST-2001-37661
pages: 26 (including cover)
first version: 03 January 2003
last revised: 14 January 2003
final version issued: 15 January 2003
A survey of desktop VR development tools 2 UFP/M.I.N.D. Lab
Summary
This document describes a survey of software tools for the development of desktop-
based virtual reality (VR) environments. It starts by describing the dimensions used to
evaluate the tools, followed by a description of each tool’s characteristics on each of
those dimensions. Details of each tool’s availability are also presented.
Table of contents
1. INTRODUCTION
2. DEVELOPMENT ENVIRONMENT CHARACTERISTICS
3. VR DEVELOPMENT ENVIRONMENTS
3.1 WORLD UP
3.1.1. Name
3.1.2. Platforms
3.1.3. Interface for development
3.1.4. Type of VE produced
3.1.5. Graphics formats supported
3.1.6. VR hardware supported
3.1.7. Realism of the resulting VE
3.1.8. Type of tool and cost
3.2 VR JUGGLER
3.2.1. Name
3.2.2. Platforms
3.2.3. Interface for development
3.2.4. Type of VE produced
3.2.5. Graphics formats supported
3.2.6. VR hardware supported
3.2.7. Realism of the resulting VE
3.2.8. Type of tool and cost
3.3 MAVERIK
3.3.1. Name
3.3.2. Platforms
3.3.3. Interface for development
3.3.4. Type of VE produced
3.3.5. Graphics formats supported
3.3.6. VR hardware supported
3.3.7. Realism of the resulting VE
3.3.8. Type of tool and cost
3.4 WORLD TOOLKIT (WTK)
3.4.1. Name
3.4.2. Platforms
3.4.3. Interface for development
3.4.4. Type of VE produced
3.4.5. Graphics formats supported
3.4.6. VR hardware supported
3.4.7. Realism of the resulting VE
3.4.8. Type of tool and cost
1. INTRODUCTION
In the context of spatial presence, this report introduces a set of characteristics for
assessing the adequacy and suitability of several Virtual Reality (VR) development
environments. This section outlines the characteristics which will be used to describe and
evaluate the development tools, enabling an informed choice of a development environment
for creating Virtual Environments (VEs).
As interest in VR has increased in recent years, the number of tools for developing VEs has
also increased. These tools may take the form of programming libraries and toolkits
requiring the use of a common programming language, such as C, C++ or Java, to develop the
environment. Other tools provide fully integrated development environments which cover
every aspect of modelling, coding and running the resulting VE.
Each of the available development environments possesses a unique set of features and
capabilities. Different tools provide different development interfaces, languages and different
levels of abstraction for development. This defines the methods available for creating
graphical objects and for controlling their behaviour, for interacting with the environment, and
for querying motion trackers and other VR input devices available to the user.
Because different tools support different VR hardware, such as input devices, displays, audio
outputs, and haptic devices, it is important to assess each tool’s ability to support the
required input devices.
Different tools also support different graphics formats: some require that the developer
implements graphical objects from scratch, while others enable the developer to import
graphics from popular 2D and 3D vector drawing and animation tools.
Finally, different tools produce VEs with different levels of realism, allowing for more or
less immersive environments for the end user. While the relationship between presence and
immersion is still a topic of debate, the virtual environment should allow different levels of
realism, and their relationship to the sense of presence, to be tested.
To summarise, there are a number of variables to consider when selecting a VR development
environment, and this report intends to present a clear set of dimensions which can be used to
inform and support the choice of a particular development environment. It starts by
suggesting and describing a number of characteristics, or features, which should be taken into
account when selecting a development environment. This is followed by a description of a
number of available tools along the outlined dimensions.
The characteristics outlined in the following paragraphs include the development environment
capabilities in terms of graphics formats and VR hardware support, the realism it supports in
terms of kinematics, obstacle detection and texture detail, the interface and development
language it provides for developers, the end platforms it supports, the possibility of running
the resulting VE in different locations, and its cost.
2. DEVELOPMENT ENVIRONMENT CHARACTERISTICS
The following characteristics will be considered to describe and classify a range of VR
development environments.
Name
This characteristic describes the software development environment name, including its
current version, vendor and a URL to the development tool homepage.
Platforms
This characteristic describes the operating system platforms where the development tool may
be installed and run.
Interface for development
This characteristic describes the kind of interface the tool makes available for the developer. It
includes a description of the kind of interface (graphical, text), the required level of
development (high, low), the ease of use, and the programming languages which must be
learnt to develop a virtual environment.
Type of VE produced
This characteristic describes the requirements which must be met by the end platform to run
the resulting virtual environment (for example, a CAVE or desktop VR). It also classifies the
resulting VE in terms of how easily it can be run, and lists the respective requirements.
Graphics formats supported
This characteristic describes the development environment in terms of its ability to import
scenes developed in common graphics formats such as 3D Studio format (3ds) and CAD
format (dxf).
A survey of desktop VR development tools 8 UFP/M.I.N.D. Lab
VR hardware supported
This characteristic describes the development tool in terms of the VR input and output devices
it supports. It also covers the availability of suitable drivers for different VR hardware.
Realism of the resulting VE
This characteristic describes the development tool in terms of the kinematics, obstacle
detection and texture detail it allows to be included in the resulting VE.
Type of tool and cost
This characteristic describes the type of tool (open source, freeware, shareware, commercial)
and the cost of development licenses.
3. VR DEVELOPMENT ENVIRONMENTS
This section describes the following VR development environments along the dimensions
introduced in section 2:
• World Up, provided by Sense8.
• VR Juggler, provided by Iowa State University.
• Maverik, provided by the University of Manchester.
• WorldToolkit, provided by Sense8.
3.1 World Up
3.1.1. Name
World Up, release 5 (R5), is a tool for building interactive 3D worlds. It is provided by Sense8.
More information can be found at: http://www.sense8.com/products/wup.html.
3.1.2. Platforms
The World Up development environment is available for PCs running MS Windows
98/NT/2000/XP. It requires an OpenGL graphics card on the development system.
3.1.3. Interface for development
World Up is a complete graphical-based software development and delivery environment.
It is built on top of Sense8's WorldToolKit, a widely used commercial 3D/VR software
development toolkit which is analysed below.
World Up provides real-time functionality in an interactive, object-oriented environment.
This means that the concepts of real time and interactivity are included in the development
process as well as in the final VR application. Because it provides a development GUI
(Graphical User Interface), it is possible to immediately see the effects of changing design
parameters, modifying object behaviours or modifying textures. These features may
substantially reduce the development time. In fact, World Up provides a set of high level
objects, which may speed application development since they contain predefined properties
and methods that can be accessed via the development interface, or through a Visual Basic-
like scripting language. The object hierarchy is extensible and provides the World Up
developer with an object-oriented development environment.
3.1.4. Type of VE produced
World Up supports the development of both desktop VR and immersive VR applications,
provided the immersion hardware is available.
In terms of computing platforms for delivering the VR application, World Up supports the
following platforms, allowing for cross platform portability without the need to recompile the
VR application:
• PC: minimum requirements - Pentium 90 MHz, 24 MB RAM, OpenGL-based graphics
accelerator board, Microsoft Windows
• SGI (Silicon Graphics): minimum requirements - Indigo 2, 64MB RAM, Irix 5.2 or
higher.
World Up provides two options for the delivery of the VE:
• For non-commercial use, the stand-alone World Up Player allows developers to
freely distribute their VE, or simulation. There is a stand-alone player for OpenGL and
one for Direct3D. The Player also has plug-in capabilities for the Netscape Navigator
and Microsoft Internet Explorer browsers, for distribution of content over the Internet.
There is also an ActiveX control for embedding in MS Windows applications.
• For commercial distribution of VEs for resale, Sense8 provides the World Up Engine
for a run-time fee.
3.1.5. Graphics formats supported
World Up supports some graphics standards including: Direct3D, VRML, 3DS (3D Studio),
DXF (AutoCAD), as well as file formats from Wavefront, ProEngineer and
MultiGen/ModelGen.
3.1.6. VR hardware supported
World Up supports such viewing devices as StereoGraphics CrystalEyes and CrystalEyes VR,
Virtual i-O i-glasses with head tracker, Victormaxx CyberMaxx II w/head tracker, Virtual
Research VR4 and FS5, and 3DTV stereo glasses.
World Up supports the following tracking devices: Polhemus Fastrak, Polhemus Isotrak and
Isotrak II, Polhemus Insidetrak, Ascension Bird/Flock of Birds, and Logitech (head) tracker.
World Up supports the following audio devices: standard MS Windows Sound Cards (for 2D
sound), and Crystal River Engineering NT Sound Server/Acoustitron.
World Up supports such navigation devices as Logitech Magellan, a standard 2/3 button
mouse, Spacetec Spaceball Model 2003, Logitech 3D Mouse, Thrustmaster Serial Joystick,
and Formula T2 steering wheel.
3.1.7. Realism of the resulting VE
World Up supports object kinematics through the notion of behaviours, and it allows
attaching viewpoints to objects. It also supports automatic obstacle and collision detection and
allows for mapping textures to objects. It supports user interactivity by using “sensors” for a
range of input devices.
World Up integrates two main tools: the development environment, and an integrated
modeller, which renders model objects exactly as they will appear in the final VE. This
integrated modeller puts an emphasis on real-time geometric modelling, including seamless
editing and extrusion of model geometry, presentation of multiple views, controlling
facedness, turning vertex normals on/off, and merging co-planar polygons. The World Up
Modeller can run in two modes: stand-alone, or integrated with the development environment.
3.1.8. Type of tool and cost
World Up is a commercial development environment and costs $1,995. One year of
technical support is available for $120. There is also a floating license option for
$600 which allows a single copy to be used, one user at a time, from any system on a LAN.
3.2 VR Juggler
3.2.1. Name
VR Juggler is an open source virtual platform for VE development. It provides a standard
environment for application development, testing, and execution. VR Juggler's design allows
for such features as dynamic reconfiguration, input abstraction, and monitoring of VE
performance. VR Juggler is a development framework which is the outcome
of a research project carried out at Iowa State University's Virtual Reality Applications Center.
More information may be found at: http://www.vrjuggler.org.
3.2.2. Platforms
Essentially, VR Juggler is a set of development technologies that provide the required tools
for VE construction. VR Juggler enables users to run the VE on a large number of VR
platforms: VR Juggler supports simple desktop systems such as PCs, as well as complex
multi-screen systems running on powerful workstations. Thus, the VEs that use Juggler
technology are flexible enough to be run on such operating systems as Irix, Linux and MS
Windows (Win32), and support many I/O devices through proxies to their device drivers, as
described below.
3.2.3. Interface for development
VR Juggler lets applications leverage existing VR platforms by providing a set of generic
programming tools that are used from programs developed in C++. Together, these
development tools provide a complete abstraction from the physical details of the VR
platform in the form of reusable, cross platform, and modular components. Each component is
physically decoupled from the others, so that the final VE includes only the indispensable
components. The set of Juggler tools includes:
• The VR Juggler virtual platform.
• A device management system which manages local or remote access to I/O devices.
• A standalone generic math template library.
• A portable runtime providing thread and socket primitives.
• An abstraction for including sound.
• An XML-based configuration system.
VR Juggler uses an object-oriented software design based upon standard interfaces to
encapsulate VR components based on their functionality. This allows isolating applications
from the underlying hardware details. For example, a tracking device may use a positional
device interface so as to hide all hardware details. An application just needs to care about the
type of data it sends and receives, given that the library handles all the details of accessing the
actual hardware. In this way, the development interfaces provide a virtual platform which
allows VE designers to concentrate the development effort on the content of the VE rather
than the details of complex programming.
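The device-interface idea described above can be illustrated with a standalone C++ sketch. The class and function names here are illustrative only, not VR Juggler's actual API: application code depends only on an abstract positional interface, so the concrete tracker behind it can change without touching the application.

```cpp
#include <array>

// Hypothetical abstraction: applications see only this interface,
// never the tracker hardware behind it.
struct PositionalInterface {
    virtual ~PositionalInterface() = default;
    virtual std::array<float, 3> position() const = 0;  // x, y, z
};

// One possible backend; a real system would talk to a driver here.
class SimulatedTracker : public PositionalInterface {
public:
    SimulatedTracker(float x, float y, float z) : pos_{x, y, z} {}
    std::array<float, 3> position() const override { return pos_; }
private:
    std::array<float, 3> pos_;
};

// Application code depends only on the abstract interface, so the
// concrete tracker can be swapped without changing this function.
float headHeight(const PositionalInterface& head) {
    return head.position()[1];
}
```

Replacing `SimulatedTracker` with a class that reads a real tracker would leave `headHeight` unchanged, which is the decoupling the library aims for.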
Moreover, VR Juggler's component abstraction supports “dynamic reconfiguration” of
applications at run time. This means that users can add, swap, or modify any
component of the VE while it is running. This may reduce downtime since
a VE does not have to restart just to make changes to a single component.
VR Juggler provides a framework for VE development - it supplies a well-defined interface
that a developer implements to create an application object. The application object is subsequently
used to create the VE with which the user interacts. The interface assists development by
providing a contract that guarantees the time at which each part of the interface is executed
and the state of the system at that time. Therefore, the advantage of the application interface is
that a VE is merely another component of the system. It is possible to add, remove or
exchange any object at run-time. VR Juggler also allows multiple VEs to run concurrently.
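The contract described above, in which the framework guarantees when each part of the interface is executed, can be sketched as follows. The hook names and kernel loop are illustrative, not VR Juggler's real interface:

```cpp
#include <string>
#include <vector>

// Sketch of a framework-defined application object: the framework,
// not the application, decides when each hook runs in the frame.
struct AppObject {
    virtual ~AppObject() = default;
    virtual void init() = 0;       // once, before the first frame
    virtual void preFrame() = 0;   // before drawing each frame
    virtual void draw() = 0;       // rendering for the current frame
    virtual void postFrame() = 0;  // after drawing completes
};

// A trivial kernel loop that honours the contract for N frames.
void runKernel(AppObject& app, int frames) {
    app.init();
    for (int i = 0; i < frames; ++i) {
        app.preFrame();
        app.draw();
        app.postFrame();
    }
}

// Test double that records the order in which the hooks fire.
struct RecordingApp : AppObject {
    std::vector<std::string> calls;
    void init() override { calls.push_back("init"); }
    void preFrame() override { calls.push_back("pre"); }
    void draw() override { calls.push_back("draw"); }
    void postFrame() override { calls.push_back("post"); }
};
```

Because the kernel owns the loop, an application object is just another component it schedules, which is what makes adding, removing, or exchanging objects at run time possible.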
3.2.4. Type of VE produced
VR Juggler scales across multiple platforms, and it also supports various types of VR systems
or VEs. Developers can create applications in a simulator using the resources available at
their desktops, without accessing the actual VR system until the application is to be finally
deployed. The VR Juggler development environment supports many VE types including
desktop VR, and immersive applications for HMDs (Head-Mounted Displays), CAVE-like
devices, and Powerwall-like devices.
3.2.5. Graphics formats supported
In VR Juggler the visual elements comprising the VE model are created through calls to
specific graphics APIs. VR Juggler provides support for OpenGL, OpenGL Performer, Open
Scene Graph and VTK graphics APIs by encapsulating all graphics-API specific functionality
in draw managers. The draw manager’s interface provides the VE with an API-specific
abstraction. Thus, VR Juggler does not import visual elements created by vector drawing and
3D design software packages.
3.2.6. VR hardware supported
At VR Juggler's center there is a portable microkernel, a minimal core that coordinates the
entire system, and that builds on an operating system abstraction layer which hides platform
specific primitives behind common interfaces to reinforce portability.
VR Juggler’s managers are the key components that encapsulate common services and plug
into the microkernel, supplying most of VR Juggler's services. They are flexible because they
can be added and removed at runtime. Examples are the Input Manager which controls input
devices, and the Draw Manager which handles rendering tasks for specific graphics APIs.
From this perspective, VEs are seen as components that the microkernel and the Draw
Manager execute using the standard application interface defined by VR Juggler.
The resulting VE uses device proxies to interface with all devices, allowing the VE to be
decoupled from the VR hardware being used. Thus, the device referred to by a proxy can be
changed at runtime without disturbing the normal behaviour of the VE.
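A minimal sketch of such a proxy, with hypothetical class names, might look like this: the VE holds the proxy, and the device the proxy points at can be swapped at run time without the VE noticing.

```cpp
#include <array>
#include <memory>
#include <utility>

// Illustrative device abstraction (not VR Juggler's real classes).
struct Device {
    virtual ~Device() = default;
    virtual std::array<float, 3> sample() const = 0;
};

struct FixedDevice : Device {
    std::array<float, 3> v;
    explicit FixedDevice(std::array<float, 3> p) : v(p) {}
    std::array<float, 3> sample() const override { return v; }
};

// The VE talks only to the proxy; retarget() swaps the backing
// device at run time without disturbing the caller.
class DeviceProxy {
public:
    void retarget(std::shared_ptr<Device> d) { target_ = std::move(d); }
    std::array<float, 3> sample() const {
        // Neutral fallback when no device is attached.
        return target_ ? target_->sample() : std::array<float, 3>{0, 0, 0};
    }
private:
    std::shared_ptr<Device> target_;
};
```

The neutral fallback also illustrates why reconfiguration can happen mid-run: a proxy with no device simply yields a safe default instead of crashing the VE.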
There is also a device store which allows the system to keep all VR hardware drivers separate
from the main library and to load the drivers dynamically. New VR hardware drivers may be
added to the library without needing to recompile the VE.
The set of tracking classes includes support for the Ascension Technologies Flock of Birds,
Logitech 3D mouse, Fakespace BOOM, Immersion boxes, Virtual Technologies CyberGloves,
and Fakespace PINCH gloves.
3.2.7. Realism of the resulting VE
VR Juggler does not include a modelling language, nor does it have built-in support for
collision detection and physical object behaviour. What it does offer is an abstracted
view of devices and displays that other low-level development environments do not include.
Thus, support for interaction in VR Juggler is at a very low-level - the developer must write
the programming code that looks at the state of the input devices and determines what effect
they must have on the environment. This means that there is neither an event model nor a
built-in way to associate behaviours with graphic elements.
Because VR Juggler only enables developers to create their own graphics models through a
graphic API, it also puts the responsibility of handling user and object interactions upon the
developer. Therefore, in contrast to development environments which use external formats for
graphics objects (such as the DXF or 3DS formats), the realism of the resulting VE, measured for
example as the capability to implement collision detection and to support object physics, must
be manually implemented by the VE developer.
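As an illustration of what falls on the developer, a minimal bounding-sphere collision test, written from scratch rather than supplied by VR Juggler, might look like:

```cpp
// Minimal hand-rolled collision detection (illustrative only):
// two spheres collide when the squared distance between their
// centres does not exceed the squared sum of their radii.
struct Sphere {
    float x, y, z;  // centre
    float r;        // radius
};

bool collides(const Sphere& a, const Sphere& b) {
    const float dx = a.x - b.x;
    const float dy = a.y - b.y;
    const float dz = a.z - b.z;
    const float sum = a.r + b.r;
    return dx * dx + dy * dy + dz * dz <= sum * sum;
}
```

A real VE would need broad-phase culling and per-polygon tests on top of this, all of which the developer must likewise provide.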
3.2.8. Type of tool and cost
VR Juggler is an open source development framework freely distributed under the LGPL
open source license and may be downloaded at http://www.vrjuggler.org.
3.3 Maverik
3.3.1. Name
Maverik is a VR system designed to help programmers create virtual environments (VEs). It
was developed by the Advanced Interfaces Group of the University of Manchester
to tackle some of the problems found when using existing approaches to VE development.
Maverik's main strength is its framework for managing graphics and interaction.
More information may be found at: http://aig.cs.man.ac.uk/maverik/.
3.3.2. Platforms
Maverik runs on GNU/Linux PCs and Silicon Graphics workstations. It requires an OpenGL
graphics card on the development system. Maverik is available as source code and it compiles
under Windows, MacOS and on UNIX systems, provided that the system supports OpenGL,
Mesa (version 3.1), IrisGL or DirectX (version 8). However, OpenGL/Mesa is the best
supported library.
More specifically, Maverik runs on RedHat 5.2 and 6.x, FreeBSD 3.2, SuSE 7.1, Irix 5.3, 6.3
and 6.5, SunOS 5.7, MacOS and Windows (Win32) 98, 2000 and NT.
Maverik provides two main components:
1. The Maverik micro-kernel, which implements a set of core services, and a
framework that applications can use to build complete VEs and VR interfaces.
2. The Maverik supporting modules, which contain default methods for display
management including culling, spatial management, interaction and navigation, and
control of VR input and output devices. These default methods may be customised
to operate directly on application data, enabling optimal representations and
algorithms to be employed.
3.3.3. Interface for development
Maverik is a programming tool and not an end-user application. It enables both the rapid
production of complex VEs and the development of applications using 3D graphics or 3D
peripherals, since it provides many functions for these purposes. Maverik concentrates on
graphical and spatial management.
Maverik is a C toolkit that interfaces with a 3D rendering library such as Mesa or OpenGL,
providing facilities for managing a VE. It can be viewed as an extra layer of functionality
above the raw 3D graphics library. The differentiating aspect of Maverik is that it does not
need to create its own internal data structures for the VE. Instead, it makes direct use of an
application's own data structures through a callback mechanism.
The VE instructs Maverik about the classes of objects it needs to render, and provides the
functions needed to do the rendering. Maverik, in turn, provides the extra
facilities for building flexible spatial management structures (SMSs) with which the objects
are managed. When the user navigates the VE, Maverik tracks the SMSs and makes
appropriate calls on the VE to render its objects.
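The callback mechanism can be sketched in standalone C++ (Maverik itself is a C toolkit; the names here are illustrative, not Maverik's API). The kernel never copies the application's data: it stores a render callback per object class and, during frame traversal, calls back into the application with the application's own object pointer.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Illustrative kernel: objects stay in the application's own data
// structures; the kernel keeps only a class name and a pointer.
class Kernel {
public:
    using RenderFn =
        std::function<void(void* appData, std::vector<std::string>& log)>;

    // The application registers how each class of object is rendered.
    void registerClass(const std::string& cls, RenderFn fn) {
        renderers_[cls] = std::move(fn);
    }
    // The application hands over a pointer to its own object data.
    void addObject(const std::string& cls, void* appData) {
        objects_.push_back({cls, appData});
    }
    // Frame traversal: call back into the application per object.
    void renderFrame(std::vector<std::string>& log) {
        for (auto& o : objects_) renderers_.at(o.cls)(o.data, log);
    }

private:
    struct Obj { std::string cls; void* data; };
    std::map<std::string, RenderFn> renderers_;
    std::vector<Obj> objects_;
};
```

Because the callback receives the application's own pointer, the application can decide per frame how, or whether, to render the object, which is the flexibility the text describes.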
The advantage of this approach is that the application keeps control of its data, so that
there is only one set of data to manage. Moreover, the application can define the meaning
of the data when there is the need to decide how to render the objects. For example, there may
be a set of CAD application's native data structures for the VE model objects. When the VE is
called to render a particular object, it can then decide what would be an appropriate rendering
for this frame - it may, for example, decide to always render the object, because it is an
important piece of the VE. But the VE can change representations dynamically to suit
conditions. Similarly, VEs with special lighting algorithms may use their own structures and
algorithms to maintain real time performance.
In essence, Maverik provides three levels of usage of increasing complexity:
1. Using the default objects and algorithms provided.
2. Defining the developer’s own classes of objects.
3. Extension and modification of the Maverik’s kernel.
Using the default objects and algorithms provided.
Used in this way, Maverik resembles any other VR building package for programmers. The
tutorial documentation for the system leads the developer through the building of a VE with
behaviour, collision detection and customized navigation.
Defining the developer’s own classes of objects.
This is the usage level where developers benefit from application-specific optimisations that
may be specified by providing their own rendering and associated callbacks. The tutorial
documentation provides examples at this level, and the supplied demonstration applications
make extensive use of the techniques included in this level.
Extension and modification of the Maverik kernel.
Rendering and navigation callbacks are one set of facilities that can be customized. However,
other parts of the Maverik core functionality may also be customised: for example, alternate
culling strategies, spatial management structures and input device support.
3.3.4. Type of VE produced
Although it was originally designed as a component of a large-scale distributed multi-user VR
system called Deva, Maverik works equally well in stand-alone mode, and it supports the
construction of desktop-VR applications, as well as VR applications for individual users. So,
on its own, because Maverik is a single-user VE micro-kernel, it provides a framework and
toolkit for a single user to perceive, interact with, and navigate around, a graphically complex
VE. Therefore it can be used successfully for stand-alone single-user VR applications.
It does not include any assistance for running multiple VEs on different machines for sharing
a world between people. Running a system with many users across wide area networks with
multiple VEs, applications, interaction and behaviour is a more complex task. But when
integrated with Deva, Maverik supports multiple virtual worlds and applications, together
with sophisticated methods of specifying behaviours for objects within VEs.
3.3.5. Graphics formats supported
Maverik supports three graphics formats: VRML97 [http://www.vrml.org], Lightwave and
AC3D [http://www.ac3d.org].
Only the geometry of VRML97 files is read; Maverik does not attempt to parse scripts,
URLs, or viewpoints. Furthermore, not all of the ways in which the geometry can be
defined are supported: concave polygons and colour-per-vertex, for example, are not.
AC3D is a geometry modeller which allows creating, editing and importing objects from a
number of common 3D file formats such as 3DS, DXF, Lightwave and VRML1. Other
utilities, such as Crossroads and 3DC, translate between many different 3D file formats
and the AC3D format. Thus, supporting AC3D provides indirect support for many common
3D file formats.
Maverik is also distributed with sample graphics file parsers and libraries of
common geometric primitives such as cones, cylinders, and teapots. These may be used as a
starting point, but more complex models have to be provided by the developer in terms of
objects and algorithms.
3.3.6. VR hardware supported
Maverik is distributed with drivers for stereo displays and 3D peripherals such as
head-mounted displays and 3D mice. However, Maverik has no built-in support for audio or
video, so the developer needs to provide a mechanism for those. A standard compilation of
Maverik provides support for a desktop mouse, keyboard and screen.
The configuration of 3D peripherals used in some VEs tends to be environment specific.
Maverik includes code in the distribution to support Polhemus Fastrak and Isotrak six
degree-of-freedom trackers (optionally coupled to Division 3D mice), the Ascension Flock of
Birds (ERC only), Spacetec SpaceBalls and SpaceOrb 360s, the Magellan Space Mouse,
InterSense InterTrax 30 gyroscopic trackers, 5DT data gloves, and a serial Logitech Marble
Mouse. With modification, other 6-DOF (degree-of-freedom) input devices and tracking
technologies of similar specification can be supported. It also supports IBM's ViaVoice
speech recognition engine, and more specific peripherals such as the Microsoft SideWinder
Force Feedback joystick and a special MIDI server.
3.3.7. Realism of the resulting VE
Maverik supports high-performance rendering, including large-model processing, customised
representations of environments for different types of VEs, and customisable techniques for
interaction and navigation.
Maverik is distributed with an animated avatar, navigation facilities, and 3D math functions,
but the standard Maverik navigation does not perform full collision detection. Maverik
provides the functionality to detect easily whether a collision has occurred, but it does not
dictate what happens as a result. Navigation which performs collision detection to stop the
user's movement must be programmed by the developer.
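The kind of collision-response code a Maverik developer must supply can be illustrated with a minimal C sketch. All names and types here are hypothetical and do not correspond to Maverik's actual API; the world is reduced to a single axis-aligned box obstacle, and a movement step is simply refused if it would end inside the obstacle.

```c
#include <stdbool.h>

/* Hypothetical sketch of collision-stopping navigation, in the spirit of
 * what a Maverik developer must write themselves. Not the Maverik API. */

typedef struct { double x, y, z; } Vec3;
typedef struct { Vec3 min, max; } Box;

/* Returns true if the point lies inside the obstacle box. */
static bool collides(Vec3 p, const Box *obstacle) {
    return p.x >= obstacle->min.x && p.x <= obstacle->max.x &&
           p.y >= obstacle->min.y && p.y <= obstacle->max.y &&
           p.z >= obstacle->min.z && p.z <= obstacle->max.z;
}

/* Apply a movement step only if the new position is collision-free;
 * otherwise stop the user at the current position. */
Vec3 step_user(Vec3 pos, Vec3 delta, const Box *obstacle) {
    Vec3 next = { pos.x + delta.x, pos.y + delta.y, pos.z + delta.z };
    return collides(next, obstacle) ? pos : next;
}
```

A real implementation would query Maverik's collision-detection facilities instead of the box test, but the policy decision (stop, slide, bounce) remains the developer's.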
1.3.8. Type of tool and cost
Maverik is part of the GNU project and is distributed freely with full source under the GNU
GPL license. It is an open source virtual reality (VR) development system publicly available
at http://aig.cs.man.ac.uk/maverik/download.php. The distribution includes documentation, a
tutorial and examples.
1.4 World Toolkit (WTK)
1.4.1. Name
WTK, Release 8, is a standard VR library provided by Sense8. It has a large user community
and can be used to write many types of VEs. WTK is one of the few packages that include
functionality for a large range of VR development needs.
More information can be found at: http://www.sense8.com/products/wtk.html.
1.4.2. Platforms
WTK is a cross-platform environment, available on many platforms, including SGI, Intel,
Sun, HP, DEC, and PowerPC. It requires an OpenGL graphics card on the development
system.
1.4.3. Interface for development
WTK is a VR library written in C, but C++ wrappers are also available. To create a VE, the
developer has to write C/C++ code that uses the WTK API. WTK is capable of managing the
details of reading sensor input, rendering scene geometry, and loading databases. The VE
developer is responsible for manipulating the simulation and changing the WTK scene graph
based on user inputs.
The WTK library is based on object-oriented concepts, but it has no inheritance or dynamic
binding. WTK functions are organised into 20 classes, including: Universe (which manages all
other objects), Geometries, Nodes, Viewpoints, Windows, Lights, Sensors, Paths, and Motion
Links.
The basis for all WTK VEs is the Universe – it contains all objects that appear in the VE. It is
possible to have multiple scene graphs in a VE, but it is only possible to have one Universe in
a VE. When new objects are created, the WTK simulation manager automatically manages
them. The core of a WTK application (VE) is the simulation loop. Once the simulation loop is
started, every part of the simulation occurs in the Universe.
There is a special function called the universe action function: a user-defined function that is
called on each pass through the simulation loop. This function is where the application drives
the VE and changes it accordingly, for example by changing geometry properties,
manipulating objects, detecting collisions, or stopping the simulation.
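The simulation model described above can be sketched schematically in C. The names below are hypothetical stand-ins rather than the actual WTK API; the sketch only shows the structure of a loop that updates sensors, invokes the user-defined action function, and renders, with the action function able to stop the simulation.

```c
/* Schematic sketch of a WTK-style simulation loop with a user-defined
 * "universe action" callback. Hypothetical names, not the WTK API. */

typedef struct {
    int frame;
    int running;
} Universe;

typedef void (*ActionFn)(Universe *);

/* Each iteration: update sensors, call the action function, render.
 * Sensor update and rendering are left as stubs here. */
void universe_go(Universe *u, ActionFn action, int max_frames) {
    u->running = 1;
    while (u->running && u->frame < max_frames) {
        /* 1. the library would update all sensor objects here */
        action(u);      /* 2. user-defined universe action function */
        /* 3. the library would traverse the scene graph and render here */
        u->frame++;
    }
}

/* Example action: stop the simulation after 10 frames. */
void my_action(Universe *u) {
    if (u->frame >= 9)
        u->running = 0;
}
```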
WTK sensor objects return position and orientation data from the real world, and WTK allows
sensors to control the motion of other objects in the simulation. WTK has two main categories
of sensors: relative and absolute. Relative sensors just report changes in position and rotation.
Absolute sensors report values that relate to a specific position and orientation. WTK allows
developers to treat these two categories of sensors identically by using a common interface.
The simulation loop takes care of updating all sensor data and dealing with the data each
sensor returns. The WTK interface allows sensor pointers to be used nearly interchangeably in
a VE: given a sensor pointer, retrieving data from the sensor is identical regardless of its type.
However, when creating a new sensor object, the type of device must be specified in the
function call, so if the developer needs to use a different type of sensor, the application code
has to be changed and re-compiled. This leads to VEs that are not completely portable across
differing VR hardware. The problem can be avoided if the developer writes code that selects
sensors using a configuration file for the application.
In order to extend the object-oriented feel of WTK and also to allow for multi-user
simulations, WTK Release 8 includes a new Object/Property/Event architecture. This
architecture has three fundamental capabilities:
• all objects have properties that are accessed through a common interface,
• all property changes trigger an event that can be handled by an event handler, and
• all properties can be shared across multi-user simulations using World2World.
It is also possible to add user-defined properties to objects. These properties can then be
treated in the same way other properties are treated in the system.
The new Object/Property/Event architecture helps to simplify many common programming
tasks. By using the event-based structure, data updates can be propagated through the system.
For example, if a sensor value is modified, the event handler can automatically modify any
objects that rely upon that sensor value.
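The essence of the Object/Property/Event pattern can be shown in a few lines of C. Every property change goes through one common setter, which fires any registered event handler; all names below are hypothetical, not the actual WTK Release 8 API.

```c
/* Hypothetical sketch of the Object/Property/Event idea: setting a
 * property through a common interface fires an event handler. */

typedef struct Object Object;
typedef void (*EventHandler)(Object *obj, double old_value, double new_value);

struct Object {
    double value;          /* one numeric property, for illustration */
    EventHandler on_change;
};

int change_events = 0;     /* incremented by the example handler below */

void count_changes(Object *obj, double old_value, double new_value) {
    (void)obj; (void)old_value; (void)new_value;
    change_events++;
}

/* Common property setter: every change passes through here, so every
 * change can trigger the registered event handler. */
void object_set(Object *obj, double v) {
    double old = obj->value;
    obj->value = v;
    if (obj->on_change && old != v)
        obj->on_change(obj, old, v);
}
```

The sensor example in the text corresponds to an event handler that, instead of counting, updates the objects depending on the sensor value.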
1.4.4. Type of VE produced
WTK supports the development of both desktop VR and immersive VR applications,
provided the immersion hardware is available. In order to provide for multi-user distributed
environments using WTK, Sense8 provides World2World - a server for WTK that distributes
the property information about each object. Because World2World distributes object
properties, it will only work with VEs that use the Object/Property/Event architecture of
WTK. World2World allows client VEs to connect to the World2World server, which acts as
the central distributor of data updates for the whole simulation. The server controls the system
by distributing the properties of objects that are specified as shared. Whenever a shared object
has a property changed, an event is automatically generated that distributes the new data to
the World2World server and subsequently to the other clients.
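The distribution scheme just described can be sketched as a trivial in-process model: one update from any client is pushed by the server to every client's local copy. This is only an illustration of the data flow; the real World2World system works over a network, and all names here are hypothetical.

```c
/* Hypothetical in-process sketch of World2World-style distribution:
 * when a shared property changes, the server pushes the new value to
 * every client's local copy. */

#define MAX_CLIENTS 8

typedef struct {
    double shared_value[MAX_CLIENTS]; /* each client's local copy */
    int    n_clients;
} Server;

/* A client reports a change; the server distributes it to all clients,
 * so every local copy ends up consistent. */
void server_update(Server *s, int from_client, double value) {
    (void)from_client; /* the origin does not matter for this sketch */
    for (int i = 0; i < s->n_clients; i++)
        s->shared_value[i] = value;
}
```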
1.4.5. Graphics formats supported
WTK enables the developer to construct geometries in two ways: it provides functions both
for loading geometry and for specifying geometry dynamically. The developer can use a
modelling program to design a geometry and then load it into WTK, or alternatively use the
special functions WTK provides for creating geometries.
WTK supports the following CAD and modelling programs: AutoCAD, Pro/Engineer, and
any 3D modelling tool that can produce a DXF file. WTK also supports other file formats
including 3D Studio (3DS), Wavefront (OBJ), MultiGen/ModelGen (FLT), and VideoScape
(GEO). Additionally, WTK can read and write VRML (WRL) files. WTK can also read and
write neutral file format (NFF) files - ASCII text files with an easy-to-read syntax, and binary
NFF files.
WTK also lets the developer create spheres, cones, cylinders, 3D text, as well as geometry
built from the ground up with vertex and polygon primitives. WTK geometry is based on a
scene graph hierarchy. The scene graph specifies how the application is rendered and allows
for performance optimisation. The scene graph allows for such features as object culling, level
of detail (LOD) switching, and object grouping. WTK also allows the user to manually edit
the scene graph.
When WTK loads a data file, all geometry is automatically allocated to a single node; the
data file is not converted into the internal scene graph structure. This means that WTK does
not allow the developer to manipulate the geometry within such files once they have been
loaded: the developer can only manipulate the single node that holds all the geometry.
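Of the scene-graph features mentioned above, level-of-detail (LOD) switching is easy to illustrate: a model variant is chosen by viewer distance. The structure below is a hypothetical sketch, not WTK's actual node types.

```c
/* Hypothetical sketch of level-of-detail (LOD) switching: pick a model
 * variant according to the viewer's distance from the object. */

typedef struct {
    double max_distance;   /* use this level while the viewer is closer */
    int    triangle_count; /* stands in for the actual geometry */
} LodLevel;

/* Return the index of the first level whose range covers the distance,
 * falling back to the coarsest level. Levels are ordered near-to-far. */
int lod_select(const LodLevel *levels, int n, double distance) {
    for (int i = 0; i < n; i++)
        if (distance < levels[i].max_distance)
            return i;
    return n - 1;
}
```

A scene graph makes this cheap because the switch is a node: the renderer traverses only the selected child, so distant objects cost a fraction of their full triangle count.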
1.4.6. VR hardware supported
WTK supports such viewing devices as StereoGraphics CrystalEyes and CrystalEyes VR,
Virtual i-O i-glasses with head tracker, Victormaxx CyberMaxx II with head tracker, Virtual
Research VR4 and FS5, and 3DTV stereo glasses.
WTK supports the following tracking devices: Polhemus Fastrak, Polhemus Isotrak and
Isotrak II, Polhemus Insidetrak, Ascension Bird/Flock of Birds, and Logitech (head) tracker.
WTK provides cross-platform support for 3D and stereo sound. The sound API provides the
ability for 3D spatialization, Doppler shifts, volume and roll-off, and other common audio
effects. WTK supports the following audio devices: standard MS Windows Sound Cards (for
2D sound), and Crystal River Engineering NT Sound Server/Acoustitron.
WTK supports such navigation devices as the Logitech Magellan, a standard 2/3 button
mouse, the Spacetec Spaceball Model 2003, the Logitech 3D Mouse, the Thrustmaster Serial
Joystick, and the Formula T2 steering wheel.
1.4.7. Realism of the resulting VE
WTK provides functions for collision detection, and object behaviour. In this context, WTK
supports the creation of paths - a path is a list of position and orientation values. These paths
can be used to control viewpoints or to transform objects in the scene graph. WTK provides
the ability to record, save, load, and play paths. There is also support for smoothing rough
paths using interpolation. Additionally, WTK supports motion links that connect a source of
position and orientation data with some target that is transformed based on the information
from the source. The source of a motion link can be a sensor or a path. Valid targets include
viewpoints, transform nodes, node paths, or a movable node. It is also possible to add
constraints to motion links in order to restrict the degrees of freedom.
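The path smoothing mentioned above reduces, in its simplest form, to interpolating between recorded position samples. The helper below is a hypothetical sketch (linear interpolation over positions only, ignoring orientation), not the WTK path API.

```c
/* Hypothetical sketch of path interpolation: a path is a list of recorded
 * samples, and intermediate positions are obtained by linear interpolation.
 * Orientation interpolation is omitted for brevity. */

typedef struct { double x, y, z; } PathPoint;

/* Sample the path at parameter t in [0, n-1]; values outside the range
 * clamp to the first or last recorded point. */
PathPoint path_sample(const PathPoint *pts, int n, double t) {
    if (t <= 0.0)     return pts[0];
    if (t >= n - 1.0) return pts[n - 1];
    int i = (int)t;
    double f = t - i;                 /* fraction between samples i, i+1 */
    PathPoint a = pts[i], b = pts[i + 1], r;
    r.x = a.x + f * (b.x - a.x);
    r.y = a.y + f * (b.y - a.y);
    r.z = a.z + f * (b.z - a.z);
    return r;
}
```

A motion link then amounts to evaluating such a source (a path, or a sensor) each frame and writing the result into the target's transform; constraints would simply zero out the restricted degrees of freedom before the write.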
1.4.8. Type of tool and cost
WTK is a commercial cross-platform VR development environment available from Sense8.
There are versions for Windows, Linux, SGI IRIX, and Sun Solaris; each of these is treated as
a different product, and each costs $7000 USD. Technical support for one year is $360
per copy. Because WorldToolKit is a C programming tool, it is much more low-level than the
other Sense8 tool, WorldUp. The main strengths of WTK are that it is a widely used
development environment with a large user base, it has solid cross-platform support, and it
has a vast library of device drivers: WTK supports nearly every device on the market. The
limitations of WTK include the need to change and recompile the VE code whenever the
hardware changes, and performance that does not match some other VR libraries.