

First Person Indoor/Outdoor Augmented Reality Application: ARQuake

Bruce Thomas, Ben Close, John Donoghue, John Squires, Phillip De Bondi and Wayne Piekarski

Wearable Computer Laboratory, School of Computer and Information Science, University of South Australia, Mawson Lakes, South Australia

Abstract: This paper presents a first person outdoor/indoor augmented reality application ARQuake that we have developed. ARQuake is an extension of the desktop game Quake, and as such we are investigating how to convert a desktop first person application into an outdoor/indoor mobile augmented reality application. We present an architecture for a low cost, moderately accurate six degrees of freedom tracking system based on GPS, digital compass, and fiducial vision-based tracking. Usability issues such as monster selection, colour, input devices, and multi-person collaboration are discussed.

Keywords: Augmented reality; Computer games; Wearable computers

1. Introduction

Many current applications place the user in a first-person perspective view of a virtual world [1], such as games, architectural design viewers [2], geographic information systems and medical applications [3,4]. In this paper, we describe a project to move these forms of applications outdoors, displaying their relevant information by augmenting reality. In particular we consider the game Quake (idSoftware). As with other researchers [5], we wish to place these applications in a spatial context with the physical world, which we achieve by employing our wearable computer system Tinmith [6-8]. Tinmith is a context-aware wearable computer system, allowing applications to sense the position of the user's body and the orientation of the user's head. The technique we are developing will genuinely take computers out of the laboratory and into the field, with geographically-aware applications designed to interact with users in the physical world, not just in the confines of the computer's artificial reality.

The key to this exciting practical technology is augmented reality (AR). Users view overlaid computer-generated information by means of see-through head-mounted displays. Unlike virtual reality, where the computer generates the entire user environment, augmented reality places the computer in a relatively unobtrusive, assistive role.

In the ARQuake application, a simplified representation of the physical world gaming location is modelled as a Quake 3D graphical model. The augmented reality information (monsters, weapons, objects of interest) is displayed in spatial context with the physical world. The Quake model of the physical world (walls, ceilings, floors) is not shown to the user: the see-through display allows the user to see the actual walls, ceilings and floors, which ARQuake need only model internally. Coincidence of the actual structures and virtual structures is key to the investigation; the AR application models the existing physical outdoor structures, and so omission of their rendered image from the display becomes, in effect, one of our rendering techniques.

1.1. Aims

Our aim is to construct first-person perspective applications with the following attributes:

1. The applications are situated in the physical world.

2. The point of view that the application shows to the user is completely determined by the position and orientation of the user's head.


© Springer-Verlag London Ltd. Personal and Ubiquitous Computing (2002) 6:75-86


3. Relevant information is displayed as augmented reality via a head-mounted see-through display.

4. The user is mobile and able to walk through the information spaces.

5. The applications are operational in both outdoor and indoor environments.

6. The user interface additionally requires only a simple hand-held button device.

1.2. Research issues

To achieve these aims, we investigated a number of research issues in the areas of user interfaces, tracking, and conversion of existing desktop applications to AR environments.

User interfaces for augmented reality applications that simultaneously display both the physical world and computer-generated images require special care. The choice of screen colours for the purely virtual images that the application must display requires attention to the lighting conditions and background colours of the outdoors. The keyboard and mouse interactions must be replaced with head/body movement and simple buttons. The screen layout of the user interface must accommodate the AR nature of the application.

The six degrees of freedom (6DOF) tracking requirements for these forms of applications must be addressed. We require a low cost, moderately accurate 6DOF tracking system. Tracking is required for indoor and outdoor environments over large areas; for example, our usual testing environment is our campus [9]. GPS positional error has a less noticeable effect on the registration of augmented reality information at a distance, but we need to address positional error when registering augmented information at close distances (<50 m). Such a tracking system could be used for other applications, such as tourism information, visualisation of GIS information, and architectural visualisation.

It is also necessary to modify the Quake game to accommodate the AR nature of the new application. The user's movement changes from a keystroke-based relative movement mode to a tracking-based absolute mode. The game's coordinate system must be calibrated to the physical world. Finally, the field of view of the display must be calibrated to the physical world.
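Calibrating the game's coordinate system to the physical world amounts to a translate-rotate-scale transform from tracker coordinates into game coordinates. The sketch below illustrates the idea; the origin offset, rotation angle and metres-to-Quake-units scale are invented illustrative values, not the ones used in ARQuake:

```python
import math

# Illustrative calibration constants (assumptions, not the paper's values).
QUAKE_UNITS_PER_METRE = 32.0                                # assumed scale
ORIGIN_EASTING, ORIGIN_NORTHING = 280_000.0, 6_132_000.0    # assumed map origin (UTM metres)
GRID_ROTATION_DEG = 0.0                                     # assumed grid-north alignment

def utm_to_quake(easting, northing):
    """Translate a UTM fix to the map origin, rotate into the game's
    axes, and scale metres into Quake units."""
    dx = easting - ORIGIN_EASTING
    dy = northing - ORIGIN_NORTHING
    theta = math.radians(GRID_ROTATION_DEG)
    x = dx * math.cos(theta) - dy * math.sin(theta)
    y = dx * math.sin(theta) + dy * math.cos(theta)
    return x * QUAKE_UNITS_PER_METRE, y * QUAKE_UNITS_PER_METRE
```

With this in place, every GPS fix can be fed to the game engine as an absolute player position rather than a relative keystroke movement.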

2. Background

There are two basic styles of tracking: absolute and relative. Furthermore, machine learning can train a system to recognise locations in a building or outdoors. Golding and Lesh use an array of sensors (accelerometers, magnetometers, temperature and light) to track a user in a set of known locations [10]. Aoki, Schiele and Pentland [11] use a camera to train the system to recognise the user's location and approaching trajectory. These systems can determine the user's present room, and whether the user is entering or leaving that room.

Previous research has established that outdoor tracking with an inexpensive differential GPS and a commercial grade magnetic compass is inaccurate for augmented reality applications [12]. Traditional hybrid approaches combine a number of different systems such as inertial, optical, electro-magnetic and GPS. In this paper, we present our hybrid approach of combining vision-based optical tracking with GPS and a magnetic compass.

A number of researchers are investigating fiducial vision-based tracking [3,13]. We based our optical tracking system on the fiducial marker tracking system ARToolKit developed by Kato and Billinghurst [14]. The ARToolKit is a set of computer vision tracking libraries that can be used to calculate camera position and orientation relative to physical markers in real time. ARToolKit features include the use of a single camera for position/orientation tracking, fiducial tracking from simple black squares, pattern matching software that allows any marker patterns to be used, calibration code for video and optical see-through applications, and sufficiently fast performance for real-time augmented reality applications.

Fig. 1. Example of a fiducial marker.

The fiducial markers are known-sized squares with high contrast patterns in their centres. Figure 1 shows an example marker. The ARToolKit determines the relative distance and orientation of the marker from the camera. In addition, the ARToolKit incorporates a calibration application to determine the placement of the camera relative to the user's line of sight; thus the ARToolKit can determine proper placement of graphical objects for AR applications.

2.1. The original Quake game

We chose Quake as the primary application for a number of reasons. Quake fits the general model of AR that we are studying, as it is a first-person 3D application with autonomous agents to interact with the user. We were able to obtain the application source code. Finally, the Quake graphics engine is very quick and runs on a wide range of computing platforms and operating systems.

Quake is a first-person shoot 'em up game. Quake has two stated goals: "First, stay alive. Second, get out of the place you're in" (idSoftware). The user interface is based around a single, first-person perspective screen. The large top part of the screen is the view area, showing monsters. Status information is immediately below, at the bottom of the screen.

One moves around Quake in one of four modes: walking, running, jumping or swimming, and performs one of three actions: shooting a weapon, using an object, or picking up an object. Weapons are aimed by changing the view direction of the user, and fired by pressing a key. To push a button or open a door, the user walks up to the door or button. A user picks up items by walking over them. Part of the challenge of the game is finding special objects like buttons, floor-plate doors, secret doors, platforms, pressure plates and motion detectors. Quake incorporates platforms that move up and down, or follow tracks around rooms or levels. Pressure plates and motion detectors may be invisible or visible, and there are sensors which open doors, unleash traps, or warn monsters.

2.2. Wearable computer platform

The Tinmith wearable computer system hardware is all mounted on a rigid backpack so that the items can be attached firmly (see Fig. 2). Processing is performed by a Toshiba 320CDS notebook (Pentium-233, 64 Mb RAM) running the freely available Linux OS and associated programs and development tools. The laptop is very generic, and not even the latest in available CPUs, so another computing unit could be substituted. The limited I/O capabilities of the single serial port are augmented with the use of a four serial port Quatech QSP-100 communications card. Connected to the laptop are a Precision Navigation TCM2-80 digital compass for orientation information (we now use an Intersense 300 tracker for head orientation), a Garmin 12XL GPS receiver for positioning, and a DGPS receiver for improved accuracy. For the Head Mounted Display (HMD), we use alternately the i-Glasses unit from I-O Display Systems, and the Sony Glasstron PLM-S700E. Various other devices are present as well, such as power converters for the different components, necessary connection cabling, and adaptors. The construction of the backpack was directed with ease of modification in mind, at the sacrifice of wearability and miniaturisation.

Fig. 2. Wearable computer platform.

The Tinmith system [8] supports outdoor augmented reality research. The system is comprised of a number of interacting libraries and modules. A number of software libraries form a support base for writing code in the system: a graphics interface on top of X windows; an interface to coordinate/datum transformations and numeric conversions; encode/decode libraries for transmitting structures over a network; tools for network communications and high level I/O; and low level interfaces to Unix system calls, asynchronous I/O code, string handling, event generation, and error checking.

3. Using ARQuake

The goal of ARQuake was to bring the intuitive nature of VR/AR interfaces into an indoor/outdoor game. A user first dons the wearable computer on their back, places the HMD on their head, and holds a simple two-button input device. The user then performs a simple calibration exercise to align the HMD with their eyes, and then they start playing the game. All of the keyboard and mouse controls have been replaced with position/orientation information and a two-button haptic gun input device. As the movement aspects of the game have been engineered to fit the physical world, there is no concept of commands to walk, run, jump or swim, or of moving platforms. The user's own movement determines the rate and direction of movement. The remainder of this section describes the Quake level we developed and its user interaction.

3.1. Haptic gun

To improve the playability of ARQuake, we replaced mouse and keyboard button presses with a haptic gun device. The aiming of the weapons is still the direction of the user's head, but the firing of the weapon and changing of weapons is performed with button presses on the new gun input device. To give the gun a "recoil" feel, we installed simple haptic feedback into the gun.

A haptic gun was developed from an appropriate toy plastic gun (see Fig. 3). Two standard commercial push buttons were installed, along with a micro-switch to replace the primitive trigger switch. A solenoid was placed towards the rear of the gun, behind the centre of gravity, to enhance the "pitch-up" sensation of the recoil. A vibrating motor was placed as far forward in the gun as possible to increase the moment arm from the centre of gravity, thereby enhancing the effect. The sound effects are generated by the solenoid and vibrating motor.

The gun provides a number of haptic sensations and sounds. There is the single shot, which provides a strong recoil with a loud bang. The single shot weapons allow additional shots to be fired after a suitable reload time; the amount of time for reloading varies between single shot weapons. The shotgun provides a double shot haptic and sound effect; this effect simulates the rapid firing of both barrels serially. The multiple firing weapons, such as the machinegun, provide a less strong recoil with a short time interval between firings. The sound effect is a higher pitched bang with a lower volume. Finally, there is an energy weapon that fires a continuous stream of energy; the haptic effect is a continuous vibration of the gun with a high pitched whining sound effect.
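The weapon-specific effects described above amount to a small parameter table driving the solenoid and the vibration motor. The field names and timings below are invented for illustration; they are not the authors' firmware:

```python
# Hypothetical haptic-effect table for the gun (all values are assumptions).
# solenoid_pulses: recoil thumps per trigger pull; pulse_ms: solenoid on-time;
# vibrate: drive the vibration motor continuously (energy weapon).
HAPTIC_EFFECTS = {
    "single_shot": {"solenoid_pulses": 1, "pulse_ms": 60, "vibrate": False},
    "shotgun":     {"solenoid_pulses": 2, "pulse_ms": 60, "vibrate": False},
    "machinegun":  {"solenoid_pulses": 1, "pulse_ms": 25, "vibrate": False},
    "energy":      {"solenoid_pulses": 0, "pulse_ms": 0,  "vibrate": True},
}

def fire_effect(weapon):
    """Return the haptic pattern for a weapon, defaulting to single shot."""
    return HAPTIC_EFFECTS.get(weapon, HAPTIC_EFFECTS["single_shot"])
```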


Fig. 3. Haptic gun.



3.2. Monsters

There are 16 different types of monster in the Quake world. Some have attributes that make them unsuitable for inclusion in our AR version of the game. Because of the limitations on movement imposed by the tracking hardware, the best monsters were those that walked or leaped, and those that were relatively easy to destroy and did not inflict extreme damage on the user with their first attack.

We chose seven types of monsters to be included in our game world. These monster types are all land-based creatures that use weapons from a distance. The monsters' skin colour and texture were changed to make them easier to see and distinguish from the physical world. The choice of colours used in the texture maps or skins of the monsters is based on the user testing described later.

We excluded monsters which were too large for the environment or which had unexpected effects; those which swam, as our campus does not include water features; those which flew, as they move too quickly; those which surround the user or have some other interaction which would require haptic feedback; and those whose attacks tended to be immediately fatal, as they are not enjoyable.

3.3. Campus level

We created a Quake level (game world) representing the Mawson Lakes campus of the University of South Australia. The walls in Quake are the external walls of the campus buildings and the interior walls of the Wearable Computer Laboratory (WCL). The walls are rendered in two fashions: black for game mode and grid-patterned for testing mode. In both modes, the walls occlude the graphic objects in Quake that may be located behind them. As described earlier, the black walls of game mode appear transparent to the user during the game. The Quake graphics engine renders only monsters, items on the ground, and regions of interest. This Quake level was derived from architectural drawings of the campus provided by the university; where the architect's drawings had become incorrect, we surveyed those portions ourselves.

Fig. 4. Quake campus level.

The size of the outside modelled area is 500 metres (East/West) by 500 metres (North/South). Figure 4 depicts a top-down view of the level we created, and Fig. 5 is a detailed view of the most interesting quarter of the map. We have placed over 200 monsters in the level: on the ground, on top of buildings, and in second floor windows. There are hundreds of items placed on the ground for the user to pick up: pieces of armour, rockets, rocket launchers, shotgun shells and health boxes.

The tracking system used tends to make the user less agile than the "super-human" agility found in the normal game. Therefore we have included more support equipment than would be found in the normal game: armour, weapons and ammunition.

3.4. Walking around

Once the system is up and running, the user moves through the level by walking, and changes view by looking around. The user views the game and the physical world through the HMD; examples are shown in Figs 6 and 7. The bottom portion of the screen is a status bar containing information about armour, health, ammo and weapon type. The majority of the screen is reserved for the AR images of monsters and game objects (see Fig. 8). In the original Quake, certain actions are performed by the user being in a certain proximity to a location in a Quake level. We have retained most of those actions. Users pick up objects as in the original Quake by walking over them. Traps are triggered by standing in or moving through predetermined locations. Actions that are not easily reflected in the physical world, such as secret and locked doors, are removed from the game.
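Proximity-triggered actions such as item pick-up reduce to a distance test against the tracked player position. A minimal sketch, with an assumed trigger radius (Quake uses its own units and trigger volumes; this is only illustrative):

```python
import math

PICKUP_RADIUS_M = 1.0  # assumed trigger radius in metres

def items_picked_up(player_xy, items):
    """Return the names of items whose trigger region the player has
    walked into, given item positions as {name: (x, y)} in metres."""
    px, py = player_xy
    taken = []
    for name, (ix, iy) in items.items():
        if math.hypot(px - ix, py - iy) <= PICKUP_RADIUS_M:
            taken.append(name)
    return taken
```

Run once per tracker update, this reproduces the "walk over it to pick it up" behaviour without any explicit button press.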

The tracking of the user's position and the orientation of the user's head handles the majority of the interaction for the user. The only other interactions for the user to perform are to shoot or change the current weapon. We employ a two-button (thumb button and index finger button) hand-held device as a physical input device for these actions. The thumb button is used to change weapons, and the index finger button fires the current weapon. The direction the weapon fires is the centre of the current view of the HMD.

Fig. 5. One quarter of the Quake campus level.

Through informal user studies, users thought that the visibility of the ARQuake system was good; however, many of the users found that bright sunlight made seeing through the display difficult. Despite only using the system once, the users found the hand-held input device intuitive, easy to use, and very quick to learn. A few of the users found themselves pointing the device in a gun-like fashion when firing at the targets. No one reported feeling nauseated while using the system. Users believed that it was easy to pick up items, although it was difficult to tell when an item had been picked up without some form of confirmation.

People disliked the colours on the status bar and thought the range of colours was limited. The monster colours were good and easy to see, and the users were able to easily identify monsters. When asked "Is the movement in the augmented reality relative to the real world?", most people thought that the movement relative to the real world was acceptable, but commented on the lag and jitter when rotating their heads. When asked "Is it easy to shoot at the monsters?", most subjects found that the lag made it difficult to align the cross hairs with the targets. The actual process of firing the weapon was easy.

3.5. Field of view

Even if the alignment of the Quake world with the physical world is exact, an incorrect perspective or field of view will show up as inconsistencies in the virtual world. The default field of view for the game is 90 degrees (45 degrees each side), allowing a reasonable coverage of the world to fit onto a computer screen. This field of view unfortunately suffers from a fish-eye distortion effect when comparing the objects in the Quake world with real objects. The HMD we are using, the i-Glasses, has approximately a 25-degree horizontal field of view. The only calibration adjustments for the HMD with Quake are changing the game's field of view setting and scaling the graphical objects. We are currently using a field of view value of 25 degrees, but artifacts are introduced, as if the user were positioned farther forward. We are investigating the graphics model of Quake to determine how it differs from traditional graphics models.
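The effect of narrowing the field of view can be made concrete with the standard perspective projection relation: on-screen scale is proportional to half the screen width divided by tan(fov/2). The 640-pixel screen width below is an assumed value, not taken from the paper:

```python
import math

def projection_scale(fov_deg, screen_width_px=640):
    """Perspective projection scale: pixels per unit of tan(view angle),
    i.e. half the screen width divided by tan(fov/2)."""
    return (screen_width_px / 2) / math.tan(math.radians(fov_deg) / 2)

# Narrowing the game's field of view from 90 to 25 degrees magnifies
# on-screen geometry by the ratio of the two scales (about 4.5x), which
# is why object scaling must accompany the fov change:
magnification = projection_scale(25) / projection_scale(90)
```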


Fig. 6. User’s heads up display.

Fig. 7. Second user’s heads up display.

Fig. 8. Status bar.



4. Tracking

As previously stated, one of the goals of the system is to provide continuous indoor and outdoor tracking. The system tracks through the combination of a GPS/compass system with a vision-based system. Our tracking needs are categorised into three areas: outdoors far from buildings, outdoors near buildings, and indoors. Each of these requires a different approach, while maintaining position and orientation information in a common format of WGS 84/UTM positioning information and heading/pitch/roll angles for orientation information. The use of visual landmarks can improve registration in one of two ways: first, to allow the system to correct the final image by aligning the landmark with a known position in the graphical image; and secondly, to use the landmarks to extract a relative position and orientation of the camera from the landmarks. We have chosen to investigate the second option, as it provides the most general tracking solution.

4.1. Outdoors away from buildings

GPS positional inaccuracies are less of a problem for our Quake application when a user is at a large distance (>50 m) from an object which requires registration, while orientation errors cause the same angular deviation in the user's field of view at any distance. An extreme example of how positional errors have a reduced registration error effect at distance is the use of the ARQuake game on a flat open field, where the system does not require graphics to be registered to any physical object except the ground. In this scenario there are no walls to occlude the monsters and items of interest. Since the game is slaved to the screen, what the user sees on the display is what the game believes is the user's current view. Therefore, the user's actions will perform correctly in the context of the game.

In the case where a building is visible but the user is a large distance from the building, the inaccuracies are low and therefore not distracting. The problems occur when the physical buildings do not occlude the game graphics properly. The visual effect of poor occlusion is that monsters appear to walk through walls or pop out of thin air, but at distance these errors do not detract from the game. Such occlusion problems exist, but they are visually very minor, because the user is generally moving their head during the operation of the game. At 50 m, a difference of 2-5 m (GPS tracking error) in the user's position is approximately a 2-5 degree error in the user's horizontal field of view, and the compass itself has an error of ±1 degree.

4.2. Outdoors near buildings

When using ARQuake with the GPS/compass tracking less than 50 m from a building, the poor occlusion of monsters and objects near the physical buildings, due to GPS error, becomes more apparent. As the user moves closer to buildings, inaccuracies in GPS positional information become prevalent. The system is now required to slave the Quake world to the real world, and furthermore, in real time. As an example, when a user is ten metres from a building and their position is out by 2-5 m, this equates to an error of 11-27 degrees; this is approximately half to the full size of the horizontal field of view of the HMD. When the error is greater than the horizontal field of view, the virtual object is not visible on the HMD.
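The angular figures quoted in this and the previous subsection follow directly from trigonometry: a position error perpendicular to the line of sight subtends an angle of atan(error/distance). A quick check:

```python
import math

def angular_error_deg(position_error_m, distance_m):
    """Apparent angular registration error caused by a GPS position error
    perpendicular to the line of sight at a given distance."""
    return math.degrees(math.atan2(position_error_m, distance_m))

# At 50 m, a 2-5 m GPS error subtends roughly 2.3-5.7 degrees:
far = [angular_error_deg(e, 50) for e in (2, 5)]
# At 10 m the same error subtends roughly 11.3-26.6 degrees,
# matching the 11-27 degree range quoted above:
near = [angular_error_deg(e, 10) for e in (2, 5)]
```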

Our proposed design is to enhance the accuracy when the user is near buildings using an extended version of ARToolKit. By using fiducial markers specifically engineered for outdoor clarity (approximately 1 m in size), with each marker set up on a real-world object with known coordinates, accurate location information can be obtained. Figure 9 shows what a fiducial marker on the corner of a building would look like in our Quake world. These markers provide a correction in the alignment of the two worlds. We are investigating the use of multiple fiducial markers to reduce uncertainty due to marker mis-detection caused by lighting issues. Since the extended ARToolKit we are developing supplies position and orientation information in the same format as the GPS/compass system, ARQuake can transparently use either the GPS/compass or vision-based tracking system. Our initial approach for determining when to use the information from the GPS/compass or the ARToolKit is to use the ARToolKit's information first, when it is confident of registering a fiducial marker. As ARToolKit recognises a fiducial marker, the toolkit returns a confidence value, and the system applies a threshold to decide when to switch over to the toolkit. When the confidence value falls below the threshold, the GPS/compass information is used.

Fig. 9. Fiducial marker on a building.
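The hand-over logic described above can be sketched as a simple threshold test: prefer the vision-based fix while the marker confidence stays high, otherwise fall back to GPS/compass. The threshold value below is illustrative, not the one used in ARQuake:

```python
CONFIDENCE_THRESHOLD = 0.7  # assumed cut-off for trusting the vision fix

def select_pose(artoolkit_fix, gps_fix):
    """Pick a pose from the two trackers. Each fix is (pose, confidence);
    artoolkit_fix is None when no marker is in view. Both trackers report
    in the same WGS 84/UTM + heading/pitch/roll format, so the caller
    cannot tell which source produced the pose."""
    if artoolkit_fix is not None:
        pose, confidence = artoolkit_fix
        if confidence >= CONFIDENCE_THRESHOLD:
            return pose
    return gps_fix[0]
```

Because both sources share one output format, the game engine above this function never needs to know which tracker is active.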

4.3. Indoors

Our next proposed design caters for a user walking into a building with fiducial markers on the inside walls and/or ceilings; the tracking system then starts using the vision-based component of the tracking system. This form of tracking is similar to the work of Ward et al. [15]. Our system is lower-cost and is not as accurate, but does keep tracking errors within the accuracy which our application needs: 2-5 degrees of error in the user's horizontal field of view. We are experimenting with placing markers on the walls and/or the ceilings.

When the markers are placed on the wall, we point the vision-based tracking camera forwards. It was necessary to size and position the patterns on the walls so that they would be usable by the system regardless of whether the user was very close to or very far from the wall. In this case, we chose to use patterns 19 cm square. From testing, we found that the system could register a pattern at a range of 22.5 cm to 385 cm from the wall. In an 8 by 7 m room this range is sufficient (for the initial stages of the project), with an accuracy of within 10 cm at the longer distances. It is important that, no matter where the user looks in the room, at least one pattern is visible to provide tracking. For this reason, we realised that implementing the patterns on the walls as the sole means of tracking would require different sized targets. We are investigating the use of targets themselves as patterns inside larger targets; one large target may then contain four smaller targets for when a user is close to a target.

Our second approach has been to place the markers on the ceiling, with the vision-based tracking camera pointed upwards. The camera does not have the problem of a variable area of visible wall space, as the distance to the ceiling is relatively constant; the main difference is the varying heights of users. In the first instance we are implementing a small range of head tilt and head roll (±45 degrees). Perspectives such as those from lying down or crawling will be investigated in the future.

The patterns on the ceiling were placed so that at any one time the tracking software could reliably identify at least one pattern. With the camera mounted on the backpack at a height of 170 cm and with a room height of 270 cm, our current lens for the camera views a region at least 130 cm across; we chose a pattern 10 cm square.
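The ceiling coverage can be checked with simple geometry: the visible width is 2 × (ceiling height − camera height) × tan(fov/2). The lens angle below is an assumption chosen to reproduce the reported 130 cm boundary; the paper reports the coverage, not the lens angle:

```python
import math

def visible_ceiling_width_cm(camera_fov_deg, camera_height_cm=170, ceiling_height_cm=270):
    """Width of ceiling visible to an upward-pointing camera, from the
    camera-to-ceiling distance and the lens field of view."""
    distance = ceiling_height_cm - camera_height_cm   # 100 cm in the setup above
    return 2 * distance * math.tan(math.radians(camera_fov_deg) / 2)

# A lens of roughly 66 degrees (assumed) at 100 cm sees about 130 cm,
# comfortably covering several 10 cm patterns at once:
width = visible_ceiling_width_cm(66)
```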

4.4. Choosing colours

The choice of colours is important for outdooraugmented reality applications, as some coloursare difficult to distinguish from natural surround-ings or in bright sunlight. The original Quakegame incorporates a ‘‘dark and gloomy’’ colourscheme to give the game a foreboding feelings.Dark colours appear translucent with the see-through HMDs. Monsters and items needdifferent colours to be more visible in an outdoorenvironment. We ran a small informal experi-ment to determine a starting point in pickingcolours for use in an outdoor setting. Thisinformal study was to gauge the visibility andopaqueness of solid filled polygons display of the


Fig. 10. Fiducial marker on the wall.


see-through HMD. We were interested in which colours to use to texture large areas of the monsters and items in the game. These colours are not necessarily appropriate for textual or wire-frame information; further studies are required for these and other forms of AR information.

The testing method was to view different colours in one of four conditions: (1) standing in shade and looking into a shady area; (2) standing in shade and looking into a sunny area; (3) standing in a sunny area and looking at a shady area; and (4) standing in a sunny area and looking at a sunny area. We tested 36 different colour and intensity combinations: nine different colours (green, yellow, red, blue, purple, pink, magenta, orange and cyan) at four different intensities. The testing was performed outside with the Tinmith wearable computer using the I-Glasses see-through HMD. The colour/intensity combinations were scored for visibility and opaqueness in each of the four viewing conditions on a scale of one (very poor) to ten (very good).

Our strongest criterion for a colour/intensity combination was a mean score of at least seven over the four viewing conditions, together with a minimum score of six in each individual condition. Nine colours satisfied this quality level: three shades of purple, two shades of blue, two shades of yellow, and two shades of green.
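This acceptance criterion can be stated compactly. The scores below are invented placeholders for illustration; only the mean-of-seven and minimum-of-six thresholds come from the text.

```python
# Hypothetical scores per colour across the four viewing conditions
# (shade->shade, shade->sun, sun->shade, sun->sun), each 1-10.
scores = {
    "bright purple": [9, 7, 8, 7],
    "dark orange":   [5, 4, 6, 5],
    "bright green":  [7, 7, 6, 9],
}

def acceptable(cond_scores, mean_min=7.0, each_min=6):
    """Mean of at least `mean_min` over the four conditions AND at
    least `each_min` in every individual condition."""
    return (sum(cond_scores) / len(cond_scores) >= mean_min
            and min(cond_scores) >= each_min)

passed = [colour for colour, s in scores.items() if acceptable(s)]
```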

A more in-depth look at standing in shade and looking into the shade shows the best colours were bright purple and bright magenta. When standing in shade but looking into a sunny area, the best colours were a dark green and pink. The colour to avoid is dark orange, which was given a score of 4; all other colours scored okay (5) or above. The results changed when the subject moved out into a sunny area. When standing in a sunny area and looking at a shady area, bright yellow and bright purple scored the best. There were eight colours that scored poorly: bright yellow, bright red, bright pink, bright magenta, bright orange, bright cyan, and dark cyan. Finally, when the subject was standing in the sun and looking into a sunny area, the best colours were bright green, three shades of blue and dark purple. Seventeen colours scored poorly (3 or 4); the colours to avoid are all intensities of cyan, orange, magenta, pink, and red.

5. Collaboration

The original Quake engine allows multiple people to play the game simultaneously, and allows communication between players via a text-based mechanism. To avoid users having to divert attention from what is occurring in the world around them, we provide a facility for two-way voice communication. We have added a number of collaboration features to the ARQuake game.

5.1. Multi-player

To allow users on desktop computers and users on wearable computers to use the system simultaneously, two different representations of the maps are required. Users with the wearable do not require the buildings to be rendered, as they see the physical buildings through the augmented reality display. Users who are indoors, by contrast, require the buildings to be textured, as they cannot see the physical buildings from their stationary position. Both representations of the maps require the buildings' placement, size and orientation to be identical on both computer platforms.

As previously mentioned, desktop-to-desktop multi-player is already implemented in the original Quake software. We extended this to a desktop-to-wearable multi-player configuration, using a WaveLAN wireless network to connect the two platforms. At present we have run two players over the WaveLAN, one on the wearable and one on a desktop, and the game ran smoothly over the network.

5.2. Using a pointing device to aid communication

In ARQuake there is no method for a person to point at a desired object: the avatar that represents a player cannot display a pointing action to other users in the system. This is not an issue if all players are using wearable computers and the AR system, since players can see each other physically pointing. However, when some users are on desktop platforms and others on wearable platforms, an artificial method of pointing is required. The


system should allow a user to point out objects, and combined with speech we believe a user will be able to accurately indicate an object. Speech and social protocols allow confirmation of the correct direction or object being referred to. The pointer guides other users rather than specifying an exact object.

To make this pointing device usable, it needs to be easily seen from all angles. We designed the pointer as a 3D pyramid with indented walls (see Fig. 11). The pointer allows other users looking at the shape to gauge where the pointing user is referring. Initially the shape was one colour, but we found that in some cases it was difficult to tell which direction the arrow was pointing. We then textured one end of the pointer a different colour, so that users can establish which direction the pointer is facing from any angle.

A future development is to allow the user to control the visibility and independent movement of the pointer. The pointer currently mimics the behaviour of the player's virtual avatar, and allows a full range of pointing in both the x and y planes.

6. Implementing ARQuake

The original Tinmith software did not require any modifications to support the extensions described in this paper; instead, two additional modules, modpatt and modquake, have been added to provide the new features.

The modpatt module performs the pattern recognition using the ARToolKit, and also reads in position and orientation values. (More details concerning the architecture and implementation of the existing Tinmith modules can be found in Piekarski et al. [6,8].) The modpatt module uses the pattern recognition extended from ARToolKit to refine the position and orientation information, which is then passed back to the Tinmith system for the other modules to process. When the system is indoors, modpatt is responsible for generating position information from camera recognition alone, as GPS does not function indoors.
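The selection policy described above can be sketched as a simple fallback rule. The Pose type and function names here are illustrative, not the actual Tinmith interfaces.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    x: float          # position, metres
    y: float
    z: float
    heading: float    # orientation, degrees
    pitch: float
    roll: float

def select_pose(fiducial: Optional[Pose],
                gps_compass: Optional[Pose],
                indoors: bool) -> Optional[Pose]:
    """Prefer the more accurate vision-based pose whenever a marker is
    in view; otherwise fall back to GPS/compass outdoors. Indoors the
    camera is the only source, so no visible marker means no fix."""
    if fiducial is not None:
        return fiducial
    return None if indoors else gps_compass
```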

The modquake module extracts information from the Tinmith system, such as position and orientation, and converts it into UDP packets, which are then sent onto the local network. The modified Quake program receives these UDP packets and converts the data into Quake's local coordinate system by scaling and translating. Quake then uses this information to render the display at the appropriate location with the correct head position, and the player can control the game using physical movement.
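The conversion step can be sketched as a translate-and-scale followed by a UDP send. The packet layout, scale factor, map origin and port below are all invented for illustration; the paper does not publish the actual modquake wire format.

```python
import socket
import struct

# All constants below are hypothetical -- the real modquake wire
# format and map origin are not published in the paper.
QUAKE_UNITS_PER_METRE = 32.0
ORIGIN_E, ORIGIN_N = 280000.0, 6130000.0   # assumed map origin (m)

def to_quake(easting_m: float, northing_m: float, heading_deg: float):
    """Translate world coordinates to the map origin, then scale
    metres into Quake units; heading passes through unchanged."""
    qx = (easting_m - ORIGIN_E) * QUAKE_UNITS_PER_METRE
    qy = (northing_m - ORIGIN_N) * QUAKE_UNITS_PER_METRE
    return qx, qy, heading_deg

def send_pose(sock, addr, easting_m, northing_m, heading_deg):
    # Pack the converted pose as three network-order floats.
    payload = struct.pack("!fff", *to_quake(easting_m, northing_m,
                                            heading_deg))
    sock.sendto(payload, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pose(sock, ("127.0.0.1", 27500), 280010.0, 6130005.0, 90.0)
```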

7. Conclusion

This paper reports on an ongoing research project, ARQuake, an outdoor/indoor augmented reality first-person application. Although the implementation has not been completed, many interesting results have been found: in particular, user interface issues for outdoor/indoor augmented reality games; an architecture for low cost, moderately accurate indoor/outdoor 6DOF tracking; and implementation issues in converting a desktop application into an AR application.

The current ARQuake game runs on our wearable computer platform with the 6DOF GPS/compass tracking system, and game interaction operates within the accuracy of this tracking system. We have modelled an outdoor section of our campus and the interior of the WCL. The graphics run at approximately 30 frames per second, with GPS updates once per second and compass updates 15 times per second.

The results of our research to this point include the design and implementation of a user interface for an augmented reality application, along with some proposed guidelines; specifically, a set of colours, identified by informal experimental evidence, that take into account the different lighting and background colours encountered outdoors. We successfully mapped the keyboard and mouse interactions of the ARQuake application to head/body movements


Fig. 11. 3D pointing device.


and a simple two-button input device. A simple user interface layout was incorporated into the ARQuake application. The game is quite playable in this configuration, and we are continuing to investigate how to overcome difficulties with registration errors when indoors and close to buildings.

We proposed and described a low cost, moderately accurate indoor/outdoor 6DOF tracking system, formed by a novel combination of optical tracking and GPS/compass tracking to provide an absolute tracking system.

The solution is not restricted to overcoming registration problems; it provides a general tracking solution. Fiducial markers are placed at the corners of buildings to provide more accurate positioning information for ARQuake. When a fiducial marker comes into view of the head-mounted camera, this more accurate tracking information is used instead of the GPS/compass data. The vision-based tracking is not yet fully functional, and we are continuing to investigate and develop this solution.

Aside from the specific technical achievements of our work to date, it is perhaps most important to point out that our work serves as proof that augmented reality is readily achievable with inexpensive, off-the-shelf software. Translating the desktop application set to incorporate the physical world brings closer the possibility of truly ubiquitous computing.

Acknowledgments

For their support of this project, we would like to thank the Advanced Computing Research Centre of the University of South Australia; John Maraist; Arron Piekarski; and the Defence Science and Technology Organisation.

References

1. Klinker GJ, Ahlers KH et al. Confluence of computer vision and interactive graphics for augmented reality. Presence: Teleoperators and Virtual Environments 1997; 6(4): 433–451

2. Brooks FP. Walkthrough – a dynamic graphics system for simulating virtual buildings. Workshop on Interactive 3D Graphics, 1986

3. State A, Hirota G et al. Superior augmented reality registration by integrating landmark tracking and magnetic tracking. Proceedings of SIGGRAPH 1996, New Orleans, LA. ACM, 1996; 439–446

4. State A, Livingston MA et al. Technologies for augmented-reality systems: realizing ultrasound-guided needle biopsies. Proceedings of SIGGRAPH 1996, New Orleans, LA. ACM, 1996

5. Hollerer T, Feiner S et al. Situated documentaries: embedding multimedia presentations in the real world. 3rd International Symposium on Wearable Computers, San Francisco, CA, 1999; 79–86

6. Piekarski W, Gunther B et al. Integrating virtual and augmented realities in an outdoor application. Proceedings of the Second IEEE and ACM International Workshop on Augmented Reality (IWAR) '99, San Francisco, CA, 1999

7. Piekarski W, Thomas BH. Tinmith-Metro: new outdoor techniques for creating city models with an augmented reality wearable computer. Fifth International Symposium on Wearable Computers, Zurich. IEEE, 2001

8. Piekarski W, Thomas BH et al. An architecture for outdoor wearable computers to support augmented reality and multimedia applications. Proceedings of the Third International Conference on Knowledge-Based Intelligent Information Engineering Systems, Adelaide. IEEE, 1999

9. Neumann U, You S et al. Augmented reality tracking in natural environments. International Symposium on Mixed Realities, Tokyo, Japan, 1999

10. Golding AR, Lesh N. Indoor navigation using a diverse set of cheap, wearable sensors. 3rd International Symposium on Wearable Computers, San Francisco, CA, 1999; 29–36

11. Aoki H, Schiele B et al. Realtime positioning system for a wearable computer. 3rd International Symposium on Wearable Computers, San Francisco, CA, 1999

12. Azuma RT. The challenge of making augmented reality work outdoors. First International Symposium on Mixed Reality (ISMR '99), Yokohama, Japan. Springer-Verlag, 1999

13. Park J, Jiang B et al. Vision-based pose computation: robust and accurate augmented reality tracking. Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality '99, San Francisco, CA, 1999; 3–12

14. Kato H, Billinghurst M. Marker tracking and HMD calibration for a video-based augmented reality conferencing system. Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality '99, San Francisco, CA, 1999; 85–94

15. Ward M, Azuma R et al. A demonstrated optical tracker with scalable work area for head-mounted display systems. Proceedings of the 1992 Symposium on Interactive 3D Graphics, Cambridge, 1992

Correspondence to: Dr B. Thomas, Wearable Computer Laboratory, School of Computer and Information Science, University of South Australia, Mawson Lakes, SA 5095, Australia. Email: [email protected]
