

Occlusion handling in outdoors augmented reality games

Vlasios Kasapakis1 & Damianos Gavalas1

Received: 27 July 2015 / Revised: 28 February 2016 / Accepted: 3 May 2016 / Published online: 12 May 2016
© Springer Science+Business Media New York 2016

Abstract The use of augmented reality (AR) becomes increasingly common in mobile game development as a means of enhancing the players’ view of the physical world through computer-generated graphical information. A situation often encountered in AR applications is the (partial or full) occlusion of virtual objects by physical artifacts; if not appropriately handled, the visualization of occluded objects often misleads users’ perception. This paper introduces three alternative Geolocative Raycasting techniques aiming at assisting developers of outdoors AR games in generating a realistic field of view (FoV) for the players by integrating real-time building recognition, so as to address the occlusion problem. Our geolocative raycasting methods have been applied in the location-based AR game Order Elimination, which utilizes publicly and freely available building information to calculate the players’ FoV in real time. The proposed algorithms are applicable to a variety of sensor-based AR applications and portable to any real setting, provided that sufficient topographical data exist. The three FoV determination methods have been tested with respect to several performance parameters, demonstrating that real-time FoV rendering is feasible on modest mobile devices, even under stress conditions. A user evaluation study revealed that the consideration of buildings for determining the FoV in AR pervasive games can increase the quality of experience of players when compared with standard FoV generation methods.

Keywords Pervasive games . Augmented reality . Occlusion . Field of view . Line of sight . Raycasting . OSM . Performance tests

1 Introduction

The physical world is an unmatched source of information, providing humans a continual stream of images, sounds and feelings that cannot be fully simulated by computers. Pervasive games aim to fully exploit the richness of the physical world as a resource for play by interweaving digital media

Multimed Tools Appl (2017) 76:9829–9854
DOI 10.1007/s11042-016-3581-1

* Vlasios Kasapakis
[email protected]

1 Department of Cultural Technology and Communication, University of the Aegean, Mytilene, Greece


with our everyday experience [3, 6, 18]. While this already poses new challenges to game developers, requirements are even stricter in pervasive augmented reality (AR) games, wherein the real world perceived by users is enhanced through superimposed virtual objects [5].

AR requires only a limited amount of the user’s field of view (FoV) to be rendered with computer-generated graphics, while the major part of the user’s view perceives the physical world [4, 27]. Allowing users to view the physical world gives them a better sense of where they are and what is around them. Nevertheless, cases often occur where a physical obstacle occludes a virtual object; such cases are even more common in outdoors pervasive games, wherein surrounding buildings are highly likely to occlude a virtual character. Then, the overlaying of visual augmentation (representing a virtual object hidden from the current user’s viewpoint) may confuse users’ perception. This incorrect display impairs depth judgment and contributes to misconceptions and incorrect pursuance of tasks amongst users [26, 29].

The occlusion problem has received considerable attention in the field of AR research. The approaches proposed for handling the occlusion problem in AR are mainly classified into two types [28]: model-based and depth-based approaches. Model-based methods require the accurate (offline) 3D modeling of the real environment’s objects [14], hence limiting their portability. Depth-based methods rely on some sort of image processing (e.g. stereo matching) to acquire depth information of real objects, which is computationally expensive, hence not applicable to real-time applications [21, 24]. Recently, several studies have investigated a number of visual metaphors to convey the depth (i.e. the positioning) of the graphical entities relative to the real objects, considering handheld AR in outdoors environments [10, 23].

The occlusion problem has only been addressed in indoors AR games, with most prototypes relying on the Microsoft Kinect sensor to receive depth cues [9]. To the best of our knowledge, none of the existing outdoors AR games sufficiently addresses the occlusion problem. In those games, players are typically allowed to interact with virtual objects provided that their in-between distance is below a certain threshold [2, 16]; some prototypes also require players to point their device towards the targeted virtual object in order to enable interaction [12, 15, 17]. The occlusion problem is partly addressed in ManHunt, which only examines whether the players and the virtual objects lie on the same road segment [17, 20]. However, ManHunt (incorrectly) always assumes lack of line-of-sight (LoS) when the involved parties lie on different streets (although their in-between space may be void). Aside from LoS, none of the existing pervasive games addresses the problem of determining the device’s ‘field of view’ (FoV), namely the 2D polygon which approximates the player’s sight.

In classic video games, the visibility of virtual objects is estimated utilizing the raycasting technique. Raycasting is the act of casting imaginary light beams (rays) from a source location (typically the point of view of the character or object controlled by the player) and recording the objects hit by the rays [25]. Herein, we extend this idea to pervasive gaming wherein, unlike video games, the game space is not pre-registered and occlusion is typically due to surrounding buildings. Our focus is on methods for determining LoS/FoV which satisfy critical requirements of pervasive games: real-time performance; anytime/anywhere game play; suitability for execution on average mobile equipment; support of popular map platforms.

Along this line, we introduce three alternative Geolocative Raycasting methods which allow pervasive game developers to detect buildings or custom-generated obstacles in location-based and AR game environments, thereby reliably resolving the object occlusion issue:

• The first method is equivalent to the raycasting technique utilized in video games to detect objects hit by bullets. Essentially, this method involves casting a sequence of rays (consecutive


rays are separated by a specific angle) with each ray progressively generating virtual locations until a virtual location is found to lie within a (building) polygon, i.e. until a ‘hit’ is detected.

• The second method also involves casting a sequence of rays, yet the intersection of the ray segments with the building polygons is calculated through an efficient geometric technique.

• The third method comprises an image-based variant of our first raycasting method; the game engine firstly processes a screenshot of the current player’s map view to identify the coordinates of buildings, which are drawn using a specific color code; thereafter, the exact locations where the rays hit buildings are found through examining the color of pixels corresponding to successive ray steps.

The performance of the three above listed methods has been thoroughly evaluated under realistic game play conditions. Building-related data are retrieved from open map data repositories. Hence, our raycasting techniques suggest a portable scheme which may be incorporated in any outdoors AR application and be utilized in any urban setting, provided that sufficient topographical data exist. As a case study, we have prototyped the proposed geolocative raycasting techniques in the context of Order Elimination, an outdoors AR game. Note that our work is regarded as complementary to research dealing with occlusion detection in AR applications. Most existing works enable the user to see virtual representations of physical objects whose positions are occluded in real scenes. Rather, we tackle the problem of detecting the occlusion of virtual objects hidden by physical obstacles (mainly buildings). We also take into account the mobility of both the players and the virtual characters, which makes occlusion management harder as it poses real-time execution requirements. Furthermore, our approach determines the FoV of handheld devices, which may be regarded as a generalization of occlusion handling and may be applicable to several AR applications other than gaming. Nevertheless, our methods are also suitable for estimating the LoS, i.e. whether the virtual character is along the device’s exact direction and not occluded by any building. The determination of LoS would be crucial in ‘first-person shooter’1-like mobile games where the player would have to point his device directly towards a virtual character to enable interaction. It is noted that a preliminary version of this work has been recently published [19].

The remainder of this article is structured as follows: Section 2 presents previous research related to our work. Section 3 describes the game mechanics of Order Elimination. Section 4 introduces our three geolocative raycasting algorithms. Section 5 reports the results of the performance tests conducted on the three FoV determination methods, discussing their relative strengths and shortcomings. Section 6 suggests game design workarounds for dealing with the inaccuracy of GPS-based localization in outdoors AR games. Section 7 presents the evaluation results of a game incorporating geolocative raycasting as well as alternative FoV determination techniques. Finally, Section 8 concludes our work and suggests directions for future research.

2 Related work

Notably, most existing pervasive games utilizing maps to visualize the game space neglect the LoS/FoV issue and enable the interaction of the player with in-game entities by solely

1 The term ‘first-person shooter’ refers to a video game genre centered on gun and projectile weapon-based combat through a first-person perspective; that is, the player experiences the action through the eyes of the protagonist.


considering their in-between straight line distance [2, 7, 16]. A number of pervasive games have been developed in the past utilizing AR to visualize in-game objects. In Epidemic Menace II (see Fig. 1a) players used, among other equipment, a mobile location-based AR system consisting of an HMD connected to a laptop carried in a backpack. That system projected 3D models of virtual viruses in AR according to the players’ location. The player had to approach the viruses in close distance and point their device towards them to eliminate them [12]. The interaction of the player with a virus is solely based on their in-between distance and the device’s orientation, without testing the LoS condition. TimeWarp (see Fig. 1b) is a mobile mixed reality game utilizing a mobile PDA AR system which augments the real environment taking into account the players’ location, in order to project virtual robots to the players. The evaluation trials of TimeWarp revealed the need to address the occlusion of virtual objects by their physical counterparts. For instance, if a virtual character moves behind a real building, then the character should be hidden from the player’s sight [30].

Fig. 1 (a) The HMD AR system of Epidemic Menace II; (b) AR projection of virtual characters in TimeWarp

Fig. 2 Screenshots taken from the ManHunt mobile application: (a) The player is not in LoS with the Pirate (they do not lie on the same road and a building blocks the player’s FoV); (b) the player is in LoS with the Pirate as they are on the same road segment; (c) the player is in LoS with the Pirate although they are not on the same road (a park lies in between, clearing the player’s FoV)


To the best of our knowledge, ManHunt2 [17] is the only pervasive game addressing the occlusion problem between players and in-game entities. In ManHunt the player acts as a Knight who has to run after and kill a runaway (virtual) Pirate before the latter reaches an escape point. ManHunt utilizes Google Maps to display the locations of the player (Knight) and the Pirate. The Knight needs to approach the Pirate within a distance of 60 m and shoot him, by tapping a ‘cannon’ button, provided that he is in LoS with him.

In order to check the LoS condition, the ManHunt game engine requests walking directions (from the Knight’s to the Pirate’s location) from the Google Directions API: it is assumed that no LoS exists if there is a ‘turn’ direction in the Google Directions API JSON route response (i.e. it is assumed that a building lies in between). The above described technique may be a useful instrument in a variety of outdoors location-based games. However, it fails to detect situations wherein LoS exists even if a ‘turn’ is in between two locations (e.g. when a park or a parking lot lies at the corner). This is illustrated in the example scenarios of Fig. 2, wherein the gray line denotes the route followed by the Pirate, while the black dashed line denotes the route recommended by the Google Directions API, which is invoked when the player tries to shoot the Pirate. While in the first two cases the game engine correctly determines the visibility between the player and the Pirate, in the third case it erroneously assumes lack of LoS although the park lying in between the player and the Pirate ensures FoV clearance.
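The ‘turn’-based heuristic described above can be sketched as follows. This is an illustrative reconstruction, not ManHunt’s actual code; the nested routes/legs/steps layout and the `maneuver` field merely mimic the shape of a Directions-style JSON response, and the sample response is fabricated.

```python
# Illustrative reconstruction of ManHunt's LoS heuristic: assume no LoS
# whenever the walking route between the two locations contains a 'turn'
# maneuver. Field names mimic a Directions-style JSON response (assumption).

def has_line_of_sight(directions_response: dict) -> bool:
    """Heuristic: LoS is assumed only if no step of any returned route
    involves a turn. As discussed above, this misfires when open space
    (e.g. a park) lies at the corner."""
    for route in directions_response.get("routes", []):
        for leg in route.get("legs", []):
            for step in leg.get("steps", []):
                if "turn" in step.get("maneuver", ""):
                    return False
    return True

# Fabricated example response: one 'turn-left' step, so no LoS is assumed.
response = {"routes": [{"legs": [{"steps": [
    {"maneuver": "straight"}, {"maneuver": "turn-left"}]}]}]}
print(has_line_of_sight(response))  # False
```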

Further to the problem discussed above, the Google Directions API applies strict usage limits with respect to the number of requests answered. As a result, the LoS inspection method implemented in ManHunt (a) does not reliably resolve the occlusion problem in realistic urban environments, and (b) is impractical for commercial AR pervasive game platforms, which should support large audiences of players playing simultaneously in different locations and often require real-time resolution of the occlusion problem (hence requiring unlimited use of web services).

The requirement to function anytime/anywhere becomes increasingly common among pervasive game prototypes [15–18]. This trend highlights the need for a LoS/FoV determination technique suitable for diverse outdoors locations (e.g. city streets or parks) and different time periods (e.g. during daytime or nighttime), namely under various lighting conditions. Typically, LoS/FoV determination methods should satisfy strict real-time requirements even when executed on resource-constrained mobile devices. Additionally, recent advances in pervasive gaming should be carefully taken into consideration: smartphones have become the most favorable platform for pervasive games development; interactive maps are widely used to visualize the game space; AR glasses are expected to have a profound impact on gaming in the near future [18]. Therefore, any LoS/FoV determination method addressed to pervasive games should appropriately address the above listed trends. Moreover, several pervasive games allow the interaction of players with in-game objects from a relatively long distance (e.g. 40–50 m) [16, 17]. Hence, the LoS/FoV should be determined up to relatively long distances to avoid player misconceptions.

To the best of our knowledge, existing FoV calculation and occlusion handling techniques fail to address the above discussed requirements, either featuring short FoV distance [1]; lacking support for smartphones [11]; being only applicable to indoors environments [32]; being sensitive to environmental light when performing accurate FoV determination (often using cameras), thereby failing to function during nighttime [31]; or performing too poorly to be considered as real-time FoV rendering options [22]. Notably, several relevant commercial products have been released recently,

2 The ManHunt Android application file as well as user evaluation statistics are available from: http://www.barbarossarpg.com/.


like Arduino Ping (see Fig. 3a), Project Tango (see Fig. 3b) and Structure Sensor (see Fig. 3c), which may be easily integrated with smartphones and utilized for FoV determination. Nevertheless, those products feature a short FoV determination distance (maximum 4 m); thus, they would not be applicable to outdoors pervasive games.

3 Order elimination

Order Elimination3 is a pervasive game adopting a scenario similar to that of ManHunt (see Section 2). In Order Elimination the player acts as a Soldier who tries to stop a running virtual Zombie before it approaches a group of victims in the city. In order to stop the Zombie, the Soldier has to approach it within a 50 m distance and shoot it, provided that the Zombie is within the player’s FoV.

On startup, the game engine requires a GPS fix of the player. Having obtained the location fix, it queries the OpenStreetMap (OSM) Overpass Turbo API,4 wherefrom it obtains the surrounding buildings stored in the OSM database.5 Then it pairs the Zombie’s current location with a random point located at least 300 m away and queries the Google Directions API to generate the walking route to be followed by the Zombie. The directions’ turning points (contained in the Google Directions API response) are utilized to construct a waypoint line and animate the Zombie’s marker on the map interface, visualizing a smooth movement along its route (see the dashed line in Fig. 4). Apart from map visualization, Order Elimination utilizes the BeyondAR framework6 to generate an AR view of the Zombie. The main contribution of this work is the replicable method used by the system to determine whether the Zombie lies within the player’s FoV. The FoV of the player is determined utilizing a form of geolocative raycasting, based on the algorithmic techniques presented in Section 4.
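The building request described above can be illustrated with an Overpass QL query. The paper does not show the exact query Order Elimination issues, so the radius, output format and the function below are assumptions; the `around` filter and `out geom;` statement are standard Overpass QL.

```python
# Sketch of the kind of Overpass QL query that could fetch building polygons
# around the player's GPS fix. The 300 m radius and JSON output are assumed
# example values, not the game's documented parameters.

def building_query(lat: float, lon: float, radius_m: int = 300) -> str:
    """Return an Overpass QL query selecting building ways within
    radius_m metres of (lat, lon), with their polygon geometry."""
    return (
        "[out:json];"
        f'way["building"](around:{radius_m},{lat},{lon});'
        "out geom;"
    )

print(building_query(50.8467, 4.3525))
# [out:json];way["building"](around:300,50.8467,4.3525);out geom;
```

The resulting string can be POSTed to an Overpass API endpoint; the `out geom;` statement makes each way carry its latitude/longitude vertices, matching the list-of-polygons description in footnote 5.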

Fig. 3 (a) Arduino Ping (https://store.arduino.cc/product/SEN136B5B) sends out a burst of ultrasound and listens for the echo when it bounces off an object, calculating the distance from the nearest object; (b) Project Tango (https://www.google.com/atap/project-tango/) allows spatial recognition and depth perception utilizing computer vision, image processing and special vision sensors; (c) Structure Sensor (http://structure.io) enables rapid 3D scanning of objects and people and 3D maps of interior spaces

3 A commercial version of Order Elimination is available in Google Play (https://play.google.com/store/apps/details?id=bl.on.mi.en)
4 http://overpass-turbo.eu/
5 The OSM Overpass API provides a textual description of buildings in a certain area, essentially a list of polygons, each comprising a series of latitude/longitude points.
6 http://www.beyondar.com/


4 FoV determination methods

This section introduces three novel FoV determination methods: raycasting, ray-polygon intersection and image-based raycasting. In addition to the player’s exact location, all these methods require the calculation of the device’s orientation (bearing7) based on measurements taken from the accelerometer and magnetometer sensors8 of the player’s device. Thereafter, the bearing is set as the center of the player’s FoV. The precise estimation of the player’s FoV (i.e. the exact 2D polygon which delimits the player’s sight) may then be undertaken by one of the alternative approaches discussed in the following subsections.
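Once the bearing is available (on Android it would typically come from `SensorManager.getRotationMatrix` and `SensorManager.getOrientation`), the FoV bounds shared by all three methods follow directly. A minimal sketch, where the 60° overall FoV width is an assumed example value rather than a figure taken from the paper:

```python
# Illustrative helper: given the device bearing (degrees clockwise from
# North) and an overall FoV width, compute the leftmost and rightmost ray
# bearings (angle_l and angle_r in the paper's notation). The bearing acts
# as the bisector of the FoV angle.

def fov_bounds(bearing: float, fov_angle: float = 60.0) -> tuple:
    """Return (angle_l, angle_r), each normalized to [0, 360)."""
    half = fov_angle / 2.0
    return ((bearing - half) % 360.0, (bearing + half) % 360.0)

print(fov_bounds(10.0))  # (340.0, 40.0)
```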

4.1 Raycasting (RC)

The raycasting algorithm utilized in Order Elimination progressively generates virtual locations along a straight line (each virtual location is at distance ds from the previous one, along a ray of dmax length) towards the player’s facing direction. A single ray would likely be sufficient to check the LoS condition; however, it is insufficient to accurately estimate the player’s FoV. Thus, the same process (i.e. the casting of rays) is repeated every angles degrees, from the leftmost to the rightmost FoV angle (anglel and angler, respectively), considering the current bearing of the device as the bisector of the FoV’s angle.

The above described method is illustrated in Fig. 5a, where 15 rays determine the player’s FoV in an area featuring buildings stored in the OSM database (blue-colored dots denote points invisible from the device’s current standpoint). When one of the ray steps (i.e. virtual location) is found to lie inside a polygon (building) of the surrounding

7 Bearing refers to the angle of a moving object’s direction from the North.

8 http://developer.android.com/guide/topics/sensors/sensors_overview.html

Fig. 4 Visualization of the Zombie’s route (obtained from the Google Directions API). The red line represents the Zombie’s route while blue dots represent the intermediate turning points along the route


buildings’ list,9 it is inferred that the ray has been blocked by a building; hence, further ray steps along that line are unnecessary. Through recording the ‘hit’ points of successive rays, the algorithm shapes a 2D polygon which approximates the player’s FoV (see Fig. 5b).

A pseudocode implementation of the RC algorithm is presented in Algorithm 1. The computational complexity of the RC algorithm is O(r ⋅ l ⋅ p), where r denotes the number of rays covering the FoV (it depends on the overall FoV angle and the incremental step for the FoV angle), l denotes the number of steps per ray (it depends on the maximum ray length and the incremental step for the ray steps) and p denotes the number of buildings considered in the raycasting process (it depends on the maximum ray length and the buildings’ density).
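The RC loop can be sketched in Python as follows. This is an illustrative reconstruction of the algorithm described above, not the paper’s Algorithm 1: the helper names, the spherical destination formula and the default parameter values (matching the experimental setup of Section 5) are our own.

```python
import math

EARTH_R = 6371000.0  # mean Earth radius in metres

def destination(lat, lon, bearing_deg, dist_m):
    """Great-circle destination point from (lat, lon) along bearing_deg."""
    phi1, lam1 = math.radians(lat), math.radians(lon)
    theta, delta = math.radians(bearing_deg), dist_m / EARTH_R
    phi2 = math.asin(math.sin(phi1) * math.cos(delta) +
                     math.cos(phi1) * math.sin(delta) * math.cos(theta))
    lam2 = lam1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(phi1),
                             math.cos(delta) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(phi2), math.degrees(lam2)

def point_in_polygon(pt, poly):
    """Even-odd rule on (lat, lon) vertex pairs, treated as planar
    (a reasonable approximation at building scale)."""
    x, y = pt
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[-1:] + poly[:-1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def raycast_fov(pos, bearing, buildings, fov=60.0, angle_s=2.0,
                d_s=3.8, d_max=50.0):
    """RC sketch: march each ray in d_s steps up to d_max; the FoV boundary
    point per ray is either the first step inside a building ('hit') or the
    ray's endmost point."""
    boundary = []
    a = bearing - fov / 2.0
    while a <= bearing + fov / 2.0:
        hit, d = None, d_s
        while d <= d_max and hit is None:
            step = destination(pos[0], pos[1], a, d)
            if any(point_in_polygon(step, b) for b in buildings):
                hit = step  # ray blocked by a building; stop stepping
            d += d_s
        boundary.append(hit if hit else destination(pos[0], pos[1], a, d_max))
        a += angle_s
    return boundary
```

In Order Elimination the recorded hit points are then drawn as the FoV polygon on the map, as described in the next paragraph.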

In Order Elimination, upon a ray step hitting a building, the point of impact of the blocked ray is saved in a vector; upon the completion of the raycasting process, those collision points are utilized to draw a polygon on the OSM map, providing a visual representation of the player’s FoV. Finally, utilizing the Google Maps Android API utility library the game engine easily inspects whether the Zombie marker lies inside the FoV polygon, thus determining whether the Zombie is within the Soldier’s sight. If this condition holds, both the Zombie’s AR and map markers turn red, allowing the players

9 https://github.com/googlemaps/android-maps-utils


to shoot the Zombie so as to win and end the game.10 In Fig. 6a, a nearby building only partially occludes the player’s FoV (the Zombie is viewed via the in-between road segment). In Fig. 6b, though, the Zombie is occluded by a nearby building; therefore, the player is not allowed to shoot.

When the incremental step for the FoV angle (angles) is set to 2°, even the smallest buildings stored in the OSM database are detected. Although improving performance, larger FoV angle incremental steps (e.g. 3°) are not advised as they often result in incorrect building detection (i.e. small buildings may be missed) and, thus, inaccurate FoV determination (see Fig. 7).

4.2 Ray-polygon intersection (RPI)

Our second FoV determination approach involves the implementation of an efficient, geometric ray intersection method. Initially, building polygons are deconstructed into pairs of vertices, each referring to a polygon side (line segment). Similarly to the RC approach, we generate a sequence of ray segments within the FoV angle (consecutive rays are angles degrees apart from each other), centered at the device’s bearing. The edges of each ray segment are set to the device’s location and the ray’s endmost point, dmax away (the maximum FoV distance).

To determine the FoV we then calculate the intersection points between each ray segment and the building polygon sides. For each ray segment, the intersection point which is nearest to the device is regarded as the furthest FoV point along this particular ray. Figure 8 demonstrates our method through a simplified scenario which involves five rays and a building polygon. The green circles denote the endmost points of the five ray segments, while the red circles denote the intersection points of the rays with the building polygon sides. The yellow-shadowed area (ABCDEF) represents the estimated FoV polygon. Note that the accuracy of the FoV estimation depends on the density of rays (i.e. their in-between angle). For instance, the triangle CC1C2 is erroneously considered to be within the player’s FoV (the triangle’s area would be smaller if rays were denser). However, it becomes evident that the ray intersection method is more accurate than the alternative raycasting method with respect to the estimation of the intersection points. In the ray intersection technique, intersection points are precisely determined through a simple geometric calculation (see points C, D, E in Fig. 8), while in the raycasting approach accuracy depends on the distance between consecutive ray steps.
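The per-ray geometric calculation can be sketched as a standard segment-segment intersection followed by a nearest-point selection. This is an illustrative reconstruction (not the paper’s Algorithm 2), using a planar approximation of the lat/lon coordinates, which is adequate at game-space scale.

```python
def segment_intersection(p1, p2, q1, q2):
    """Intersection point of segments p1p2 and q1q2, or None.
    Planar parametric form: P = p1 + t(p2-p1), Q = q1 + u(q2-q1)."""
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = q1, q2
    den = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if den == 0:
        return None  # parallel or collinear segments
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / den
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / den
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def nearest_hit(origin, ray_end, polygon):
    """RPI step for one ray: the nearest intersection of the ray segment
    with the polygon's sides, or the ray's endmost point if unobstructed."""
    best, best_d2 = ray_end, float("inf")
    for a, b in zip(polygon, polygon[1:] + polygon[:1]):
        pt = segment_intersection(origin, ray_end, a, b)
        if pt is not None:
            d2 = (pt[0] - origin[0]) ** 2 + (pt[1] - origin[1]) ** 2
            if d2 < best_d2:
                best, best_d2 = pt, d2
    return best

# Example: a ray from (0,0) to (10,0) crossing a square building; the near
# side is hit first, so the FoV boundary point along this ray is (4.0, 0.0).
square = [(4.0, -1.0), (6.0, -1.0), (6.0, 1.0), (4.0, 1.0)]
print(nearest_hit((0.0, 0.0), (10.0, 0.0), square))
```

For multiple buildings, `nearest_hit` would simply be taken over the union of all polygon sides; each ray contributes exactly one boundary point, as in Fig. 8.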

A pseudocode implementation of the RPI algorithm is presented in Algorithm 2. Note that the initialization phase (lines 1–8) is the same as that of the RC algorithm

10 As reported in the TimeWarp evaluation results, the virtual artifacts in AR games should preferably be hidden when the player has no LoS with them. Even though we could effortlessly hide the AR content when eye contact with the Zombie was infeasible, we chose not to for game design purposes.

Fig. 5 (a) Multi-angle raycasting; (b) approximation of the player’s FoV


(Algorithm 1). The computational complexity of the RPI algorithm (similarly to thecomplexity of the RC algorithm) is O(r ⋅ p).

Fig. 6 (a) Zombie lying within the player’s FoV; (b) Zombie lying out of the player’s FoV. The AR view of the Zombie is generated by the BeyondAR framework


4.3 Image-based raycasting (IRC)

Our third alternative method for FoV determination involves an image-based variant of the RC algorithm presented in Section 4.1. Firstly, we identify buildings through processing a screenshot of the game space’s OSM map outline and taking advantage of the different coloring scheme used to draw buildings on the map interface. The result of this processing is that we detect the buildings’ outline, namely we keep a record of the screen’s pixels associated with buildings. Then it is straightforward to examine whether a ray hits a building through examining the color of pixels corresponding to successive ray steps. Specifically, we ‘project’ the geographical coordinates of the ray step’s location to pixel coordinates upon the device’s screen; then, we extract the respective pixel color of the displayed game space’s OSM map outline. In the event that the pixel color matches the color code of the building polygons, the ray step is considered to lie within a building (i.e. a hit is detected).
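The IRC hit test can be sketched as follows. In the actual game the screenshot would be a bitmap of the map view; here a small 2D list of integer color codes stands in for it. The linear lat/lon-to-pixel projection and the building color value (OSM Carto’s #D9D0C9 fill) are assumptions for this sketch, not details taken from the paper.

```python
# Illustrative IRC sketch: project a ray step's geographic coordinates onto
# the screenshot and compare the pixel color against the (assumed) building
# color code used by the map renderer.

BUILDING_COLOR = 0xD9D0C9  # assumed OSM building fill color

def to_pixel(lat, lon, bounds, width, height):
    """Project geographic coordinates onto pixel coordinates for a map view
    delimited by bounds = (lat_min, lat_max, lon_min, lon_max)."""
    lat_min, lat_max, lon_min, lon_max = bounds
    px = int((lon - lon_min) / (lon_max - lon_min) * (width - 1))
    py = int((lat_max - lat) / (lat_max - lat_min) * (height - 1))  # y points down
    return px, py

def ray_step_hits_building(lat, lon, screenshot, bounds):
    """Return True if the pixel under this ray step carries the building
    color code, i.e. the ray step lies within a drawn building."""
    height, width = len(screenshot), len(screenshot[0])
    px, py = to_pixel(lat, lon, bounds, width, height)
    return screenshot[py][px] == BUILDING_COLOR
```

The ray-marching loop itself is identical to the RC method; only the hit test (pixel color instead of point-in-polygon) changes.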

Fig. 8 FoV determination utilizing the ray intersection approach

Fig. 7 Inaccurate FoV determination due to applying large incremental steps on the FoV angle


A pseudocode implementation of the IRC algorithm is presented in Algorithm 3. The computational complexity of the IRC algorithm (similarly to the complexity of the RC algorithm) is O(r ⋅ l).

5 Performance tests

FoV determination is subject to real-time constraints in most outdoor mobile games, as the time required to render the simulated FoV considerably affects the overall quality of experience perceived by players. The real-time execution requirement also derives from the dynamics of such applications, considering the mobility patterns of both the players and the virtual characters; namely, the implemented FoV determination methods are expected to execute fast enough to update the FoV upon any change of the player’s or virtual character’s location and/or the player’s orientation.

This section reports the results of performance tests executed on Order Elimination in order to assess the impact of various performance parameters under realistic game play conditions. Note that our proposed LoS/FoV determination methods involve some sort of preprocessing of the OSM building data, which are then stored in appropriate memory structures. Upon the startup of Order Elimination, data (vertices of building polygons) are retrieved through the Overpass Turbo API and utilized to generate a list of polygons, which is then overlaid on the OSM map interface. We apply a distance threshold to filter out polygons whose centroid is located further than a


specified distance from the player’s location. The set of nearby buildings is re-calculated upon every change of the player’s position. The application of a distance threshold slightly longer (~20 m) than the ray length ensures that: (a) the corners of buildings whose centroids are slightly beyond the ray’s reach are also detected; (b) the list of polygons does not need to be updated upon minor displacements of the player or the virtual character.
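This preprocessing step can be sketched as follows. The function names, the plain vertex-average centroid and the haversine distance are our own simplifications for illustration; the paper does not specify these details.

```python
import math

def centroid(polygon):
    """Vertex-average centroid of a (lat, lon) polygon -- adequate for the
    small, roughly convex footprints typical of OSM buildings."""
    lats, lons = zip(*polygon)
    return sum(lats) / len(lats), sum(lons) / len(lons)

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6_371_000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_buildings(player, polygons, ray_len_m, margin_m=20.0):
    """Re-run on every player relocation: keep only polygons whose centroid
    lies within the ray length plus a safety margin (~20 m)."""
    limit = ray_len_m + margin_m
    return [poly for poly in polygons
            if haversine_m(player, centroid(poly)) <= limit]
```

The margin ensures that a building whose centroid is just beyond the ray length, but whose near corner is within reach, is still retained.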

5.1 Experimental setup

The test game space has been set in the center of Brussels (Belgium), as the OSM database contains a large number of registered buildings in that area. The game space has been a square with a side length of 500 m, and the ray step (in the RC and IRC methods) has been set to 3.8 m to minimize the possibility of missing even small buildings. The angle step (among successive rays) has been set to 1° to enable highly accurate building detection. The player’s location has been switched every 2 sec between 2 fixed locations (see Fig. 9). Upon each relocation, the list of nearby buildings is updated when employing the RC and RPI methods (the overall number of buildings considered during the tests has been 1000). However, no similar (screenshot) processing is performed when employing the IRC method (as detailed below in this section, such processing is prohibitively time-consuming for real-time applications); hence, the map view processing is only performed at the application’s startup when considering IRC.

The duration of all testing sessions has been 60 sec; the device has been constantly rotated throughout each testing session (12 degrees/sec, namely 2 full rotations/session). We have conducted performance tests for all three approaches presented in Section 4, evaluating the effect of several performance variables, such as the ray length, the overall FoV angle and the number of buildings. The figures reported below represent the average among ten (10)

Fig. 9 The test area (in Brussels, Belgium) used in our performance experiments; the player’s location has been switched every 2 sec between 2 fixed locations represented by the blue dots


performance test measurements. All performance tests have been executed using a middle-range Android device (Samsung S3 Neo, quad-core 1.4 GHz CPU, 1.5 GB RAM).

5.2 Performance tests on LoS estimation

In our first experiment, we tested the performance of LoS estimation utilizing the three approaches discussed in Section 4. In this test we cast a single ray along the device’s bearing, aiming to determine whether the moving Zombie has been hit by this ray. The test involved 1000 buildings and a 100 m ray length, where the ray has not been occluded by any building. In this test we measured an average performance of 18.1, 4.4 and 2.8 msec for the RC, RPI and IRC methods, respectively (these figures only refer to the time required to determine LoS, without taking into account the time needed to perform the processing of building data). The above measurements indicate a clear performance advantage of the IRC method.

5.3 Performance tests on FoV determination

The remaining experiments focused on FoV determination. The first testing session evaluates the impact of the ray length on the measured performance. We conducted tests with the ray length spanning from 20 to 100 m. Note that the maximum distance allowed to enable interaction in previous pervasive game studies has been 50 m [12, 15–17]; hence, the 100 m maximum distance used in our tests far exceeds this requirement. The FoV angle has been set to 28° (see Fig. 10a) and 45° (see Fig. 10b); this has been decided not only to provide a sufficient FoV for in-game use but also to meet the visual field specifications of popular AR glasses11

and AR frameworks,12 respectively. Our reported results refer to the average execution time required to complete the whole FoV determination process employing the three evaluated methods.

RC outperforms the other approaches when considering ray lengths below 40 m, as in those cases the ray points inspected in the RC method are still few. However, when the ray length

11 Vuzix M100 Smart Glasses (https://www.vuzix.com/Products/M100-Smart-Glasses) supports a FoV angle of 15°, while Epson Moverio BT-200 (http://global.epson.com/newsroom/2014/news_20140107.html) supports 23°. Both products fully support smartphone integration.
12 The Android Augment Reality Framework (https://github.com/phishman3579/android-augment-reality-framework) and Mixare (http://www.mixare.org/) support a FoV angle of 45°.

Fig. 10 Performance of the evaluated FoV techniques (RPI, RC, IRC; execution time in msec vs. distance in m) as to the ray length for FoV angle set to (a) 28°; (b) 45°


becomes longer than 40 m, the RPI approach performs better. Evidently, the routine which determines whether a ray point lies within a building polygon (see line 18 in Algorithm 1) is computationally less expensive than the routine which calculates the intersection points among a ray segment and a polygon (see line 15 in Algorithm 2). As expected, the execution time increases with the ray length: the RC and RPI methods consider a larger number of buildings in the FoV calculation, while the RC and IRC methods undertake a larger number of ray steps. However, the RPI and IRC methods scale remarkably well, with IRC performing better in almost all test sessions. Although the number of ray steps increases in IRC, the checking of the respective pixel colors is executed rather fast (as the processed image pixel colors are kept in memory). Most importantly, the RC and RPI measurements include the time required for processing the building data (this processing is carried out 30 times per session), while no similar processing is involved in the IRC method. Interestingly, the expansion of the FoV angle (from 28° to 45°) does not much affect the average execution time, thus indicating the domination of the buildings’ processing, which does not depend on the FoV angle. Note that the measured performance of the RPI and IRC methods comfortably meets the requirements of real-time game applications (e.g. ~5 FoV calculations/sec for a FoV angle of 28° and ray length of 100 m). On the other hand, the RC approach can be safely applied to game scenarios where the maximum interaction distance between a player and an in-game entity is below 50 m.
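Algorithm 1 is not listed in this excerpt; a standard realisation of the point-in-polygon routine it references is the even-odd crossing test sketched below. This is a plausible stand-in, not necessarily the authors’ exact code.

```python
def point_in_polygon(pt, polygon):
    """Even-odd rule: count how many polygon edges a horizontal ray from pt
    crosses towards +x; an odd count means the point is inside."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Consider only edges that straddle the horizontal line through pt.
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that horizontal line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Each ray step then needs only this O(v) test per polygon (v vertices), whereas a segment-polygon intersection routine must additionally solve for the crossing points, which is consistent with the relative costs observed above.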

Another critical factor of FoV determination is the overall FoV angle. Wide viewing angles are definitely desirable; however, they require more rays to cover the respective visual spectrum, hence, more computational cycles. In order to assess the impact of the FoV angle on the overall performance we have conducted two tests, setting the ray length to 50 and 100 m, respectively. In both tests the FoV angle ranges from 5° up to 45°. This allows testing the proposed FoV determination techniques for diverse game scenarios. For instance, the 5° angle would suit games that involve shooting a bullet or casting a spell towards an enemy, wherein targeting must be fairly accurate (traditionally, classic video games utilize a single ray to confirm accurate targeting).13 On the other hand, wider FoV angles would suit games which should inform players that a virtual or physical object is ‘within sight’ along the player’s moving direction.
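The relation between the FoV angle, the angle step and the number of cast rays can be sketched as follows (the function name and signature are illustrative, not from the paper):

```python
def fov_bearings(heading_deg, fov_deg, step_deg=1.0):
    """Bearings (degrees, normalised to [0, 360)) of the rays sweeping a FoV
    centred on the device heading, spaced step_deg apart."""
    half = fov_deg / 2.0
    n = int(fov_deg / step_deg)  # number of angle increments
    return [(heading_deg - half + i * step_deg) % 360.0 for i in range(n + 1)]
```

With a 1° step, a 28° FoV costs 29 ray casts and a 45° FoV costs 46, which is why widening the FoV increases only the ray count, never the size of the building set.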

13 The intuition for choosing the 5° angle has been, firstly, to create a sufficient FoV polygon for the Zombie to be included in and, secondly, to allow a sufficiently wide FoV (wider than that of a single ray) so as to compensate for GPS location fix inaccuracies, which do not allow precise positioning calculations as in computer games.

Fig. 11 Performance of the evaluated FoV techniques (RPI, RC, IRC; execution time in msec vs. FoV angle in degrees) as to the FoV angle for ray length set to (a) 50 m; (b) 100 m


Our tests involved several FoV angles spanning from 5° to 45° (see Fig. 11). The results verify the prevalence of IRC among the three evaluated methods. All methods scale well, as the increased FoV angle only determines the number of rays cast and does not affect the number of buildings processed. Note that the effect of the ray length (i.e. the extension of the ray length from 50 to 100 m, as shown in Fig. 11a and b, respectively) is marginal in all the evaluated approaches. In the RC and IRC methods, this is because the longer rays typically hit a nearby building in their first steps (hence, the next steps are not examined). In the RPI method, the added computational overhead for checking the intersection of each ray with a larger number of buildings is very little. The test results demonstrate that the RPI technique can be safely incorporated in AR frameworks (i.e. with a 45° FoV angle) and meets the requirements of real-time applications.

In our last set of experiments we study the extent to which the number of buildings around the player affects the performance of the examined FoV determination approaches. We tested various building set scales (varying from 200 to 800 buildings) with a ray length of 100 m and a FoV angle of 28°. Obviously, higher building densities decrease the average performance of RC and RPI (Fig. 12), as a larger number of buildings is examined to determine potential collision events. This effect is more noticeable in the RC algorithm, which examines all buildings on every ray step. Interestingly, IRC exhibits a performance gain as the number of buildings increases. This is because ray casts are typically interrupted early, as they are more likely to hit a nearby building. This is not the case in RC though, where the processing of building data (prior to executing the FoV determination algorithm) dominates the overall execution time.

Figure 13 illustrates the execution time needed for the completion of 100 individual FoV determination procedures for all three evaluated methods. The peak values in the RC and RPI graphs mostly correspond to the processing of the buildings set, which is triggered by the device’s relocation events. The peak values in the IRC graph are mainly due to situations where several consecutive rays are not occluded by any building (hence, the algorithm iterates through the maximum possible number of ray steps).

Fig. 12 Performance of the evaluated FoV techniques (RPI, RC, IRC; execution time in msec vs. number of buildings) as to the number of surrounding buildings for ray length of 100 m and FoV angle of 28°


Fig. 13 Execution time (msec) over the test session time required for the completion of 100 individual FoV determination procedures for ray length 50 m and FoV angle 28°: (a) RC method, (b) RPI method, (c) IRC method


5.4 Limitations of the IRC method

As discussed above, the IRC method clearly prevails with respect to performance against the RC and RPI methods. This makes it possible to significantly increase the ray length and/or the total FoV angle, whilst maintaining real-time performance. Nevertheless, the IRC method is subject to a number of significant restrictions:

- Any change on the map view (e.g. due to a zoom level adjustment or the player’s relocation) impairs the accuracy of building detection, as it invalidates the mapping among the geo-coordinates of the ray steps and the screen’s coordinate system. As a result, upon such changes a new screenshot should be captured and processed to update the position of buildings (i.e. to identify the screen pixels associated with buildings). However, taking a screenshot upon every map view change is infeasible in contemporary smartphone platforms, as this process requires several seconds to complete, thereby interrupting real-time game play and increasing the risk of memory overloading.14

- The accuracy of pixel-based building detection is highly sensitive to the device’s screen resolution and pixel density (PPI).15 This implies that the accuracy of IRC-based FoV determination is often limited. For instance, the shaded areas shown in Fig. 14 are incorrectly considered to be part of the FoV although they are occluded by a building. Note that this example screenshot has been captured from a high-resolution screen; namely, the accuracy of the IRC method would be far worse on devices with lower resolution.

The above restrictions suggest that the IRC method could only be considered in games: (a) with no requirement for highly accurate occlusion detection; (b) bound to specific

14 http://developer.android.com/training/displaying-bitmaps/load-bitmap.html
15 http://developer.android.com/guide/practices/screens_support.html

Fig. 14 Illustration of inaccurate FoV determination in the IRC method. The screenshot has been captured from a smartphone with screen resolution 1280 × 720 displaying a map with zoom level 18


(preregistered) areas whose building data would be preloaded in the application; (c) with a predefined map zoom level which cannot be adjusted by the user.

6 Dealing with localization inaccuracy

A major challenge dealt with in FoV determination methods in outdoor pervasive games relates to the (in)accuracy inherent in GPS localization and OSM data [8]. The accuracy issues of GPS, especially in dense urban environments, have motivated researchers in the field of pervasive games to investigate methods to effectively handle imprecise GPS-based localization without breaking player immersion [2, 5]. OSM data are also often inaccurate, as they are essentially based on crowdsourced GPS traces [13].

Fortunately, popular mobile platforms allow applications to sense the estimated accuracy of the integrated GPS receiver.16 In the example of Fig. 15a, the device is estimated to be located at point A with accuracy equal to r. The worst case scenario is that the device’s true location is B or C. In the absence of any buildings in the vicinity of the device, the calculated FoV for the estimated device’s location is the circular sector ADG. However, due to the GPS accuracy effect, the area which can be safely assumed to be part of the true FoV is confined to the yellow-shadowed area (note that this area should be even smaller, since the true location of the device could be towards the upper or the bottom part of the circle). Clearly, the size of this area increases as the accuracy value r decreases. In the case that the virtual character is currently within the circular sectors AHE or AFI, it is less safe to assume that it is within sight; as a

16 Android applications may obtain the GPS accuracy, defined as the radius of 68 % confidence. That is, there is a 68 % probability that the true location is inside a circle centered at the estimated location, where the circle’s radius is equal to the accuracy value. For example, if the accuracy value is 10, then there is a 68 % chance that the true location of the device is within 10 m of the reported coordinates.


Fig. 15 (a) Illustration of the uncertainty in FoV determination due to the inaccuracy of GPS localization; (b) a building in front of the player hides the Zombie (in this case, the calculated FoV matches the true player’s FoV)


result, this uncertainty should be somehow ‘translated’ to some sort of visual clue and/or be treated properly by the game engine.
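One way such a treatment could look in game logic is sketched below. A displacement of the device by up to the accuracy radius r can shift the bearing to a target at distance d by at most asin(r/d), so the FoV edges are padded by that margin; the three-way classification and all names are our own illustration, not part of the paper’s implementation.

```python
import math

def classify_target(distance_m, bearing_offset_deg, fov_deg, accuracy_m):
    """Classify a target as 'visible', 'uncertain' or 'hidden' given the GPS
    accuracy radius. bearing_offset_deg is the target's angular offset from
    the device heading."""
    if distance_m <= accuracy_m:
        # The target may lie anywhere around the true location.
        return "uncertain"
    # Maximum bearing error induced by a displacement of accuracy_m.
    margin = math.degrees(math.asin(accuracy_m / distance_m))
    half = fov_deg / 2.0
    off = abs(bearing_offset_deg)
    if off <= half - margin:
        return "visible"    # safely inside the true FoV
    if off <= half + margin:
        return "uncertain"  # within the shrunken edge sectors (AHE/AFI)
    return "hidden"
```

A game engine could then render the ‘uncertain’ band in a different color on the map, echoing the shaded treatment of Fig. 16.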

To investigate the effect of GPS accuracy in practical settings we performed the following test: firstly, the BeyondAR view has been divided into 45 equal vertical parts (since BeyondAR supports a 45° FoV angle); then a 45° FoV has been calculated utilizing 1° angle increment steps (so that each ray corresponds to a single vertical part of the BeyondAR view). For each ray hitting a building, the corresponding part of the BeyondAR view has been darkened to highlight the part of the player’s FoV which is blocked by the building. We have found that the calculated FoV has been acceptably accurate under realistic game play conditions (with GPS accuracy measured at 5–10 m, which is regarded as average for GPS fixes acquired by smartphone devices in urban environments [19]), when the ray length has been kept up to 30 m. This is illustrated in Fig. 15b, where the left boundary of the darkened area almost coincides with the left boundary of the underlying building.
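The strip-darkening step of this test can be sketched as follows (a hypothetical helper: it receives one hit flag per 1°-spaced ray and returns the pixel ranges of the vertical strips to darken):

```python
def darkened_strips(ray_hits, screen_width_px):
    """Map each ray of the sweep to one vertical strip of the AR view and
    return the (left, right) pixel ranges of the strips whose ray hit a
    building -- these are the strips to darken."""
    n = len(ray_hits)
    strip_w = screen_width_px / n
    return [(int(i * strip_w), int((i + 1) * strip_w))
            for i, hit in enumerate(ray_hits) if hit]
```

With 45 rays on a 45° view, each strip covers exactly 1° of the camera image, so a darkened strip corresponds directly to the occluded bearing.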

However, the accuracy of GPS fixes often exceeds 10 m, thereby increasing localization uncertainty. Several pervasive game developers have proposed the exploitation of GPS uncertainty as an in-game element, making use of the GPS shadows metaphor (GPS shadows refer to areas with poor GPS accuracy wherein players or virtual characters may ‘hide’) [2]. As regards outdoor chase games like Order Elimination, we propose an analogous treatment: upon detecting poor GPS accuracy, the map view may use different colors to indicate the areas which may be safely or uncertainly regarded to be within the FoV. For instance, the grey-shadowed areas in Fig. 16 represent the areas of uncertainty. Unfortunately, no suitable visual indicator exists for the equivalent camera view, as (due to localization uncertainty) the game engine cannot recognize what part of the physical surroundings is actually displayed on the device’s screen (potential FoV fine-tuning through processing the device’s camera image would also be unfeasible given the computational capabilities of modern mobile devices). To treat uncertainty as a game element, the Zombie could become invisible when lying within the shadow area or require more than one bullet to be killed.

Fig. 16 Shadows (represented by grey areas) generated based on GPS accuracy (25 m)


7 Evaluation

The FoV generation technique presented in this article has been evaluated by 12 players.17 Our emphasis has been to investigate the extent to which the consideration of nearby buildings in determining the FoV affected the players’ enjoyment and interest towards the game. In order to meet this objective, we have developed a test application (sharing the same concept with Order Elimination). In addition to the FoV generation proposed in this work,18 this application additionally implements two other techniques widely used in pervasive games.

When unobstructed, the FoV derived by the first method is visualized as a circular sector (generated by the multi-angle raycasting process). The circular sector is transformed in real-time into a complex polygon shape when a building is detected in front of the user (see Fig. 17a). The second method does not take buildings into account (see Fig. 17b). Hence, the triangle rotates according to the player’s orientation and its shape remains unaffected by nearby obstacles. That method is widely used in AR games where the player is assumed to have a limited FoV; this method has been found to be mostly affected by the occlusion problem in past studies [12, 15, 17]. The third method involves a circle denoting the area within which the Zombie should lie in order to allow the player to shoot it (see Fig. 17c). The latter method is also commonly employed in pervasive games, where the distance circle might be visible (as in Ingress [16]) or invisible (as in CYSMN? [2]); that method aims at determining the players’

17 The evaluators have been recruited through an open invitation advertised in the University of the Aegean, Mytilene, Greece. Ten (10) of the evaluators have been male and two (2) female. Six (6) evaluators have been in the age group of 19–23, four (4) in the age group of 24–33 and two (2) in the age group of 34–39.
18 The application distributed to evaluators implemented the RC algorithm. Since this method performs worse than RPI, it serves the purpose of letting evaluators test the application under the worst possible performance conditions. Note that the IRC method has not been an option, as we have chosen not to confine the evaluation within a pre-registered area or prevent users from zooming in/out of the map view.

Fig. 17 FoV generation types: (a) geolocative raycasting (the Zombie is ‘hidden’ behind a building); (b) limited FoV generation method that neglects the occlusion due to nearby buildings; (c) interaction with virtual objects enabled when the proximity condition holds


Fig. 18 (a) Avocation with video games; (b) familiarity with technologies used in Order Elimination

Fig. 19 User assessment of (a) FoV visualization; (b) interest and fun element of the buildings recognition feature; (c) most preferable FoV generation method; (d) importance of real-time building recognition; (e) performance of FoV rendering


proximity with important in-game entities (like enemies), where the player is only required to approach them in order to interact with them, disregarding the player’s direction [17].

In order to evaluate the usability and quality of experience aspects of Order Elimination, the test application presented above has been handed to the evaluation trial participants, allowing them sufficient time (1 week) to watch an introductory video and test the application.19 Finally, the evaluators have been invited to anonymously fill in an online questionnaire20 in order to convey their experience and assess qualitative aspects of Order Elimination.

The participants have been first questioned about their avocation with video games and familiarity with certain technologies utilized in Order Elimination. As indicated by the answers to those demographic questions, most players frequently play video games (see Fig. 18a), including popular titles like Bioshock,21 Sniper Elite,22 World of Warcraft23 and Assassin’s Creed,24 games in which virtual obstacles are taken into account to determine the FoV, as the player cannot interact with in-game entities (e.g. shoot a bullet or cast a spell against an enemy) when virtual buildings or obstacles interrupt her FoV. Also, all players have been familiar with GPS-based localization and interactive maps, while some have had experience with mobile AR applications (see Fig. 18b).

Proceeding to the evaluation of FoV aspects, we have firstly asked the participants to assess the simulation of a certain FoV versus the use of the distance circle in the context of a pervasive game. To a large extent, the players argued that the use of a limited FoV made the game more realistic, interesting and enjoyable (see Fig. 19a). Moreover, the consideration of surrounding buildings in adapting the players’ FoV has been positively perceived by the evaluators, as it increased their interest towards the game and their enjoyment while playing due to being more consistent with humans’ cognitive perception of the physical environment (see Fig. 19b).

The above finding agrees with the ranking of the three available FoV modes, as all players expressed their preference for the viewing method which takes into account nearby buildings. Interestingly, the two other FoV modes received the same preference ratio by the players, revealing that taking into account nearby buildings in generating the FoV is the aspect that mostly affects the player’s quality of experience (see Fig. 19c). Further substantiating this argument, the majority of participants argued that the real-time building recognition while playing the game positively affected the interest and fun elements of the game (see Fig. 19d). Finally, the players found the performance (responsiveness) of the FoV generation satisfactory for in-game usage25 (see Fig. 19e).

19 A video of the application can be found at https://www.youtube.com/watch?v=D–A3fEghbA. The Android Application File (APK) can be downloaded at http://zarcrash.x10.mx/OrderEliminationc.apk.
20 The questionnaire can be found at http://zarcrash.x10.mx/OrderEliminationGoogleForm.pdf
21 https://www.2kgames.com/bioshock/
22 http://www.sniperelite3.com/
23 http://eu.battle.net/wow/en/
24 http://assassinscreed.ubi.com/en-gb/home/
25 The application distributed to evaluators has been configured with the ray length set to 100 m, the FoV angle set to 28° and a game space of 250,000 m2 (square with side length of 500 m), while the mobile devices used by players varied in technical specifications (Sony Xperia S, Samsung (Galaxy S4 Mini, Beam, Note 4, S4), Nexus (4 & 6), F&U Tablet and Motorola Moto G 2nd Generation).


8 Conclusions and future research

In this paper we introduced three novel methods to deal with the occlusion handling problem in outdoor AR applications. Our methods are able to efficiently generate a realistic FoV (consistent with the device’s location and orientation) through the real-time detection of buildings in the vicinity of the player. The generated FoV may be subsequently used to verify whether a single or multiple virtual objects are ‘within sight’ or occluded by nearby buildings. All three methods cast a sequence of rays (covering a specific FoV angle) and estimate the potential location where each ray hits a (building) polygon. The RC method progressively generates virtual locations along a ray segment until a hit is detected; the RPI method utilizes an efficient geometric technique to accurately calculate the ‘collision’ points; the IRC method follows the same approach as RC, however, it examines the color of the pixels corresponding to successive ray steps to detect hits.

Our geolocative raycasting methods have been tested in real settings by measuring their performance (i.e. the execution time required to complete the FoV generation process) with respect to several performance variables (ray length, FoV angle, number of surrounding buildings). We have found evidence that real-time FoV rendering is feasible even on modest mobile devices for ray lengths up to 100 m and FoV angles up to 45°, under stress conditions (densely built urban areas, constant device rotation). Our test results demonstrated the prevalence of the IRC method; however, this is achieved at the expense of poor accuracy and limited flexibility (in real applications the game would be bound to preregistered areas and the players would not be allowed to adjust the map’s zoom level). The method that has been found to better meet the requirements for real-time performance and anytime/anywhere game play is RPI.

Our geolocative raycasting technique has been incorporated within Order Elimination, an outdoor AR game. The user evaluation of Order Elimination revealed that the consideration of surrounding buildings may significantly improve the quality of experience perceived by players compared to alternative approaches which only take into account the straight-line distance to enable interaction among players and in-game objects. The FoV determination methods presented in this work are applicable to pervasive games in order to resolve the occlusion problem, by hiding virtual objects when not within the player’s sight. Similarly, our methods could be useful in a variety of location-based applications (e.g. games, tourist guides, special-purpose navigators, etc.) which need to draw the user’s FoV (or adjust the appearance of POI markers) on a map interface, play appropriate audio/video clips and so on.

In the future, we plan to implement several performance speedup techniques for our FoV determination methods. Firstly, we plan to test the offline processing of topographical information so as to merge nearby (e.g. semi-detached) buildings into aggregated structures, thereby reducing the overall number of buildings and boosting performance. We also plan to investigate methods for partitioning the game space into separate areas and applying a distance threshold fairly larger than the ray length; the set of buildings would then be updated only upon the player moving from one area to another, significantly reducing the frequency of such updates and increasing performance.
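The partitioning idea above could be sketched as follows. This is a hedged illustration only (the cell size, margin, and all names are hypothetical, and planar coordinates are assumed): the building set is re-queried only when the player crosses into a new cell, not on every FoV update.

```python
import math

CELL = 500.0        # partition cell size in metres (assumption)
RAY_LEN = 100.0     # maximum ray length used for FoV generation
MARGIN = RAY_LEN    # threshold "fairly larger" than the ray length

def cell_of(x, y):
    """Map a position to its partition-cell index."""
    return (math.floor(x / CELL), math.floor(y / CELL))

class BuildingCache:
    """Cache the nearby-building set per partition cell, so that the
    (expensive) spatial query runs only when the player changes area."""
    def __init__(self, query_fn):
        self.query_fn = query_fn   # query_fn(x, y, radius) -> buildings
        self.cell = None
        self.buildings = []

    def get(self, x, y):
        c = cell_of(x, y)
        if c != self.cell:         # player entered a new area
            self.cell = c
            # fetch everything within the cell plus a safety margin,
            # so no building within ray range is ever missed
            self.buildings = self.query_fn(x, y, CELL + MARGIN)
        return self.buildings
```

Because the query radius exceeds the cell size by at least the ray length, rays cast from anywhere inside the current cell cannot reach a building absent from the cached set; the cost of the spatial query is amortized over all FoV updates performed within that cell.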

References

1. Behzadan AH, Kamat VR (2010) Scalable algorithm for resolving incorrect occlusion in dynamic augmented reality engineering environments. Comput-Aided Civil Infrastruct Eng 25:3–19

2. Benford S, Crabtree A, Flintham M, Drozd A, Anastasi R, Paxton M (2006) Can You See Me Now? ACM Trans Comput-Human Inter 13:100–133

3. Benford S, Magerkurth C, Ljungstrand P (2005) Bridging the physical and digital in pervasive gaming. Commun ACM 48:54–57

4. Billinghurst M, Clark A, Lee G (2014) A survey of augmented reality. Foundations Trends Human–Comput Inter 8:73–272

5. Broll W, Ohlenburg J, Lindt I, Herbst I, Braun AK (2006) Meeting technology challenges of pervasive augmented reality games, Proc 5th ACM SIGCOMM Workshop Network System Support Games, 28–39

6. Capra M, Radenkovic M, Benford S, Oppermann L, Drozd A, Flintham M (2005) The multimedia challenges raised by pervasive games, Proc 13th Annual ACM Int Conf Mult, 89–95

7. Cheok AD, Sreekumar A, Lei C, Thang LM (2006) Capture the Flag: Mixed-reality social gaming with smart phones. IEEE Pervasive Comput 5:62–63

8. Ciepłuch B, Jacob R, Mooney P, Winstanley A (2010) Comparison of the accuracy of OpenStreetMap for Ireland with Google Maps and Bing Maps, Proc 9th Int Symp Spatial Accuracy Assessment Natural Res Environ Sci, 337

9. Clark A, Piumsomboon T (2011) A realistic augmented reality racing game using a depth-sensing camera, Proc 10th Int Conf Virtual Reality Continuum Appl Industry, 499–502

10. Dey A, Sandor C (2014) Lessons learned: Evaluating visualizations for occluded objects in handheld augmented reality. Int J Human-Comput Studies 72:704–716

11. Fischer J, Huhle B, Schilling A (2007) Using time-of-flight range data for occlusion handling in augmented reality, IPT/EGVE, 109–116

12. Fischer J, Lindt I, Stenros J (2006) Final crossmedia report (part II) – Epidemic Menace II evaluation report, Integrated Project Pervasive Gaming

13. Haklay M, Weber P (2008) OpenStreetMap: User-generated street maps. Pervasive Comput 7:12–18

14. Hayashi K, Kato H, Nishida S (2005) Occlusion detection of real objects using contour based stereo matching, Proc 2005 Int Conf Augmented Tele-existence, 180–186

15. Herbst I, Braun A-K, McCall R, Broll W (2008) TimeWarp: interactive time travel with a mobile mixed reality game, Proc 10th Int Conf Human Computer Interaction Mobile Devices Services, 235–244

16. Hodson H (2012) Google's Ingress game is a gold mine for augmented reality, New Scientist, 19

17. Kasapakis V, Gavalas D (2014) Blending history and fiction in a pervasive game prototype, Proc 13th Int Conf Mobile Ubiquitous Mult, 116–122

18. Kasapakis V, Gavalas D (2015) Pervasive gaming: Status, trends and design principles. J Netw Comput Appl 55:213–236

19. Kasapakis V, Gavalas D (2015) Geolocative raycasting for real-time buildings detection in pervasive games, Proc Int Workshop Network Syst Support Games, 1–3

20. Kasapakis V, Gavalas D, Bubaris N (2015) Pervasive games field trials: Recruitment of eligible participants through preliminary game phases. Pers Ubiquit Comput 19:523–536

21. Kim H, Yang S-j, Sohn K (2003) 3D reconstruction of stereo images for interaction between real and virtual worlds, IEEE ACM Int Symp Mixed Augmented Reality, 169–176

22. Lepetit V, Berger M-O (2000) Handling occlusion in augmented reality systems: a semi-automatic method, Proc IEEE ACM Int Symp Augmented Reality, 137–146

23. Livingston MA, Ai Z, Karsch K, Gibson GO (2011) User interface design for military AR applications. Virtual Reality 15:175–184

24. Ohta Y, Sugaya Y, Igarashi H, Ohtsuki T, Taguchi K (2002) Client/server depth sensing for see-through head-mounted displays. Presence: Teleoperators Virtual Environ 11:176–188

25. Schroeder J (2013) AndEngine for Android game development cookbook, Packt Publishing Ltd

26. Shah MM, Arshad H, Sulaiman R (2012) Occlusion in augmented reality, Proc 8th Int Conf Inf Sci Digital Content Technol, 372–378

27. Thomas BH (2012) A survey of visual, mixed, and augmented reality gaming. ACM Comput Entertain 10:1–33

28. Tian Y, Guan T, Wang C (2010) Real-time occlusion handling in augmented reality based on an object tracking approach. Sensors 10:2885–2900

29. Tian Y, Long Y, Xia D, Yao H, Zhang J (2015) Handling occlusions in augmented reality based on 3D reconstruction method. Neurocomputing 156:96–104

30. Wetzel W, Blum L, McCall R, Oppermann L, Broeke TS, Szalavári Z (2009) Final prototype of TimeWarp application, IPCity

31. Yang T, Pan Q, Li J, Li SZ (2005) Real-time multiple objects tracking with occlusion handling in dynamic scenes, IEEE Comput Soc Conf Comput Vision Pattern Recognition, 970–975

32. Zhu J, Pan Z, Sun C, Chen W (2010) Handling occlusions in video-based augmented reality using depth information. Comput Animation Virtual Worlds 21:509–521



Kasapakis Vlasios PhD student (subject area: "Pervasive Games") at the Dept of Cultural Technology & Communication (University of the Aegean). Holds a degree (2007) in Cultural Technology and Communication from the Dept of Cultural Technology & Communication (University of the Aegean) and an MSc (2009) in Cultural Informatics and Communication from the same department. Has carried out research in the fields of mobile & pervasive games/systems and co-authored 12 scientific publications. Adjunct lecturer at the Athens School of Fine Arts (2010–2011) and instructor in various lectures on pervasive games and game design. Currently employed as a research associate of the Computer Technology Institute and Press "Diophantus" (Patras, Greece) in EU-funded research projects.

Damianos Gavalas Associate Professor at the Dept of Cultural Technology & Communication (University of the Aegean) and Adjunct Professor at the Hellenic Open University. Holds a degree (1995) in Informatics from the University of Athens and an MSc and PhD (1997 and 2001, respectively) in Electronic Engineering from the University of Essex (UK). Has carried out research in the fields of mobile & pervasive applications/systems, web technologies and wireless networking, and co-authored over 120 scientific publications. Has supervised two PhD dissertations and dozens of final-year and MSc projects. Editorial board member of three scientific journals in the areas of ubiquitous and mobile computing. Technical program committee member of dozens of leading conferences and guest editor of highly ranked journals in the field of mobile computing and wireless communications. Has participated as key researcher and/or coordinator in several European and National research and development projects.
