

Augmented Reality in Cultural Heritage: Field of View Awareness in an Archaeological Site Mobile Guide

Vlasios Kasapakis a,*, Damianos Gavalas a and Panagiotis Galatis b

a Department of Cultural Technology and Communication, University of the Aegean, Mytilene, Greece
b School of Science & Technology, Hellenic Open University, Patras, Greece

Abstract. Over the past few years, augmented reality (AR) has been increasingly diffused among location-based applications. A situation often encountered in AR applications is the partial or full occlusion of -physical or virtual- objects by physical obstacles. Existing outdoors AR applications overlook this issue, displaying visual indicators about nearby annotated objects even when those are hidden behind a building, hence, being out of the users’ field of view (FoV). This paper presents a geolocative raycasting technique which enables real-time building recognition towards the device’s pointing direction, thereby estimating the FoV of the users. The objective is to either hide or utilize appropriate visual metaphors for occluded objects/locations, thereby improving the users’ perception of their surrounding space. The motivation for developing our occlusion handling technique has been drawn from KnossosAR, an outdoors mobile AR guide implemented for the archaeological site of Knossos (in Crete, Greece). We have conducted field trials which provided preliminary evidence of the efficiency, effectiveness and utility of KnossosAR (including the incorporated FoV estimation approach).

Keywords: Augmented Reality; occlusion; raycasting; field of view; cultural heritage; archaeological site.

1. Introduction

Augmented reality (AR) applications typically require only a limited amount of the user’s field of view (FoV) to be rendered with computer-generated graphics, while the major part of the user’s FoV is dominated by the physical world [4, 27]. Allowing users to view the physical world provides them with a better sense of where they are and what is around them. Nevertheless, in practical outdoors settings it often occurs that a physical obstacle occludes an annotated object/location. For instance, in urban environments surrounding buildings are highly likely to occlude a point of interest (POI). Those POIs are treated indiscriminately by mobile location-based AR frameworks, regardless of whether they are actually within the FoV of the user or not. However, displaying occluded objects (e.g. overlaying a marker or any sort of interpretive information about a hidden POI) often results in misconceptions and wrong pursuance of tasks amongst users [25, 28], thereby compromising the clarity and explicitness of AR applications.

*Corresponding author. E-mail: [email protected]

Höllerer et al. [12] have recognized the problem of POIs occluded by physical obstacles (such as buildings) and argued that AR projections should be appropriately adjusted to convey correct depth information. However, existing occlusion handling approaches either require (offline) registration of the physical environment (therefore, they cannot be easily relocated elsewhere) or involve computationally hard image-processing tasks.

Nevertheless, typical outdoors AR applications entail far stricter requirements: real-time performance; anytime/anywhere execution (hence, support for consuming open geodata, especially topography/building data); suitability for executing on average mobile equipment. Moreover, many applications would benefit from accurately estimating the user’s FoV, which is a problem more generic than occlusion estimation: while occlusion detection refers to testing the Line of Sight (LoS) condition between the user and a specific point (see Fig. 1a), FoV refers to the whole area lying within the user’s view in the 2D or 3D space (see Fig. 1b). Unlike existing methods, our focus is on methods for determining LoS/FoV while also satisfying the above listed requirements of outdoors AR applications.


Fig. 1. (a) LoS estimation (POI A is occluded by a physical obstacle as the line connecting the device and POI intersects with the obstacle’s polygon); (b) FoV estimation (only the obstacle and POI B are considered to lie within the user’s view as they are tangent to the FoV polygon).

In classic video games, the visibility of virtual objects is estimated utilizing the raycasting technique. Raycasting refers to the act of casting imaginary light beams (rays) from a source location (typically the point of view of the character or object controlled by the player) and recording the objects hit by the rays [23]. Herein, we extend this idea to outdoors AR applications wherein, unlike video games, the virtual space is integrated with the physical one, is not pre-registered, and occlusion is typically caused by surrounding buildings. In particular, we introduce a geolocative raycasting technique that allows AR application developers to detect buildings (or custom-generated obstacles) in outdoors location-based AR applications, thereby reliably resolving the object occlusion issue. Note that our proposed method is appropriate to capture the occlusion of both virtual and physical objects by real obstacles. We have tested the portability and performance of the raycasting algorithm in real settings, utilizing open map (building) data from OpenStreetMap (OSM).

The motivation for developing our occlusion handling technique has been drawn from the development of KnossosAR, an outdoors mobile AR guide implemented for the Unesco world heritage -archaeological- site of Knossos (in Crete, Greece). Namely, the feedback received on early prototypes of KnossosAR (by the members of a focus group) revealed the need for real-time FoV estimation in outdoors mobile AR applications. We have conducted field trials which provided preliminary evidence of the efficiency, effectiveness and utility of KnossosAR (including the incorporated FoV estimation approach).

The remainder of this article is structured as follows: Section 2 presents previous research related to our work. Section 3 introduces our geolocative raycasting algorithm for determining FoV in mobile AR applications. Section 4 discusses the design and implementation details as well as the user evaluation results of KnossosAR. Finally, Section 5 concludes our work.

2. Related work

When designing AR experiences, it is important to take into account the perceptual (the ability to recognize and interpret visual stimuli) and cognitive (the ability to reason about those stimuli) abilities of humans [9]. This especially holds for AR application scenarios supporting the amorphous exploration of the environment, e.g. in tourism [18, 34], heritage sites [30], mobile education [31] or games [27]; those are addressed to mobile users, largely unfamiliar with the environment, therefore the virtual overlay has to “enrich and explain, rather than clutter and confuse, the user’s physical surroundings” [3]. This means that, irrespective of application and display, it is fundamentally important to deliver a clear representation of meaningful information, in a way that enhances perceptual learning and prevents cognitive overload. Along this line, the detection of (and the use of appropriate means for displaying) occluded structures and objects challenges the objective to enhance the clarity and explicitness of mobile AR applications [7, 20, 35]. Further to addressing occlusion, other crucial considerations for mobile AR applications are portability, high performance, context sensitivity and support for resource-constrained handheld devices [13, 31, 35].

The problem of occlusion in AR is noticeable in a variety of mobile AR, location-based guides developed for tourist destinations and cultural heritage sites. TripAdvisor² and mTrip³ are popular mobile travel applications which provide reviews of travel-related content. Both of them offer an AR projection mode for POIs, superimposing AR markers upon the device’s camera views. A similar approach is adopted by Layar⁴ [19] and Wikitude⁵ [34], two popular mobile AR frameworks. However, none of the abovementioned projects takes into account the occlusion caused by nearby buildings when projecting POIs onto the users’ screen (see Fig. 2).

² http://www.tripadvisor.com/
³ http://www.mtrip.com/
⁴ https://www.layar.com/
⁵ http://www.wikitude.com/

Fig. 2. Example TripAdvisor AR view: markers referring to two POIs (restaurants) occluded by other buildings are projected to the mobile device’s screen without conveying depth information.

The occlusion problem has received considerable attention in the field of AR research. The approaches proposed for handling the occlusion problem in AR are mainly classified into two types [29]: model-based and depth-based approaches. Model-based methods require the accurate (offline) 3D modeling of the real environment’s objects [11, 15], thus limiting their portability. Depth-based methods rely on some sort of image processing (e.g. stereo matching) to acquire depth information of real objects, which is computationally expensive, hence not applicable to real-time applications [16, 21]. Moreover, other approaches feature short FoV distance [2], lack support for smartphones [8] or are sensitive to environmental light (often relying on cameras) to perform accurate FoV determination, thereby failing to function during nighttime [33].

Notably, several relevant commercial products have been released recently, like Arduino Ping, Project Tango and Structure Sensor, which may be easily integrated with smartphones and utilized for FoV determination. Nevertheless, those products feature short FoV determination distance (maximum 4 m); thus, they would not be applicable to outdoors AR applications.

To the best of our knowledge, the only portable location-based application partially addressing the occlusion problem is ManHunt [14], a pervasive game which only examines whether the player and a virtual character lie on the same road segment (through inspecting the walking directions from the player’s to the virtual character’s location, as derived from the Google Directions API). However, ManHunt -incorrectly- assumes lack of LoS when the involved actors lie on different streets (although a park could lie in between, thus ensuring FoV clearance). Aside from LoS, no mobile AR application deals with the problem of determining the device’s FoV, namely the 2D polygon which approximates the player’s sight.

To address the aforementioned issues, we introduce an efficient geolocative raycasting method which allows developers to detect buildings within the device’s FoV in location-based and AR game environments, thereby reliably resolving the object occlusion issue. Our raycasting technique suggests a portable scheme which may be incorporated in any outdoors AR application and be utilized in any urban setting, provided that sufficient topographical data exist.

3. Geolocative Raycasting

This section introduces our novel geolocative raycasting method for FoV determination in outdoors location-based applications. In addition to the user’s exact location, our method requires the calculation of the device’s orientation (bearing⁶) based on measurements taken from the accelerometer and magnetometer sensors of the user’s device. Thereafter, the bearing is set as the center of the user’s FoV.

The raycasting algorithm progressively generates virtual locations along a straight line (each virtual location is at distance d_s from the previous one, along a ray of length d_max) towards the user’s facing direction. A single ray would likely be sufficient to check the LoS condition; however, it is insufficient to accurately estimate the user’s FoV. Thus, the same process (i.e. the casting of rays) is repeated every angle_step degrees, from the leftmost to the rightmost FoV angle (angle_l and angle_r, respectively), considering the current bearing of the device as the bisector of the FoV’s angle. A pseudocode implementation of the raycasting algorithm is presented in Algorithm 1.

Algorithm 1 (Raycasting)

1.  initialize angle_l, angle_r        // The left and right angles (edges) of the FoV
2.  initialize angle_step, ray_step    // The incremental steps for the FoV angle and the ray steps
3.  initialize d_max                   // The maximum FoV distance (ray length)
4.  θ = angle_l, d = ray_step          // Current FoV angle and ray length
5.  measure azimuth                    // Measured by the magnetometer (in degrees)
6.  measure lat, lon                   // User geo-coordinates measured by the GPS receiver
7.  initialize P                       // Polygons (buildings) extracted from OSM data whose centroid is within a specified radius from the user (in ascending order of distance between the polygon's centroid and the user's location)
8.  initialize FoV                     // The FoV polygon (initially empty)
9.  while θ < angle_r do
10.   bearing = azimuth + θ
11.   hit = false                      // Indicates whether a ray hits a polygon (building)
12.   {newlat, newlon} = {0, 0}, d = ray_step   // Coordinates of consecutive ray steps; reset ray length
13.   while d < d_max do
14.     newlat = asin(sin(lat)*cos(d) + cos(lat)*sin(d)*cos(bearing))            // Ray step latitude
15.     a = atan2(sin(bearing)*sin(d)*cos(lat), cos(d) - sin(lat)*sin(newlat))
16.     newlon = ((lon + a + 3π) % 2π) - π                                       // Ray step longitude
17.     for each b in P                // Iterate through polygons (buildings)
18.       if contains(newlat, newlon, b) then   // if the current ray step lies within a polygon
19.         insert (newlat, newlon) into FoV    // insert the vertex (collision point) into the FoV polygon
20.         hit = true
21.         break                      // No need to consider the remaining polygons
22.       end if
23.     end for
24.     if hit == true then
25.       break                        // This ray has been blocked
26.     end if
27.     d = d + ray_step
28.   end while
29.   if hit == false then
30.     insert (newlat, newlon) into FoV        // insert the endmost point of the ray into the FoV polygon
31.   end if
32.   θ = θ + angle_step
33. end while

⁶ Bearing refers to the angle of a moving object’s direction from the North.

The computational complexity of the algorithm is O(r·l·p), where r denotes the number of rays covering the FoV (depends on the overall FoV angle and the incremental step for the FoV angle), l denotes the number of steps per ray (depends on the maximum ray length and the incremental step for the ray steps) and p denotes the number of buildings considered in the raycasting process (depends on the maximum ray length and the buildings’ density).
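As a concrete illustration of Algorithm 1, the following Java sketch ports the pseudocode to plain code. It is not the authors’ implementation: the class and method names (GeoRaycaster, estimateFoV), the polygon representation (arrays of lat/lon vertices in degrees) and the decision to close the FoV polygon at the observer’s position are assumptions made for the example; the great-circle step formulas and the even-odd point-in-polygon test mirror lines 14-18 of the pseudocode.

// Illustrative sketch, not the paper's source code.
import java.util.ArrayList;
import java.util.List;

public class GeoRaycaster {
    private static final double EARTH_RADIUS_M = 6_371_000.0;

    /** Returns the estimated FoV polygon as a list of {lat, lon} vertices (degrees). */
    public static List<double[]> estimateFoV(double lat, double lon, double azimuthDeg,
                                             double halfAngleDeg, double angleStepDeg,
                                             double rayStepM, double maxDistM,
                                             List<double[][]> buildings) {
        List<double[]> fov = new ArrayList<>();
        fov.add(new double[]{lat, lon});                        // close the polygon at the observer
        for (double theta = -halfAngleDeg; theta <= halfAngleDeg + 1e-9; theta += angleStepDeg) {
            double bearing = Math.toRadians(azimuthDeg + theta);
            double[] last = null;
            boolean hit = false;
            for (double d = rayStepM; d <= maxDistM && !hit; d += rayStepM) {
                last = destination(lat, lon, bearing, d);       // next virtual location on the ray
                for (double[][] b : buildings) {
                    if (contains(b, last[0], last[1])) {        // ray step falls inside a building
                        hit = true;
                        break;
                    }
                }
            }
            if (last != null) fov.add(last);                    // collision point or ray endpoint
        }
        return fov;
    }

    /** Great-circle destination from (lat, lon) along 'bearing' (radians) after 'distM' meters. */
    static double[] destination(double latDeg, double lonDeg, double bearing, double distM) {
        double lat = Math.toRadians(latDeg), lon = Math.toRadians(lonDeg);
        double ad = distM / EARTH_RADIUS_M;                     // angular distance
        double newLat = Math.asin(Math.sin(lat) * Math.cos(ad)
                + Math.cos(lat) * Math.sin(ad) * Math.cos(bearing));
        double a = Math.atan2(Math.sin(bearing) * Math.sin(ad) * Math.cos(lat),
                Math.cos(ad) - Math.sin(lat) * Math.sin(newLat));
        double newLon = ((lon + a + 3 * Math.PI) % (2 * Math.PI)) - Math.PI;
        return new double[]{Math.toDegrees(newLat), Math.toDegrees(newLon)};
    }

    /** Even-odd point-in-polygon test over lat/lon vertices (adequate for building footprints). */
    static boolean contains(double[][] poly, double lat, double lon) {
        boolean inside = false;
        for (int i = 0, j = poly.length - 1; i < poly.length; j = i++) {
            boolean crosses = (poly[i][0] > lat) != (poly[j][0] > lat);
            if (crosses && lon < (poly[j][1] - poly[i][1]) * (lat - poly[i][0])
                    / (poly[j][0] - poly[i][0]) + poly[i][1]) {
                inside = !inside;
            }
        }
        return inside;
    }
}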

The raycasting method is illustrated in Fig. 3a, where 10 rays determine the user’s FoV in an area featuring buildings stored in the OSM database (white-colored dots denote points invisible from the device’s current standpoint). When one of the ray steps (i.e. virtual locations) is found to lie inside a polygon (building) of the surrounding buildings’ list⁷, it is inferred that the ray has been blocked by a building, hence, further ray steps along that line are unnecessary.

When a ray step hits a building, the point of impact of the blocked ray is saved in a vector; upon the completion of the raycasting process, those collision points may be utilized to draw a polygon on an OSM interface, providing a visual (2D) representation of the user’s FoV (see Fig. 3b). Finally, it is easy to inspect whether a POI location lies inside the FoV polygon, thus determining whether the POI is within the user’s sight. This information is crucial for mobile AR applications which have to apply appropriate visual metaphors for rendering graphical entities (typically markers) to convey the depth (i.e. the positioning) of POIs relative to buildings shown on the device’s display. For instance, our example application shown in Fig. 3c and Fig. 3d adjusts the marker’s background color depending on whether the POI (a church) is inside or outside the user’s FoV, respectively. Another option would be to consider the total number of rays which hit the building so as to adjust the AR marker transparency accordingly, thus providing visual clues for the percentage of the user’s FoV covered by the POI⁸.
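The two visual metaphors just described (hiding/recoloring a marker and scaling its transparency) boil down to simple tests against the raycasting output. The sketch below is again illustrative rather than the authors’ code: it reuses the contains() helper from the previous sketch, and poiInFoV() and markerAlpha() are hypothetical names.

// Illustrative sketch: deciding marker appearance from the FoV result.
import java.util.List;

public class MarkerVisibility {
    /** True if the POI location lies inside the estimated FoV polygon. */
    public static boolean poiInFoV(List<double[]> fovPolygon, double poiLat, double poiLon) {
        return GeoRaycaster.contains(fovPolygon.toArray(new double[0][]), poiLat, poiLon);
    }

    /** Marker opacity in [0, 1]: the fraction of per-ray terminal points falling inside the
        POI's footprint polygon, i.e. the share of the FoV covered by the POI. */
    public static float markerAlpha(List<double[]> rayEndpoints, double[][] poiFootprint) {
        if (rayEndpoints.isEmpty()) return 0f;
        int hits = 0;
        for (double[] p : rayEndpoints) {
            if (GeoRaycaster.contains(poiFootprint, p[0], p[1])) hits++;
        }
        return hits / (float) rayEndpoints.size();
    }
}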

It is noted that the screenshots presented in Fig. 3 are captured from a mobile AR application which we have developed for testing purposes utilizing OSM data and the BeyondAR⁹ framework.

⁷ The calculation of the inclusion of a ray step into the polygon of a building was performed utilizing the Android Map Utilities library (https://github.com/googlemaps/android-maps-utils).
⁸ A video presenting the function of the test application is available at: https://youtu.be/zCoI0RW1CcI
⁹ The BeyondAR framework is available at: http://beyondar.com/


 


Fig. 3. (a) Multi-angle raycasting generating the users’ FoV; (b) FoV representation; (c) POI partially inside the users’ FoV; (d) POI outside the users’ FoV.

Note that the accuracy of FoV estimation depends on (a) the density of rays (i.e. their in-between angle, angle_step), and (b) the distance d_s among successive ray steps. Clearly, the FoV determination accuracy improves as the angle_step and d_s values decrease. In the illustration of Fig. 4, the raycasting involves 5 rays with eight (8) ray steps each. In this example, the estimated FoV (see polygon ABCDEF in Fig. 4a) considerably differs from the real FoV (see polygon ABC’D’E’F in Fig. 4b) due to the relatively sparse rays and ray incremental steps.


Fig. 4. (a) Estimated FoV; (b) real FoV. White dots represent ray incremental steps while black dots denote occluded ray hits.

Experimental validation and performance tests. The performance of the raycasting algorithm has been tested by performing a stress test. We have chosen to simulate the execution of the raycasting algorithm in the center of Brussels (Belgium), as the OSM database contains a large number of registered buildings in that area (see Fig. 5). Building data (i.e. polygon vertices’ coordinates) have been yielded by the OSM Overpass Turbo API¹⁰. To limit the number of buildings examined, we apply a distance threshold from the user’s location so as to filter out polygons whose centroid is located further than that. The application of a distance threshold slightly longer than the length of the ray ensured that the corners of buildings whose centers are slightly further than the ray’s reach are also detected.

The testing space has been a square with a side length of 500 m, and the ray step has been set to 3.5 meters to exclude the possibility of missing even small buildings. Updates of the nearby buildings list have been triggered every 2 sec, applying a certain distance threshold from nearby buildings (the total number of buildings during the tests presented below has been 873, on average)¹¹. The device has been constantly rotated throughout the test (approximately 25 rotations in a 60 sec testing session). We have conducted two (2) performance tests, capturing the effect of several performance variables, such as the ray length, distance threshold and total raycasting angle. All performance tests have been executed using an ‘average’ Android device (Samsung S3 Neo, 4-core 1400 MHz CPU, 1.5 GB RAM, costing ~160 euros).
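The centroid-based pre-filtering step described above can be expressed compactly in code. The following Java sketch is illustrative only (fetching the raw polygons from the Overpass API is not shown); the class and method names, the vertex-average centroid and the sorting step are assumptions consistent with the nearby-buildings list described in Algorithm 1.

// Illustrative sketch: distance-threshold filtering of OSM building polygons.
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class BuildingFilter {
    /** Haversine distance in meters between two points given in degrees. */
    public static double distanceM(double lat1, double lon1, double lat2, double lon2) {
        double r = 6_371_000.0;
        double dLat = Math.toRadians(lat2 - lat1), dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                  * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * r * Math.asin(Math.sqrt(a));
    }

    /** Vertex-average centroid of a lat/lon polygon (adequate for small building footprints). */
    static double[] centroid(double[][] poly) {
        double lat = 0, lon = 0;
        for (double[] v : poly) { lat += v[0]; lon += v[1]; }
        return new double[]{lat / poly.length, lon / poly.length};
    }

    /** Keeps the polygons whose centroid lies within thresholdM of the user (thresholdM set
        slightly longer than the ray length), sorted by ascending centroid distance. */
    public static List<double[][]> nearbyBuildings(List<double[][]> all, double userLat,
                                                   double userLon, double thresholdM) {
        List<double[][]> kept = new ArrayList<>();
        for (double[][] b : all) {
            double[] c = centroid(b);
            if (distanceM(userLat, userLon, c[0], c[1]) <= thresholdM) kept.add(b);
        }
        kept.sort(Comparator.comparingDouble(
                b -> distanceM(userLat, userLon, centroid(b)[0], centroid(b)[1])));
        return kept;
    }
}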

In the first testing session the ray length has been set to 100 m and the raycasting angle to 45° (see Fig. 5a). During the test a total of 103 raycastings have been executed with a mean of 580 msec per raycasting (1.5 raycastings/sec), taking into account 222 nearby buildings. In the second test session we have used the same settings as in the first test except for the ray angle (i.e. the FoV width), which has been decreased down to 4° (see Fig. 5b). In this test the raycasting performance has significantly improved, with a total of 426 raycastings executed in a 60 sec testing session, indicating a mean of 140 msec per raycasting (7.1 raycastings/sec). Overall, the performance of the raycasting algorithm has been demonstrated to meet real-time requirements even under stress conditions (large number of buildings, constant rotation, wide and long FoV). This highlights the algorithm’s applicability to KnossosAR, as visitors in archaeological sites typically move at a slower pace and the number of POIs/obstacles is lower¹².

¹⁰ The Overpass Turbo API is available at: http://overpass-turbo.eu/
¹¹ A distance threshold slightly longer than the ray has been applied in order to decrease the number of polygons the ray has to take into account and therefore improve execution performance.

Fig. 5. Our test area in Brussels (Belgium). Raycasting performance test with (a) 100 m ray length, 45° field of view; (b) 100 m ray length, 4° field of view.

4. Case Study: KnossosAR, an Archaeological Site Mobile AR Guide

The diffusion of smartphones and tablets has paved the way for a multitude of AR applications in cultural heritage, most of which are enhanced museum guides, visually augmenting physical exhibits with background or interpretive information [1, 15]. Many among the known storytelling-driven projects, which use AR to convey the history of a place in the context of a guided tour, are implemented for the outdoors; in these cases, the mobile device is used to get AR views of a building, to receive additional location-based information, or to listen to audio and 3D-enhanced narrations [6, 32].

However, existing markerless (i.e. sensor-based) AR applications developed for handheld devices only consider the location and direction of the user to project AR content. Namely, they do not take into account the visibility of annotated artifacts or buildings so as to adapt AR content accordingly and convey depth information. This is one of the main considerations of KnossosAR, a mobile AR guide developed for the Unesco world heritage -archaeological- site of Knossos¹³, Greece.

¹² A typical performance metric in the computer graphics literature is the frame rate, namely, the frequency (rate) at which an imaging device displays consecutive images (frames). Frame rate is usually expressed in frames per second (fps). For instance, in the case of the raycasting stress testing application the graphical user interface (including the visualization of the FoV) is refreshed upon the completion of a full raycasting (execution of the FoV estimation algorithm). Therefore, the metrics raycastings/sec and fps essentially coincide. Graphical applications achieving performance higher than 1 fps are generally considered as real-time; however, the transition from one frame to another remains unnoticed by human viewers for 8-10 fps [24]. A performance of the raycasting algorithm lower than the abovementioned threshold would negatively affect the quality of experience of KnossosAR users as they would notice graphical interface updates (sudden moves of AR markers). Moreover, it would compromise the accuracy of the estimated FoV as the raycasting algorithm would not be able to closely follow the updates of the user device’s location and/or direction.

4.1. Objectives

The primary purpose of KnossosAR is to utilize mobile AR technology to support guided tours of secondary school students (either alone or in groups) while on educational visits to outdoors archaeological sites.

In particular, the educational and learning objectives for students supported by KnossosAR have been specified in consultation with educators and archaeologists, as follows:
• Become acquainted with the palace of Knossos (architecture, decoration, art of pottery and frescoes) and compare its complexity with that of the -Minotaur’s- maze.
• Become acquainted with characteristic elements of life in Minoan Crete: occupations, housing, food, clothing and toilette.
• Comprehend the causes that led to the destruction of the Minoan civilization.

Contemporary theories of learning suggest that the above listed educational objectives should be pursued through approaching the historical context revealed in Knossos experientially [17]. Namely, KnossosAR could serve as a technological assistant to locate POIs in the archaeological site and derive contextual interpretive information via various media.

Through using the application, students should be able to build a sufficient knowledge background by undertaking the role of explorer, based on the principles of discovery and exploratory learning. Collaborative and cooperative learning should also be encouraged so as to capitalize on one another’s resources and abilities and enable the development of social skills. Finally, the application should cater for the emotional development of students. Dominant emotions are expected to be the tension resulting from the action, the joy of discovery, and the fulfillment of achievement.

¹³ Knossos is the largest Bronze Age archaeological site on Crete (Greece) and is considered the oldest city in Europe.

KnossosAR has been developed following an iterative software development cycle (see Fig. 6): requirements analysis; design; prototype implementation; testing; prototype finalization; official user evaluation.

Requirements analysis
• Identification of user groups
• Interviews with involved stakeholders
• Take into account user acceptance criteria
• Determination of functional and non-functional requirements

Application design
• Content selection
• Content authoring
• Use cases
• Selection of mobile AR platform
• User interface design (mockups)

Prototype implementation

Testing
• Lab tests
• Beta version testing
• Evaluation by focus groups

Prototype finalization
• Bug fixing
• Application enhancements: audio announcements, dual AR/map view, hiding of POIs occluded by physical obstacles

Official user trials
• Questionnaire editing
• Selection of evaluators
• Trial executions
• Compilation of questionnaire and interviews

Fig. 6. Illustration of KnossosAR development cycle.

4.2. Requirements analysis

The requirements analysis of KnossosAR adopted the Volere methodology [22]. The application requirements have been elicited from the involved stakeholders (archaeologists expert in Knossos, professional guides and curators from local cultural heritage institutions, educators, students). Our requirements analysis also consolidated ‘good practices’ derived from similar mobile AR projects and user acceptance studies [5, 10]. The main functional and non-functional requirements identified for KnossosAR are as follows:
• The application should support the autonomous tour guidance of users in the archaeological site; users should be able to follow their own explorative routes at their preferred pace and be able to retrieve information about POIs/landmarks under a variety of augmentation forms, including textual descriptions, audio narrations, images and 3D graphics.
• The application design should make no assumptions about any kind of supportive infrastructure (e.g. WiFi installation, attachment of fiducial markers on POIs); hence, it should be developed upon a sensor-based AR framework and be able to function as a standalone application so as not to be sensitive to network disconnections and to avoid roaming charges.
• User input should be rare and limited.
• The application should offer an intuitive interface so as not to distract the user from his/her main visiting purpose, namely the experiential exploration of the archaeological site.
• The application should support educational objectives, motivating students to discover information about specific elements of the site.
• The guided tour should be offered in a playful manner (i.e. incorporate gamification elements), aiming at increasing user commitment and engagement in the exploration of the site.
• The application design should be easy to replicate and port to other cultural heritage sites.

4.3. Application design

The application content has been dictated by archaeologists and educators based on the rationale of selecting a set of POIs regarded as being most important while also serving the learning objectives of educational visits, as discussed in Section 4.1. Those POIs are:
• The Throne Room with the famous alabaster throne and frescoes, where it is speculated that King Minos met with the priesthood.
• The Stepped Portico (covered stairway) leading to the palace complex.
• The South Entrance, where the plaster relief “Prince of lilies” or “Priest-king Relief” has been discovered.
• The royal apartments, where one can admire the natural lighting system and air conditioning of the premises, and frescoes like that of the dolphins and sea urchins in the queen’s room.
• The compartments with the giant storage jars, used in trade shipments with other peoples in the Middle East, Egypt and the Aegean.
• The Sewer System, leading rainwater away from the palace complex. In the same area there are signs of the great destruction of the palace.

Upon the selection of POIs, we then proceeded to content authoring: text (POI description) authoring, recording of audio clips (narration of textual descriptions), editing of photographs, videos, etc.

The application design phase also involved the specification of a number of use case scenarios, namely the detailed formal (UML) definition of the interactions between the involved actors and the application components to achieve specific goals (tasks).

Based on the use case diagrams we then derived the UI mockups, which exposed the basic functionality of the envisioned application and served as a tool to solicit early feedback from archaeologists and educators as well as from the end users (students). This feedback has been taken into account so as to adjust several UI and interaction elements prior to prototyping the application.

4.4. Iterative prototyping and testing

KnossosAR has been developed as a standalone application for the Android platform. The Android Augmented Reality Framework¹⁴ has been chosen to support the projection of AR views as it also facilitates the implementation of several desired features, such as a visual metaphor of a ‘radar’ used to display the location of POIs (represented by dots) relative to the user’s location and direction. The development of KnossosAR comprised iterative prototyping and testing phases. The output of each prototyping phase has undergone extensive ‘lab’ tests as well as evaluation by focus groups. The focus group comprised a team of three technologically literate individuals who provided analytical feedback as regards spotted software bugs as well as usability flaws and missing functionality.

The first prototyping phase comprised the implementation of a fully functional prototype. The interpretive information edited for the selected POIs (see Section 4.3) has been incorporated into the AR framework. In addition to the AR view, a standard map view has been developed to offer users an alternative means for locating POIs (the map view indicates both the current location of the user and the POI locations by color-coded markers). The most important remarks and suggestions for improvements highlighted by the focus group members during the first testing phase have been:
• At some points, the AR view included markers for POIs hidden from certain viewpoints, as this has been the default function of the AR framework. This often resulted in confusion and misconception; therefore, it has been suggested to allow hiding markers associated with occluded POIs¹⁵.
• While the application has been configured to automatically display visual indications about annotated POIs when the user came close to them, this often remained unnoticed as the users typically walked around the site only occasionally checking their device’s screen.

¹⁴ The Android Augmented Reality Framework is available at: https://github.com/phishman3579/android-augment-reality-framework

Based on the elicited feedback, the original prototype has been adapted as follows:
• The markers corresponding to occluded POIs have been hidden. Essentially, this has been realized through the incorporation of the geolocative raycasting technique presented in Section 3.
• To compensate for the hidden information, we have opted to implement a third view (further to the AR and the map views): a dual view which comprises a split screen combining the AR and map views. This allows users to have a broader perception of the site; namely, to locate POIs (in the map section of the screen) which could otherwise remain unspotted due to being out of the user’s FoV, thus hidden in the AR view.
• In the event of approaching a POI in less than 5 m distance, the user is -optionally- notified via an audio announcement and/or the device’s vibration.

[Fig. 7 depicts a layered architecture: a hardware layer (sensors & GPS receiver), an operating system layer (Android OS), an AR framework layer (Android Augmented Reality Framework) and an application layer (KnossosAR), with the FoV determination / POI-in-FoV verification component drawing on the physical obstacles and POIs data (floor plan polygons).]

Fig. 7. KnossosAR architecture diagram.

The architecture of the final version of KnossosAR, derived from the iterative prototyping and testing phase, is illustrated in Fig. 7. The bottom layer refers to the device’s hardware, particularly the magnetometer and the GPS receiver, whose measurements are required to estimate the camera’s view. The AR framework lies above the OS layer and is utilized by the end application. KnossosAR incorporates our raycasting algorithm (see Section 3), which determines the device’s FoV taking into account the polygon data of nearby POIs and physical obstacles within the archaeological site of Knossos (note that all application data are stored locally, namely, the application is able to work offline). The FoV estimation method determines the POIs currently visible to the user; those POIs are then fed to the underlying AR framework and projected (via AR markers) on the device’s display according to the device’s location and orientation. It is noted that the raycasting angle in KnossosAR is set to 45° (that is, equal to the angle supported by the Android Augmented Reality Framework) and the ray length may be set up to 100 m by the user, while the algorithm takes into account 26 polygon floor plans (both POIs and obstacles). In this particular setting, the average execution time of the FoV estimation routine has been 94 msec per raycasting (i.e. 10.6 raycastings/sec) for a 100 m radius, definitely sufficient for real-time execution.

¹⁵ Some of the focus group members argued that the display of AR markers based on proximity would be sound in a city guide wherein, for instance, the user would like to know the restaurants located up to a certain distance from his/her current location. Nevertheless, this is undesirable in the functional context of KnossosAR, where the main purpose of the user is to recognize POIs as s/he walks around the archaeological site.
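To make the data flow of Fig. 7 concrete, the following Java sketch shows one possible way to derive the set of visible POIs from the FoV estimate before handing them to the AR framework. It is illustrative only: the Poi type, the method names and the 4.5° angular step / 3.5 m ray step are assumptions (the latter reused from the stress test), while the 45° FoV and 100 m ray length match the settings reported above.

// Illustrative sketch: wiring the FoV estimate to the AR layer.
import java.util.ArrayList;
import java.util.List;

public class FovPoiFilter {
    /** Minimal POI record: a title plus geo-coordinates in degrees (hypothetical type). */
    public static class Poi {
        public final String title;
        public final double lat, lon;
        public Poi(String title, double lat, double lon) {
            this.title = title; this.lat = lat; this.lon = lon;
        }
    }

    /** Returns the POIs lying inside the estimated FoV; only these receive AR markers. */
    public static List<Poi> visiblePois(List<Poi> pois, double userLat, double userLon,
                                        double azimuthDeg, List<double[][]> obstacles) {
        // 45° total FoV and 100 m maximum ray length, as reported for KnossosAR;
        // 4.5° angular step and 3.5 m ray step are example values.
        List<double[]> fov = GeoRaycaster.estimateFoV(userLat, userLon, azimuthDeg,
                22.5, 4.5, 3.5, 100.0, obstacles);
        double[][] fovPoly = fov.toArray(new double[0][]);
        List<Poi> visible = new ArrayList<>();
        for (Poi p : pois) {
            if (GeoRaycaster.contains(fovPoly, p.lat, p.lon)) visible.add(p);
        }
        return visible;
    }
}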

The hiding of AR markers associated with occluded POIs is demonstrated in Fig. 8. Since no open geodata have been available for the archaeological site of Knossos, the building (polygon) coordinates (both for the POIs and the buildings that may possibly occlude POIs) have been manually edited. Fig. 9 illustrates the ground plan of the archaeological site designating POIs and obstacles.


Fig. 8. (a) Displaying and (b) hiding an AR marker associated with an occluded POI.

Fig. 9. Ground plan of the archaeological site with color indication of POIs (yellow-colored), buildings obstructing the users’ view (red-colored) and buildings of zero height, hence, not regarded as obstacles (blue-colored).

In order to implement audio announcements we have adopted methods commonly practiced in acoustic spaces [26]. Upon approaching a POI, the user listens to a short audio message announcing the POI’s identity (title). As a result, the users are disengaged from continuously looking up their device’s screen; thus, they are left to focus on the actual exploration of the archaeological site and pursue a more relaxed visit style. In addition to receiving audio notification, a popup panel provides the user access to POI-relevant information (textual description, audio narration, slide show) (see Fig. 10). This panel also appears when the user taps on a marker in the AR view.
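A minimal sketch of the announcement trigger logic is given below, assuming the 5 m radius mentioned in Section 4.4 and an announce-once-per-visit policy (the latter is an assumption, not stated in the paper). It reuses the distance helper and POI type from the earlier sketches; the actual playback (e.g. via Android’s text-to-speech facilities) and vibration are left out.

// Illustrative sketch: proximity-triggered POI announcement logic.
import java.util.HashSet;
import java.util.Set;

public class ProximityAnnouncer {
    private static final double TRIGGER_RADIUS_M = 5.0;   // notification radius reported in Section 4.4
    private final Set<String> announced = new HashSet<>();

    /** Returns the title of a newly reached POI to be announced, or null if no announcement is due. */
    public String onLocationUpdate(double userLat, double userLon, Iterable<FovPoiFilter.Poi> pois) {
        for (FovPoiFilter.Poi p : pois) {
            if (BuildingFilter.distanceM(userLat, userLon, p.lat, p.lon) <= TRIGGER_RADIUS_M
                    && announced.add(p.title)) {
                return p.title;   // announce each POI only once per visit (assumed policy)
            }
        }
        return null;
    }
}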



Fig. 10. (a) Popup panel providing access to interpretive information relevant to a POI; (b) slide show.

Last, the dual AR/map view of KnossosAR is shown in Fig. 11.

Fig. 11. The dual view which combines the AR and map views.

4.5. Field trials

KnossosAR has been tested on site, through ‘formal’ field trials. The objectives of the field trials have not only been to test the innovative functional elements of the application (such as our occlusion handling method), but also to offer insights on the way that similar applications are actually used as well as on the factors that affect the quality of experience perceived by users.

KnossosAR has been evaluated by 16 students (12 male, 4 female), 17-19 years old (average 17.6), in the context of an educational visit to the archaeological site of Knossos¹⁶. The participants have tested the application in four groups (of four individuals each) in order to facilitate their live observation by the developers. The testing sessions have been executed using tablet devices supplied by the development team. Most (68.8%) of the students had visited the archaeological site in the past, either privately or in the context of an educational (school) visit. All of them have been experienced mobile application users and familiar with mobile interactive map interfaces. 62.5% have been aware of the existence of electronic guides in museums and cultural heritage sites; however, only 12.5% had actually used one in the past. Notably, none of them has been aware of the capabilities and usage of AR applications.

Initially, the students have been briefed by the developers about the main functional elements of KnossosAR. Students have then been invited to participate in a ‘treasure hunting’-like game wherein the objective has been to ‘discover’ (i.e. approach) the 6 selected POIs (see Section 4.3) in a 30 min session. Upon the completion of the evaluation session, the participants have been requested to fill in a questionnaire so as to convey their overall quality of experience and document any remarks. Finally, a semi-structured interview followed in order to offer participants the opportunity to clarify any issues and suggest further improvements. Fig. 12 shows snapshots from the KnossosAR field trials.

¹⁶ A video recorded during the field trials is available from: https://www.youtube.com/watch?v=RnohY2JdkzI

Fig. 12. Field trials: (a) briefing of participants; (b) browsing interpretive information about a POI; (c) confirmation of the ‘discovery’ of 4 (out of 6) POIs.

The questionnaire (see Table 1) aimed at eliciting the perceived value and quality of experience of the participants about the application as a whole as well as its individual functional elements. The participants experienced no difficulty in getting acquainted with the usage of the application (S1). The incorporation of the AR views and the gamification of the guided tour increased the fun element (S2). The interpretive information consumed through the application allowed participants to acquire new knowledge about the archaeological site (S3), while the visualization of the POIs assisted in locating important points which otherwise would likely remain unnoticed (S4).

The occlusion handling technique (participants have been informed during the briefing session that AR markers associated with non-visible POIs would be hidden) has been positively valued (S5). Besides, some participants argued that they could receive hints about the existence of occluded POIs through inspecting the ‘radar’, which displayed visual indications of all ‘in-range’ POIs regardless of their visibility. FoV estimation has been found accurate by most users (S6); inaccuracies have only occasionally been reported and are attributed to poor GPS signal reception (e.g. beneath sheds). Participants have positively valued the application’s performance, since the movement of AR markers over the camera view has been smooth while rotating the device, due to the fast completion of the FoV estimation process (S7).

The participants have found useful most of the means provided for locating POIs within the archaeological site (S8). Among them, the AR markers and the audio messages announcing proximity to POIs have been most highly appreciated. As regards the means for consuming interpretive information, audio narration and images/photographs have been the most commonly used (S9); reading textual information has been less popular as it was claimed to distract the user’s attention from looking at the POI itself. The overall valuation of KnossosAR has been positive with respect to the perceived quality of experience (S10). When prompted to characterize the -AR interface of the- application in one sentence, the verbalization has been around the concepts of entertainment and innovation (e.g. ‘fun’, ‘interesting’, ‘original’, ‘groundbreaking’) as well as usefulness (e.g. ‘helpful’, ‘nice to have’).

Table 1
Overall valuation of KnossosAR using a Likert scale: 1-5 (1: Not at all, 5: Very much).

Statement                                                                 Median  Average
S1   The application has been easy to use.                                  5      4.6
S2   The application has been pleasant to use.                              5      4.4
S3   The usage of the application allowed me to acquire knowledge
     about the archaeological site (that I would miss otherwise).           4      4.1
S4   The application assisted me to locate points of interest within
     the archaeological site (that I would miss otherwise).                 5      4.8
S5   I have appreciated the hiding of AR markers representing POIs
     out of my field of view.                                               4      3.9
S6   AR markers corresponding to POIs have been accurately
     hidden/displayed according to the POIs' visibility.                    4      3.7
S7   I did not experience any freezing or temporal interruption in
     the application's graphical interface while rotating the
     device's camera.                                                       5      5
S8   The following application elements assisted me in locating
     points of interest within the archaeological site:
     a  Floating (AR) markers                                               5      4.4
     b  Radar                                                               3      3.5
     c  Interactive map (including visual indication of the user's
        current location)                                                   4      3.8
     d  Audio (POI proximity) announcement                                  5      4.8
S9   The following application elements assisted me in eliciting
     interpretive information about points of interest:
     a  Audio narration                                                     4      4.3
     b  Textual information                                                 3      2.9
     c  Photographs                                                         4      4
S10  The overall quality of experience with the usage of the mobile
     guide has been positive.                                               4      4.2

In the semi-structured interview held at the end of the testing sessions, the participants have made several suggestions for further improvements, such as:
• Incorporation of richer interpretive information: 3D reconstruction of selected ancient ruins; provision of information about the archaeological site as a whole, further to the ‘fragmented’ information about POIs.
• Use of visual clues (e.g. special color codes) to denote already visited POIs.
• Offer alternative ways to handle hidden POIs (i.e. those being out of the user’s FoV): use of transparency for AR markers associated with hidden POIs; provision of audio information about hidden POIs.

5. Conclusion and Future Research

This article introduced a geolocative raycasting technique which enables real-time detection of surrounding buildings and allows addressing the occlusion problem commonly encountered in location-based AR applications. Our method involves casting a sequence of rays (consecutive rays are separated by a specific angle), with each ray progressively generating virtual locations until a ‘hit’ is detected. The performance evaluation of our method demonstrated its efficiency and appropriateness for portable outdoors AR applications. Our method makes a significant contribution to the field of ambient intelligence (AmI) as it enables portable/wearable devices with limited resources to efficiently estimate the FoV context and adapt the output of applications supporting everyday life activities accordingly.

The motivation for implementing the FoV estimation method has been drawn from the iterative design and prototyping process of KnossosAR, a context-aware mobile AR guide for the archaeological site of Knossos, which seamlessly integrates AR projections of interpretive information in a non-linear storytelling context. Unlike existing outdoors AR applications, KnossosAR addresses the occlusion problem by appropriately handling occluded POIs/landmarks, thus serving as a proof-of-concept for our geolocative raycasting technique.

The execution of user evaluation trials confirmed the perceived usefulness, ease of use and enjoyment when using KnossosAR. The combination of physical objects with virtual information triggered the curiosity and stimulated the interest of students to physically explore the archaeological site. The use of mobile AR technology exposed students to an alternative interaction style, which they easily mastered. Initial mistrust gave way to a sense of accomplishment when they succeeded in locating the ‘hidden’ POIs.

The main lessons learned from the field trials have highlighted several directions for further improvements, as follows:
• Performance represents a major success factor for FoV estimation, since prolonged execution time may affect the rendering of UI elements and, thus, the overall user experience. As a result, the incorporation of speed-up techniques (e.g. proactive detection of nearby POIs/obstacles) may be necessary, especially in applications that involve a larger number of polygons.
• Audio narration is often referred to as a less obtrusive means for presenting interpretive information to visitors of cultural heritage sites compared to visual presentation. Future research could investigate suitable ways for incorporating FoV estimation in audio guides, i.e., for providing FoV-aware audio narration.
• The hiding of markers is not necessarily the preferred option of all users for treating POIs being out of the current user’s FoV. Along this line, we intend to perform UX studies investigating alternative options. Future releases of KnossosAR will allow users to specify their preferred option.

Acknowledgements

The authors would like to thank the archaeologist Alexandra Karadimou for the insightful information she provided about the archaeological site and the useful feedback on the first prototype of KnossosAR. Many thanks also go to the secondary school teachers Michael Stamatoulakis and Stavros Kostomanolakis for their assistance in the preparation and execution of the field trials.

References

[1] A. Angelopoulou, D. Economou, V. Bouki, A. Psarrou, L. Jin, C. Pritchard, F. Kolyda, Mobile augmented reality for cultural heritage, Mobile Wireless Middleware, Operating Systems, and Applications, 93, 2011, pp. 15-22.

[2] A.H. Behzadan, V.R. Kamat, Scalable algorithm for resolving incorrect occlusion in dynamic augmented reality engineering environments, Computer-Aided Civil and Infrastructure Engineering, 25, 2010, pp. 3-19.

[3] B. Bell, T. Höllerer, S. Feiner, An annotated situation-awareness aid for augmented reality, Proceedings of the 15th annual symposium on User interface software and technology, ACM, 2002, pp. 213-216.

[4] M. Billinghurst, A. Clark, G. Lee, A Survey of Augmented Reality, Foundations and Trends in Human-Computer Interaction, 8, 2014, pp. 73-272.

[5] M. Billinghurst, A. Duenser, Augmented reality in the classroom, Computer, 7, 2012, pp. 56-63.

[6] G. Casella, M. Coelho, Augmented heritage: situating augmented reality mobile apps in cultural heritage communication, Proceedings of the International Conference on Information Systems and Design of Communication, ACM, 2013, pp. 138-140.

[7] A. Dey, C. Sandor, Lessons learned: Evaluating visualizations for occluded objects in handheld augmented reality, International Journal of Human-Computer Studies, 72, 2014, pp. 704-716.

[8] J. Fischer, B. Huhle, A. Schilling, Using Time-of-Flight Range Data for Occlusion Handling in Augmented Reality, IPT/EGVE, 2007, pp. 109-116.

[9] C. Furmanski, R. Azuma, M. Daily, Augmented-reality visualizations guided by cognition: Perceptual heuristics for combining visible and obscured information, Proceedings of the International Symposium on Mixed and Augmented Reality, IEEE, 2002, pp. 215-320.

[10] A.-C. Haugstvedt, J. Krogstie, Mobile augmented reality for cultural heritage: A technology acceptance study, Proceedings of the International Symposium on Mixed and Augmented Reality, IEEE, 2012, pp. 247-255.

[11] K. Hayashi, H. Kato, S. Nishida, Occlusion detection of real objects using contour based stereo matching, Proceedings of the International Conference on Augmented Tele-Existence, ACM, 2005, pp. 180-186.

[12] T. Höllerer, S. Feiner, D. Hallaway, B. Bell, M. Lanzagorta, D. Brown, S. Julier, Y. Baillot, L. Rosenblum, User interface management techniques for collaborative mobile augmented reality, Computers & Graphics, 25, 2001, pp. 799-810.

[13] V. Kasapakis, D. Gavalas, Pervasive gaming: Status, trends and design principles, Journal of Network and Computer Applications, 5, 2015, pp. 213-236.

[14] V. Kasapakis, D. Gavalas, Blending history and fiction in a pervasive game prototype, Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia, ACM, 2014, pp. 116-122.

[15] J. Keil, L. Pujol, M. Roussou, T. Engelke, M. Schmitt, U. Bockholt, S. Eleftheratou, A digital look at physical museum exhibits: Designing personalized stories with handheld Augmented Reality in museums, Digital Heritage International Congress, IEEE, 2013, pp. 685-688.

[16] H. Kim, S.-j. Yang, K. Sohn, 3D reconstruction of stereo images for interaction between real and virtual worlds, International Symposium on Mixed and Augmented Reality, IEEE and ACM, 2003, pp. 169-176.

[17] D.A. Kolb, R.E. Boyatzis, C. Mainemelis, Experiential learning theory: Previous research and new directions, Perspectives on thinking, learning, and cognitive styles, 1, 2001, pp. 227-247.

[18] C.D. Kounavis, A.E. Kasimati, E.D. Zamani, Enhancing the tourism experience through mobile augmented reality: Challenges and prospects, International Journal of Engineering Business Management, 4, 2012.

[19] P.E. Kourouthanassis, C. Boletsis, G. Lekakos, Demystifying the design of mobile augmented reality applications, Multimedia Tools and Applications, 74, 2015, pp. 1045-1066.

[20] M.A. Livingston, Z. Ai, K. Karsch, G.O. Gibson, User interface design for military AR applications, Virtual Reality, 15, 2011, pp. 175-184.

[21] Y. Ohta, Y. Sugaya, H. Igarashi, T. Ohtsuki, K. Taguchi, Client/server depth sensing for see-through head-mounted displays, Presence: Teleoperators and Virtual Environments, 11, 2002, pp. 176-188.

[22] S. Robertson, J. Robertson, Mastering the requirements process: Getting requirements right, 2nd Edition, Addison-Wesley, 2012.

[23] J. Schroeder, AndEngine for Android game development cookbook, Packt Publishing Ltd, 2013.

[24] M. Shah, Image-space approach to real-time realistic rendering, ProQuest, 2007, pp. 5.

[25] M.M. Shah, H. Arshad, R. Sulaiman, Occlusion in augmented reality, Proceedings of the 8th International Conference on Information Science and Digital Content Technology, IEEE, 2012, pp. 372-378.

[26] U.P. Svensson, Modelling acoustic spaces for audio virtual reality, Proceedings of the 1st Benelux Workshop on Model based Processing and Coding of Audio, IEEE, 2002.

[27] B.H. Thomas, A survey of visual, mixed, and augmented reality gaming, Computers in Entertainment, 10, 2012, pp. 1-33.

[28] Y. Tian, Y. Long, D. Xia, H. Yao, J. Zhang, Handling occlusions in augmented reality based on 3D reconstruction method, Neurocomputing, 156, 2015, pp. 96-104.

[29] Y. Tian, T. Guan, C. Wang, Real-time occlusion handling in augmented reality based on an object tracking approach, Sensors, 10, 2010, pp. 2885-2900.

[30] F. Tscheu, D. Buhalis, Augmented Reality at Cultural Heritage sites, Information and Communication Technologies in Tourism, 2016, pp. 607-619.

[31] H.-K. Wu, S.W.-Y. Lee, H.-Y. Chang, J.-C. Liang, Current status, opportunities and challenges of augmented reality in education, Computers & Education, 62, 2013, pp. 41-49.

[32] Y. Xu, N. Stojanovic, L. Stojanovic, A. Cabrera, T. Schuchert, An approach for using complex event processing for adaptive augmented reality in cultural heritage domain: experience report, Proceedings of the 6th International Conference on Distributed Event-Based Systems, ACM, 2012, pp. 139-148.

[33] T. Yang, Q. Pan, J. Li, S.Z. Li, Real-time multiple objects tracking with occlusion handling in dynamic scenes, Computer Society Conference on Computer Vision and Pattern Recognition, IEEE, 2005, pp. 970-975.

[34] Z. Yovcheva, D. Buhalis, C. Gatzidis, Smartphone augmented reality applications for tourism, e-Review of Tourism Research, 10, 2012, pp. 63-66.

[35] Z. Yovcheva, D. Buhalis, C. Gatzidis, Engineering augmented tourism experiences, Information and Communication Technologies in Tourism, Springer, 2013, pp. 24-35.
