
Exploring Augmented Perception: A User Acceptance Study

Daniel Waller, Christina Albers, Philipp Feuerbach, Marta Nevermann, Marc Richter, Stephanie Vogt, Paul Lubos, Gerd Bruder

University of Hamburg, Mittelweg 177, 20148 Hamburg

Tel.: +49 (0)40 / 94 28 38 - 0
Fax: +49 (0)40 / 42 838 - 6594
E-Mail: [email protected]

Abstract: Research into augmented reality is a forward-looking part of the field of HCI. This study aims to assess the usefulness of vision-enhancing augmented reality for performing everyday tasks. Using virtual reality technologies, we simulated an augmented reality situation. Twenty test subjects were divided into two experimental groups and a control group. They were asked to solve tasks using enhanced vision for heat, darkness, and sound, and we observed how they performed in terms of vision switches and time needed. Participants completed questionnaires before and after the simulation to assess simulator sickness, frustration levels, and demographic and qualitative data. Our study shows that participants solved the given tasks significantly faster using simulated augmented perception technologies. Participants embraced enhanced vision for solving everyday tasks in our virtualisation. This merits further research into implementing enhanced visions on augmented reality technology.

Keywords: Augmented Perception, Oculus Rift, Vision

1 Introduction

The five traditionally recognized human senses are sight, hearing, taste, smell, and touch, of which vision is the dominant sense in most cases. Researchers estimate that humans receive about 80% of all information from the sense of sight [DO12]. The human ability to perceive visual information in dark settings is very limited. Even during daylight hours the perceivable visual spectrum of an adult is constricted to a range between 400 and 700 nm [SES10]. Animals generally differ in their capabilities to receive information about their surroundings. Discussing the limitations of human sight brought up the idea of extending it with some outstanding abilities found in the animal kingdom.

Some animals, like snakes, are able to perceive light in the infrared spectrum using specialized sensors on their heads [NH82], giving them the ability to sense warm objects even in the complete absence of visible light. Other animals, like the bat, are able to detect objects without any kind of light by using echolocation [GWM60]. We reasoned that the possibility to enhance the abilities of the visual system could help people to solve simple tasks more efficiently. Augmenting sight could be of special interest for people suffering from vision impairment, blindness [KD09, KKP+12], or disorders of any other sense, e.g. hearing [CFA+10]. Perceiving otherwise unrecognized information through the use of technology is called Augmented Perception [Wri14].

The development of Virtual Reality (VR) technologies and their commercial success enable us to explore augmenting perception through simulation. In a recent article, S. Goma focuses on the possibilities of augmenting visual perception, arguing that it might give people the ability to solve more complex problems in the future, even going so far as to call augmented perception our next evolutionary step [Gom15]. We hope to show that simulating the effects of augmenting perception in VR can serve as a basis to understand and estimate the usefulness of certain AR applications. By developing a virtual reality application and implementing a set of different graphical shaders, we simulated a room where test subjects were able to switch between different visions enhancing their normal visual spectrum, to solve a set of simple tasks. In analyzing user feedback and covert measures, such as the time used to complete tasks, we show the usefulness of the ability to augment the human visual system.

2 Methods

Three different augmented visions (night, thermal, and sound vision) were modeled in virtual reality and test subjects were instructed to solve a set of different tasks. For further assessment of the user experience, covert measurements were taken and participants had to fill in questionnaires rating their experience. A laboratory of our department was closely modeled in VR and made to look as realistic as possible to ensure high transferability of results [BHAW99].

Participants

22 people participated in the study, in exchange for course credit and cookies. One participant could not complete the tasks because of technical problems. The participants were aged between 18 and 30 (M = 24, SD = 2.95) and were acquired via mailing lists of our faculty and via personal contacts. All participants signed an information and consent form, and the possibility to abort the experiment at any time was explained prior to any data being collected. Participants were then assigned randomly to one of three experimental groups.

Materials

An existing 3D interior model was used as the base for our VR application, which was improved in detail using additional objects modeled in Blender¹. The model and all vision enhancements were implemented in Unity3D² using C#.

Thermal Vision

Thermal vision was implemented using a custom shader programmed to calculate color based on reflective properties of the underlying material. This way we simulated thermal properties using Unity's lighting system to mimic the output produced by professional thermal imaging systems, as seen in figure 1.

¹Blender Foundation, www.blender.org
²Unity3D, www.unity3d.com

Figure 1: Comparison of professional thermal-imaging technology (a) and our simulation (b). (c) A rendering of the VR lab through the sound vision shader.
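The temperature-to-color mapping behind such a rendering can be sketched as a simple palette lookup. The function below is an illustrative assumption, not the study's actual shader: it linearly interpolates a normalized temperature through an ironbow-like palette commonly used by thermal cameras (black through purple and red to white).

```python
def thermal_color(t):
    """Map a normalized temperature t in [0, 1] to an RGB triple in [0, 1],
    approximating the 'ironbow' palette of thermal imaging cameras.
    Illustrative sketch only; the palette stops are assumptions."""
    # Palette control points: (temperature, (r, g, b))
    stops = [
        (0.00, (0.0, 0.0, 0.0)),   # cold: black
        (0.25, (0.5, 0.0, 0.5)),   # purple
        (0.50, (1.0, 0.0, 0.0)),   # red
        (0.75, (1.0, 1.0, 0.0)),   # yellow
        (1.00, (1.0, 1.0, 1.0)),   # hot: white
    ]
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range temperatures
    for (t0, c0), (t1, c1) in zip(stops, stops[1:]):
        if t <= t1:
            # Linear interpolation between the two surrounding stops
            f = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
            return tuple(a + f * (b - a) for a, b in zip(c0, c1))
    return stops[-1][1]
```

In a fragment shader the same lookup would run per pixel, with t derived from the material's simulated thermal properties.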

Night Vision

Using Unity's built-in OpenGL shader programming capabilities we developed a shader simulating night vision. The achieved night vision effect had a high visual similarity to current night vision technology [Chr13] and to night vision effects present in popular culture (video games, movies, etc.).

Sound Vision

We chose to visualize sounds in the virtual environment with the help of Unity's built-in particle system in order to profit from its internal physics simulation. To keep particle emission and audio playback in sync, we precomputed a list of time positions at which to emit particles. Objects producing sound emit translucent particles in a pulsating fashion. Taking cues and inspiration from the work of Wadhwa, Rubinstein, Davis, and colleagues on 'Phase-Based Video Motion Processing' [WRDF13] and the 'Visual Microphone' [DRW+14], we decided to change the appearance and speed of the particles to make it look as if the particles were distorting the air around the sound source, like the air distortions over a flame.
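The precomputation step can be sketched in a few lines. This is a hypothetical reconstruction, since the paper does not give the exact algorithm: scan the audio samples and record an emission time whenever the amplitude crosses a threshold, enforcing a minimum gap so particle bursts pulse rather than stream continuously.

```python
def emission_times(samples, sample_rate, threshold=0.3, min_gap=0.05):
    """Precompute time positions (in seconds) at which to emit particles,
    keeping particle bursts in sync with audio playback.
    Hypothetical sketch: a particle is emitted whenever the absolute
    amplitude reaches `threshold`, with at least `min_gap` seconds
    between consecutive emissions."""
    times = []
    last = -min_gap  # allow an emission at t = 0
    for i, s in enumerate(samples):
        t = i / sample_rate
        if abs(s) >= threshold and t - last >= min_gap:
            times.append(t)
            last = t
    return times
```

At runtime the particle system would then consume this list, emitting a burst whenever playback time passes the next precomputed position.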

For an immersive user experience we used an Oculus Rift (Development Kit 2) as well as noise cancelling headphones (Bose QC25 Acoustic Noise Cancelling). To navigate back and forth in VR, a presenter mouse (Gyration Air Mouse GO PLUS) was used; walking direction was determined using the Oculus' head-tracking system, enabling full 360° movement to navigate through the virtual laboratory.

Before and after participating in the simulation, subjects had to fill in a series of questionnaires, covering demographic characteristics, the Lateral Preference Inventory [Cor93], the simulator sickness questionnaire (SSQ) pre-experiment [KLBL93], the SSQ post-experiment, and the Slater-Usoh-Steed (SUS) questionnaire [SUS94]. Additionally, questions were provided concerning the tasks, the modeled visions, and the overall user experience.

Design

Participants were divided into three smaller groups which differed in the variation of two independent variables: first, the sequence of tasks and, second, whether or not they could use different visions to solve the tasks. One group started in a night situation with all lights turned off (night condition). Another group started with the light already switched on (daylight condition). The scenario for the control group was the same as in the daylight condition, but they could not use AR features (control condition).

The assignment to the groups was randomized and equally distributed. The dependent variables were both the time it took participants to fulfill single tasks and the frustration level they perceived in the meantime.

Procedure

After signing the consent form and completing the first part of the questionnaire, participants had to stand up and were equipped with an Oculus Rift, noise cancelling headphones, and a presenter mouse.

Figure 2: Experiment setup

Subjects subsequently had to solve four different tasks in a certain order, which depended on the assigned group. Participants were told that they could alter the way they see things, but not about the exact augmented perceptions.

Before the first task was assigned, participants had time to familiarize themselves with the controls of the provided setup. The four tasks consisted of easy missions participants had to fulfill: 'Find the hottest cup in the room.', 'Find the ringing cellphone and turn it off.', 'Find a light switch to turn the light back on.', and 'Find the hottest computer in the room.'. Single tasks as well as the whole experiment could be aborted at any time. While the tasks were being solved, we covertly recorded data such as the time needed and the switches being made.

Following the completion of the tasks, participants were asked to fill in the post-experimental part of the questionnaire and were debriefed.

Hypotheses and Analysis

The primary outcome was to show the helpful support of vision enhancement for solving everyday tasks and thereby merit further research in the direction of augmented perception. The first goal was to find out whether there was a significant difference in mean task completion times between all three groups.

A two-sided t-test was used for this. The second step was to check for differences in completion time between the two experimental groups, for which a two-sample t-test was used. Based on this step, the two experimental groups had to be regarded as independent sets of data. It was then evaluated whether the tasks differed in difficulty by looking at mean task completion times for each participant and also between groups. For this, a repeated-measures ANOVA was used. Using one-way ANOVAs, it was further assessed how many switch actions had to be performed by each member of each group to successfully solve a task. Participant frustration levels were compared using two-sample t-tests.
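The fractional degrees of freedom reported in the results (e.g. t(50.35)) suggest Welch's unequal-variances variant of the two-sample t-test. A minimal sketch of how that statistic and its Welch-Satterthwaite degrees of freedom are computed, as an illustrative reimplementation rather than the original analysis code:

```python
import math

def welch_t_test(a, b):
    """Welch's two-sample t-test (unequal variances). Returns the
    t statistic and the (fractional) degrees of freedom.
    Illustrative sketch; the original analysis was presumably
    done in a standard statistics package."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Unbiased sample variances
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2a, se2b = va / na, vb / nb  # squared standard errors
    t = (ma - mb) / math.sqrt(se2a + se2b)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = (se2a + se2b) ** 2 / (se2a ** 2 / (na - 1) + se2b ** 2 / (nb - 1))
    return t, df
```

With equal group sizes and variances the formula reduces to the classic pooled df; unequal variances produce the non-integer values seen in the results section.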

The participants' opinion on the usefulness of Augmented Perception was measured via a seven-point Likert scale. Finally, the feeling of presence was evaluated through the Slater-Usoh-Steed questionnaire. Each of six items was rated on a seven-point Likert scale; the higher the vote, the more intense the feeling of presence. Summing up the results of each participant produced a presence value. A value of p < 0.05 was considered statistically significant.

3 Results

Contrary to the initial assumption, the two experimental groups showed a significant difference in their mean task completion times; t(50.35) = 2.8223, p = 0.00681. They therefore needed to be regarded independently.

Figure 3: Task completion time (time in s) for the Day, Night, and Control groups

Figure 4: Number of vision switches for the Day and Night groups

Comparing the control group (M = 81.46, SD = 56.1) to the day group (M = 60.4, SD = 31) and the night group (M = 39.62, SD = 23.55) showed a significant difference for the night group, t(29.895) = 3.4062, p = 0.001898, but not for the day group, t(34.626) = 1.6371, p > 0.05. Neither the day group, F(3, 24) = 0.87273, p > 0.05, nor the night group, F(3, 24) = 1.7991, p > 0.05, showed a significant variation in the time needed to solve different tasks using one-way ANOVAs. Another one-way ANOVA showed that there was no significant variation in the amount of switches needed for different tasks, F(3, 24) = 2.3598, p > 0.05. This shows that the task difficulty was balanced. Contrary to our initial assumption, the frustration levels of the control group did not differ significantly from those of the groups able to use augmented perception; also, no difference in frustration levels was found between the day and night groups.

When asked whether a respective vision was useful for daily situations, participants reported a mean of M = 4.57 for thermal vision, M = 4.14 for sound vision, and M = 6.14 for night vision, resulting in a combined mean of M = 4.95.

Figure 5: Perceived usefulness of augmented perception (seven-point Likert scale)

4 Discussion

The significant difference between the two experimental groups (night and day group) in their mean task completion times was unanticipated and needs to be addressed. The night group not only needed less time but also used significantly fewer switches compared to the day group. Apparently, participants of the night group developed a better understanding of the different visions and were therefore able to solve tasks more efficiently. This could be attributed to the small sample size and a better understanding of technology among night participants. Another theory could be a 'thrown in at the deep end' effect, because night participants were forced to use visual augmentation from the very start. Further investigation with a larger sample size might be necessary.

In the day and night conditions the test subjects chose whether to use the vision-enhancing features or not. The fact that all test subjects made use of the features shows a willingness to use augmenting vision voluntarily, without any instructions.

Another point of critique is that a large number (18) of participants were already familiar with the topic. Moreover, 87% of participants already had experience with the Oculus Rift. Concerning the setup, it may be criticized that during the whole experiment there was always one person standing quite close to the participant to untangle the cables. This was necessary to make sure no one would trip while moving around wearing both the earphones and the Oculus Rift, which were physically connected to the computer, as seen in figure 2.

The utility of the different features for a user's everyday life was furthermore assessed in a special part of the post-experimental questionnaire. Participants were asked to describe situations in everyday life in which they could imagine making use of one of the three augmented visions. The suggestions ranged from very obvious to amusing: "to check if the stove or the heater were still warm", "to avoid injuries by warning the user about extreme temperatures", "for espionage", but also "to detect the annoying noisy neighbors", and "to play hide and seek".

Limitations and Further Directions

As already mentioned, it might be necessary to repeat certain parts of the experiment with a larger and more diverse sample. The familiarity with the laboratory might have had a larger influence on the results than expected. As the experiment took place in VR, the transfer of the findings into AR is limited. Variances in the size of the visual field could lead to a range of different behaviour of test subjects in AR. However, the experiment aims to form a basis to understand and estimate the usefulness of vision-enhancing features in AR. Concerning further development, it would be interesting to have a closer look at the qualitative part of the questionnaire: we asked participants to write down suggestions on how to improve the design of the different visions. A few people argued that night and thermal vision could be interchangeable, since thermal vision also made it possible to navigate in the dark. Also, the thermal vision was criticized for being too poor in detail and colors. Another appealing idea was to display the temperature of the objects the user focuses on. It would be conceivable to combine the two mentioned visions in a next step, although with a richer design of the thermal sight such a combination might present too much information to the user at once due to visual clutter [RLN07].

A couple of participants suggested adding colors to the sound vision, as an indicator of volume or of the predominant frequency. A further critique was that in noisy places the feature might become confusing. We would consider adding a dynamic volume threshold that would have to be exceeded in order for a sound source to be displayed, so that only sources significantly louder than their surroundings would be seen in noisier areas. If this idea turns out to be viable, adding color could be the next design step.
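The proposed threshold could be prototyped along these lines. This is a sketch of the idea only; the names and the choice of ambient estimate are assumptions: a source is displayed only when its level exceeds the ambient level, here estimated as the mean level of all active sources, by a fixed margin in decibels.

```python
def visible_sources(sources, margin_db=10.0):
    """Dynamic volume threshold for the sound vision (proposed feature,
    not implemented in the study). `sources` maps a source name to its
    current level in dB. A source is rendered only if it is at least
    `margin_db` dB above the ambient level, taken here as the mean
    level of all current sources."""
    ambient = sum(sources.values()) / len(sources)
    return {name for name, level in sources.items()
            if level >= ambient + margin_db}
```

In a noisy room the ambient estimate rises, so only clearly dominant sources (e.g. a ringing phone) would be visualized, while in a quiet room even faint sources could pass the threshold.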

Overall, our results show that under some circumstances the proposed visual modifications are successful in reducing the time needed to complete a task. Moreover, most participants reported the desire to use Augmented Perception in everyday situations. We therefore suggest further research in augmented perception topics.

References

[BHAW99] Doug A. Bowman, Larry F. Hodges, Don Allison, and Jean Wineman. The educational value of an information-rich virtual environment. Presence: Teleoperators and Virtual Environments, 8(3):317–331, June 1999.

[CFA+10] Julie Carmigniani, Borko Furht, Marco Anisetti, Paolo Ceravolo, Ernesto Damiani, and Misa Ivkovic. Augmented reality technologies, systems and applications. Multimedia Tools and Applications, 51(1):341–377, 2010.

[Chr13] K. Chrzanowski. Review of night vision technology. Opto-Electronics Review, 21(2):153–181, March 2013.

[Cor93] Stanley Coren. The lateral preference inventory for measurement of handedness, footedness, eyedness, and earedness: Norms for young adults. Bulletin of the Psychonomic Society, 31(1):1–3, 1993.

[DO12] Merrill D. Bowan, OD. Integrating vision with the other senses. Website, 2012.

[DRW+14] Abe Davis, Michael Rubinstein, Neal Wadhwa, Gautham J. Mysore, Frédo Durand, and William T. Freeman. The visual microphone: passive recovery of sound from video. ACM Trans. Graph., 33(4):79:1–79:10, July 2014.

[Gom15] S. E. Goma. Next gen perception and cognition: augmenting perception and enhancing cognition through mobile technologies. SPIE/IS&T Electronic Imaging, 10:493940I–93940I, March 2015.

[GWM60] Donald R. Griffin, Frederic A. Webster, and Charles R. Michael. The echolocation of flying insects by bats. Animal Behaviour, 8(3):141–154, 1960.

[KD09] SeungJun Kim and Anind K. Dey. Simulated augmented reality windshield display as a cognitive mapping aid for elder driver navigation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '09, pages 133–142. ACM, 2009.

[KKP+12] Brian F. G. Katz, Slim Kammoun, Gaëtan Parseihian, Olivier Gutierrez, Adrien Brilhault, Malika Auvray, Philippe Truillet, Michel Denis, Simon Thorpe, and Christophe Jouffrais. NAVIG: augmented reality guidance system for the visually impaired. Virtual Reality, 16(4):253–269, 2012.

[KLBL93] Robert S. Kennedy, Norman E. Lane, Kevin S. Berbaum, and Michael G. Lilienthal. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. The International Journal of Aviation Psychology, 3(3):203–220, 1993.

[NH82] Eric A. Newman and Peter H. Hartline. The infrared vision of snakes. Scientific American, 1982.

[RLN07] Ruth Rosenholtz, Yuanzhen Li, and Lisa Nakano. Measuring visual clutter. Journal of Vision, 7(2):17, 2007.

[SUS94] Mel Slater, Martin Usoh, and Anthony Steed. Depth of presence in virtual environments. Presence: Teleoperators & Virtual Environments, 3(2):130–144, 1994.

[WRDF13] Neal Wadhwa, Michael Rubinstein, Frédo Durand, and William T. Freeman. Phase-based video motion processing. ACM Trans. Graph. (Proceedings SIGGRAPH 2013), 32(4), 2013. Retrieved 24.03.2016.

[Wri14] W. Geoffrey Wright. Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds. Frontiers in Systems Neuroscience, 8(56), 2014.