


Concept for Navigating the Visually Impaired using a Tactile Interface around the Head

Oliver Beren Kaul and Michael Rohs
Leibniz University Hannover (LUH)
Hannover, Germany
<lastname>@hci.uni-hannover.de

ABSTRACT
We propose 3D guidance for the visually impaired through a tactile user interface around the head. The outlined system uses a guidance algorithm from earlier work and five essential tactile navigation instruction patterns in order to steer visually impaired users precisely around obstacles and up or down staircases, while also providing warnings about possible future collisions.

KEYWORDS
Tactile Guidance, Blind Navigation, Wearable

INTRODUCTION AND RELATED WORK
Visually impaired people require assistance in their daily life, especially in navigating unknown environments. Navigation involves two main components: mobility and orientation, as defined by Loomis et al. [14]. Mobility, or micro-navigation, involves sensing the near-field environment and working out a way around static or dynamic obstacles. Orientation, or macro-navigation, involves being oriented (e.g., by detecting landmarks), path-planning on a broader scale, and detecting when a destination has been reached [9].

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
Position paper submitted to CHI 2019 Workshop on Hacking Blind Navigation, CHI’19, May 2019, Glasgow, UK
© 2019 Copyright held by the owner/author(s).

A navigation system for the blind typically includes the following components:

Figure 1: HapticHead, a vibrotactile interface around the head [12].

• A position and orientation component (e.g., GPS, DGPS, or Bluetooth beacons)
• A geographic information system (GIS) component for path-planning
• A user interface (e.g., auditory feedback in conjunction with a white cane, or a white cane with vibrotactile output)
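To make the division of labor between these components concrete, the following is a minimal sketch of how they might be wired together in code. All class and method names here are illustrative assumptions, not part of the system described in this paper.

```python
# Hypothetical sketch of the three-component navigation pipeline:
# positioning -> path-planning (GIS) -> user interface.
from dataclasses import dataclass


@dataclass
class Pose:
    position: tuple      # e.g. (lat, lon) from GPS/DGPS or Bluetooth beacons
    heading_deg: float   # body/head orientation in degrees


class NavigationSystem:
    def __init__(self, positioning, gis, user_interface):
        self.positioning = positioning   # position and orientation component
        self.gis = gis                   # path-planning component
        self.ui = user_interface         # tactile (or auditory) output component

    def step(self):
        """One update cycle: sense, plan, present."""
        pose = self.positioning.current_pose()
        instruction = self.gis.next_instruction(pose)  # e.g. "turn left 30 deg"
        self.ui.present(instruction)
```

In this sketch the user-interface component is deliberately an opaque sink; the tactile interface proposed in this paper would slot in as one possible implementation of `present`.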

Occupying the sense of hearing for a navigation system is a safety concern, as walking on the streets requires unobstructed hearing, even for sighted people. This work proposes to realize the user interface component as a tactile interface that (a) is worn on the body while keeping the hands free, (b) does not interfere with the sense of hearing like auditory navigation interfaces, and (c) is intuitive and easily understood while leaving no room for misinterpretation by users. While there are already numerous tactile interfaces for guiding the blind, they usually focus on arranging a small number of actuators in a ring configuration on a body position such as the wrist [15] or waist [4–6, 17], or they use a large number of actuators as a vision substitution system on the back [1], tongue [2, 3], or forehead [8]. Arranging only a small number of actuators in a single dimension or ring configuration means that the number of recognizable tactile patterns and their intuitiveness are limited. When increasing the number of actuators, the number of possible patterns grows as well, but their intuitiveness is limited when laying them out in a simple grid, requiring learning effort and being susceptible to misinterpretation. The system outlined in this work attempts to provide intuitive tactile guidance feedback, in which not just the direction (such as “head a little bit to the left”) but also certain necessary navigation instructions (e.g., “stop”, “stairs up”) are given.

Figure 2: HapticHead hidden under a beanie for social acceptability.

In previous work we presented HapticHead [10–12], a vibrotactile display around the head consisting of a bathing cap with a chin strap and a total of 24 vibrotactile actuators (see Fig. 1, 2). We were able to show that our prototype can be used in 3D guidance and localization scenarios for normally sighted users in both virtual (VR) and augmented reality (AR) with relatively high precision and low task completion times. The previous work also included a scenario in which blindfolded users were able to feel the presence of real physical objects in the 3D space around them and subsequently find and touch them (see Fig. 3).

This paper explores ideas for various spatial vibrotactile patterns around the head for the purpose of (a) blind user navigation and (b) notifications about important navigational instructions. We introduce the following research questions in light of the Hacking Blind Navigation workshop at CHI 2019 and will attempt to answer them in future work:

• What kinds of spatial tactile around-the-head patterns are suitable for (visually impaired) user guidance, and how easily understood are they?


• How well can users be guided through an obstacle course, including stairs, using previously determined intuitive tactile guidance patterns on the head?

INTUITIVE TACTILE PATTERNS

Figure 3: Blindfolded participant in a prior experiment, feeling the direction and distance to physical objects [12].

We conducted an in-depth interview with three staff members of the “Educational Center for the Blind” in Hannover, Germany. All of them were teaching navigation and daily-life skills to mostly underage visually impaired pupils; one staff member was herself strongly affected by visual impairment. We asked them how visually impaired people orient themselves and avoid obstacles in indoor and outdoor environments, and what feature set they would ideally expect from a navigation system. Furthermore, we specifically asked which kinds of navigation instructions they deemed essential when thinking about potentially navigating without a white cane.

Apart from the general direction, four additional navigation instructions were deemed essential. Based on our own experience in conducting studies with blindfolded participants, we identified one more necessary pattern: an instruction that refocuses participants’ attention on the navigation task when they diverge too much from the ideal path. Several participants in past experiments lost attention on the navigation task after a while and subsequently bumped into obstacles. The core set of essential instructions we found is:

• GO / START – e.g., a street light turns green
• STOP – e.g., imminent collision warnings
• UP – e.g., stairs up
• DOWN – e.g., stairs down
• ATTENTION – if the user drifts too far off the calculated ideal path

Different levels of vibrotactile intensity and frequency are hard to distinguish and even interfere with each other, while stimulus location and duration are easier to identify and can thus achieve a higher bandwidth of communicated information [7]. A unique feature of the HapticHead interface over related work is its spherical shape all around the head. This can be used to create richer and potentially more intuitive tactile patterns than other interfaces by using spatial metaphors. For example, when users feel a pattern moving from the chin over the sides of the head to the top, and they know that they have a limited choice of the aforementioned instructions, they intuitively know it most probably means “UP,” without any training, because of the intuitive spatial mapping.
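Such location-and-duration patterns can be sketched as timed actuator sequences. The following is a minimal illustration of this idea, not the actual HapticHead implementation: the actuator IDs, groupings, and timings are invented for the example.

```python
# Hypothetical sketch: the five essential instructions encoded as timed
# sequences of actuator groups on a head-worn grid. IDs and durations
# are illustrative assumptions, not the real HapticHead layout.

# Illustrative actuator groups (IDs are made up).
CHIN, SIDES, TOP, FRONT, BACK = [0, 1], [2, 3, 4, 5], [6, 7], [8, 9], [10, 11]

# Each pattern is a list of (actuator_group, duration_ms) steps played in order.
PATTERNS = {
    "UP":        [(CHIN, 150), (SIDES, 150), (TOP, 150)],    # sweep chin -> top
    "DOWN":      [(TOP, 150), (SIDES, 150), (CHIN, 150)],    # sweep top -> chin
    "GO":        [(FRONT, 100), (FRONT, 100)],               # double pulse at front
    "STOP":      [(CHIN + SIDES + TOP + FRONT + BACK, 400)], # everything at once
    "ATTENTION": [(BACK, 80), (FRONT, 80), (BACK, 80)],      # alternating jolt
}


def play(pattern_name):
    """Return a flat schedule of (actuator_id, start_ms, duration_ms) events."""
    t, events = 0, []
    for group, duration in PATTERNS[pattern_name]:
        events.extend((actuator, t, duration) for actuator in group)
        t += duration
    return events
```

The point of the sketch is the spatial metaphor: “UP” is a sweep from chin to crown, so its meaning can be guessed without training, whereas an arbitrary grid pattern would have to be memorized.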

CONTINUOUS 3D PATH GUIDANCE FOR NAVIGATION
In our past work we introduced a 3D guidance algorithm for arbitrary actuator configurations such as HapticHead [12]. This guidance algorithm proved to be quite efficient and fast in guiding study participants to look in the indicated direction in 3D, including elevation. The same algorithm can be used in navigation scenarios as well. While elevation is not as important in traditional flat outdoor navigation scenarios, it can certainly help in navigating spaces which are not just horizontal, such as stairs and paths up a hill. Furthermore, a positive side effect of using the proposed algorithm might be to counteract the tendency of some visually impaired individuals not to hold their head straight up when walking (forward, left, or right head posture) [16]. By permanently indicating the right direction, including elevation, a visually impaired individual could synchronize their walking direction and head posture.
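One common way to drive such direction guidance, which we sketch here only as an illustration (the published algorithm in [12] may differ), is to vibrate each actuator with an intensity that falls off with the angular distance between the actuator's direction on the head and the target direction. The function name and the `spread` parameter are assumptions made for this example.

```python
# Illustrative sketch of direction-to-intensity mapping for an arbitrary
# actuator configuration. Not the actual published guidance algorithm.
import numpy as np


def actuator_intensities(actuator_dirs, target_dir, spread=0.5):
    """Map a 3D target direction to per-actuator vibration intensities.

    actuator_dirs: one unit-ish vector per actuator, pointing outward from
    the head center. Actuators closest in angle to the target direction
    vibrate strongest; intensity falls to zero outside a cone whose width
    is controlled by `spread`.
    """
    target = np.asarray(target_dir, dtype=float)
    target /= np.linalg.norm(target)
    dirs = np.asarray(actuator_dirs, dtype=float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Cosine similarity in [-1, 1]; rescale so only actuators roughly
    # facing the target receive a nonzero intensity.
    similarity = dirs @ target
    return np.clip((similarity - (1.0 - spread)) / spread, 0.0, 1.0)
```

Because the mapping is defined per actuator direction, it works for any layout, including elevation: an actuator near the crown naturally dominates when the target direction points upward, e.g., toward the top of a staircase.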

OPEN CHALLENGES AND FUTURE WORK
The outlined system has already largely been implemented, and a first – still unpublished – indoor obstacle navigation study on the suitability and efficiency of the system has been conducted. The study had 15 blindfolded participants, two of whom were severely visually impaired. We are now in the process of refining the system and conducting another user study with more visually impaired participants. Open challenges that remain are (a) the noise generated by vibrotactile actuators close to the ear openings through bone conduction and (b) finding an unobtrusive and wearable position and orientation component which works precisely enough for outdoor navigation and dynamic obstacle avoidance outside a lab setting. One example of dynamic obstacles is people coming towards the user.

Challenge (a) did not affect all users: about half of the participants in our studies indicated that they did not mind the noise, as it was soft enough. The noise might be lowered even further by using a different actuator type such as the LoSound engine by Lofelt [13], which operates at a lower resonance frequency of 65 Hz, thus lowering the perceived sound level at similar tactile amplitudes. However, lowering the frequency has the disadvantage of also making the tactile sensation less prominent, as humans perceive vibrations best at frequencies between 150 and 300 Hz [7].

Challenge (b) might be solved in the future by technologies for precise self-localization of mobile or wearable devices. Example projects with this aim include Google ARCore (https://developers.google.com/ar/) and Apple ARKit (https://developer.apple.com/arkit/). While static obstacle detection is already possible, it is still difficult to detect dynamic obstacles fast enough, inform the user, and allow him or her to react.

REFERENCES
[1] Paul Bach-y-Rita, Carter C. Collins, Frank A. Saunders, Benjamin White, and Lawrence Scadden. 1969. Vision Substitution by Tactile Image Projection. Nature 221, 5184 (mar 1969), 963–964. https://doi.org/10.1038/221963a0
[2] P. Bach-y-Rita, K. A. Kaczmarek, M. E. Tyler, and J. Garcia-Lara. 1998. Form perception with a 49-point electrotactile stimulus array on the tongue: a technical note. Journal of Rehabilitation Research and Development 35, 4 (1998), 427–430.
[3] Paul Bach-y-Rita and Stephen W. Kercel. 2003. Sensory substitution and the human–machine interface. Trends in Cognitive Sciences 7, 12 (dec 2003), 541–546. https://doi.org/10.1016/J.TICS.2003.10.013
[4] Akansel Cosgun, E. Akin Sisbot, and Henrik I. Christensen. 2014. Evaluation of rotational and directional vibration patterns on a tactile belt for guiding visually impaired people. In IEEE Haptics Symposium (HAPTICS). IEEE Computer Society, 367–370.
[5] German Flores, Sri Kurniawan, Roberto Manduchi, Eric Martinson, Lourdes M. Morales, and Emrah Akin Sisbot. 2015. Vibrotactile guidance for wayfinding of blind walkers. IEEE Transactions on Haptics 8, 3 (jul 2015), 306–317. https://doi.org/10.1109/TOH.2015.2409980
[6] Wilko Heuten, Niels Henze, Susanne Boll, and Martin Pielot. 2008. Tactile wayfinder: A non-visual support system for wayfinding. In Proceedings of the 5th Nordic Conference on Human-Computer Interaction: Building Bridges. 172–181. https://doi.org/10.1145/1463160.1463179
[7] Lynette A. Jones and Nadine B. Sarter. 2008. Tactile Displays: Guidance for Their Design and Application. Human Factors: The Journal of the Human Factors and Ergonomics Society 50, 1 (feb 2008), 90–111. https://doi.org/10.1518/001872008X250638
[8] H. Kajimoto, Y. Kanno, and S. Tachi. 2006. Forehead electro-tactile display for vision substitution. In Proc. EuroHaptics (2006). http://lsc.univ-evry.fr/~eurohaptics/upload/cd/papers/f62.pdf
[9] Brian F. G. Katz, Slim Kammoun, Gaëtan Parseihian, Olivier Gutierrez, Adrien Brilhault, Malika Auvray, Philippe Truillet, Michel Denis, Simon Thorpe, and Christophe Jouffrais. 2012. NAVIG: augmented reality guidance system for the visually impaired. Virtual Reality 16, 4 (nov 2012), 253–269. https://doi.org/10.1007/s10055-012-0213-6
[10] Oliver Beren Kaul, Kevin Meier, and Michael Rohs. 2017. Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head. Springer International Publishing, Cham, 289–298. https://doi.org/10.1007/978-3-319-68059-0_19
[11] Oliver Beren Kaul and Michael Rohs. 2016. HapticHead: 3D Guidance and Target Acquisition through a Vibrotactile Grid. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’16). ACM Press, New York, NY, USA, 2533–2539. https://doi.org/10.1145/2851581.2892355
[12] Oliver Beren Kaul and Michael Rohs. 2017. HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17). ACM Press, New York, NY, USA, 3729–3740. https://doi.org/10.1145/3025453.3025684
[13] Lofelt GmbH. 2018. Lofelt engine. https://lofelt.com/technology
[14] Jack M. Loomis, Reginald G. Golledge, Roberta L. Klatzky, and James R. Marston. 2007. Assisting wayfinding in visually impaired travelers. In Gary L. Allen (Ed.), Applied Spatial Cognition: From Research to Cognitive Technology. 179–202.
[15] Sabrina Panëels, Margarita Anastassova, and Lucie Brunet. 2013. TactiPEd: Easy Prototyping of Tactile Patterns. Springer, Berlin, Heidelberg, 228–245. https://doi.org/10.1007/978-3-642-40480-1_15
[16] Graziela Morgana Silva Tavares, Caroline Cunha do Espírito Santo, Cristina Maria Santos Calmon, Paula Martins Nunes, Thiele De Cássia Libardoni, Larissa Sinhorim, Daniel Henrique Mota, and Gilmar Moraes Santos. 2014. Postural characterization in visually impaired young adults: preliminary study. Manual Therapy, Posturology & Rehabilitation Journal (2014). https://doi.org/10.17784/mtprehabjournal.2014.12.205
[17] K. Tsukada and M. Yasumura. 2004. ActiveBelt: Belt-Type Wearable Tactile Display for Directional Navigation. LNCS 3205, 384–399. http://link.springer.com/chapter/10.1007/978-3-540-30119-6_23