
Source: embeddedinteractions.com/Files/SOAS_Report_Priya_Vinod.pdf

Report: State of the Art Seminar

A Study of Body Gestures Based Input Interaction

for Locomotion in HMD-VR Interfaces in a Sitting

Position

By

Priya Vinod

(Roll No: 176105001)

Guided By

Dr. Keyur Sorathia

Department of Design

Indian Institute of Technology, Guwahati,

Assam, India


Table of Contents

1. Abstract
2. Introduction to VR
3. Navigation in VR
  3.1. Definition of Navigation
  3.2. Importance of Navigation in VR
  3.3. Taxonomy of Virtual Travel Techniques
  3.4. Quality Factors of Effective Travel Techniques
4. The Literature on Categories of Travel Techniques
  4.1. Artificial Locomotion Techniques
  4.2. Natural Walking Techniques
    4.2.1. Repositioning Systems
    4.2.2. Redirected Walking
    4.2.3. Proxy Gestures
5. Literature Review on Proxy Gestures for Travel in VR
  5.1. Standing Based Natural Method of Travel in VR
    5.1.1. Issues in Standing Position for Travel in VR
  5.2. Sitting Based Natural Method of Travel in VR
    5.2.1. Research Gap in Sitting Based Natural Method of Travel
6. Research Questions
7. Methodology
8. References


Figures

Figure 1 Taxonomy of Virtual Travel Techniques.

Figure 2 Nilsson, Serafin, and Nordahl’s (2016b) taxonomy of virtual travel techniques.

Figure 3 Artificial Locomotion Techniques.

Figure 4 Three categories of natural walking techniques.

Figure 5 Four examples of repositioning systems: (a) a traditional linear treadmill, (b) motorized

floor tiles, (c) a human-sized hamster ball, and (d) a friction-free platform.

Figure 6 The four types of gains used for perspective manipulation: (a) translation gain, (b)

rotation gain, (c) curvature gain, and (d) bending gain. Purple and blue lines indicate the real and

virtual transformations, respectively.

Figure 7 Three examples of proxy gestures: (a) the traditional WIP gesture, (b) tapping in place,

and (c) arm-swinging. The purple arrows illustrate the movement of the body parts used to

perform the gesture.

Figure 8 Gestures used for WIP (a) Marching (b) Wiping (c) Tapping.

Figure 9 Demonstration of Wii Lean Method.

Figure 10 (a) User using Myo Arm Band for Armswing (b) User Using the Myo Arm Band on foot

for exploring the VE.

Figure 11 User using the controllers of HTC Vive for arm swinging.

Figure 12 Leaning based interfaces for virtual travel in VE.

Figure 13 Pinchmove being used bimanually by the user.

Figure 14 Illustrations of different use cases of triggerwalking.

Figure 15 Leap Motion mounted on the Oculus Rift and hand poses for the LMTravel Technique.

Figure 16 Head Tilt is used to indicate the direction of travel.

Figure 17 User performing the (a) Tap and the (b) push gesture for locomotion.

Figure 18 (a) Head Direction Steering Technique (b) Four Head Direction Steering Technique.

Figure 19(a) Hand gestures in left hand.

Figure 19(b) Hand gestures in right hand.

Figure 20 (a) Point and Teleport (b) Point and Teleport with direction.

Figure 21 Travel task performed using the non-dominant hand.


1. Abstract

Recent developments in technology have finally brought virtual reality (VR) out of the laboratory and into the hands of developers and consumers. However, a number of challenges remain to be solved. Virtual travel is one of the most common and universal tasks performed inside virtual environments (VEs), yet enabling users to navigate in VEs is not a trivial challenge. A virtual space may be virtually infinite in size, so the user should be able to move freely to the extent that the virtual topography and architecture allow it. However, in the real world, the user's movement is confined to a limited physical space. One of the most important challenges is providing users with a natural travel experience in VR worlds that are larger than the tracked physical space, and providing the appropriate multisensory stimuli in response to their interactions with the virtual world. To explore large VEs, users must switch to an artificial locomotion technique (ALT) that is activated using a controller. Although numerous studies have investigated artificial locomotion techniques, these techniques still require the user to hold a physical device, which is cumbersome, increases fatigue, and induces spatial disorientation.

Locomotion based on proxy gestures requires the user to perform gestures that serve as a proxy for actual steps. It is possible to distinguish between different subcategories depending on what part of the body is used to perform the proxy gesture. Although locomotion using proxy gestures provides a natural mode of travel, it still causes fatigue when used to explore larger environments. Hence, in order to mitigate the fatigue due to prolonged standing, we explore natural gestural inputs for travel in a sitting posture. The current literature is limited in investigating the effectiveness of body gestures for travel in a sitting position while simultaneously performing multiple tasks, such as selection and manipulation, in VEs.

Moreover, although many techniques have been developed for moving in VEs, each technique works well in different scenarios and has been tested and evaluated in different environments and on different tasks. To the best of our knowledge, no studies have investigated the effect of environment size, path complexity, and the presence or absence of landmarks on sitting based locomotion techniques in HMD-based VR environments. This state-of-the-art seminar report presents an overview of the different categories of locomotion techniques. Further, a detailed literature review of standing based and sitting based travel techniques using proxy gestures is presented, along with their limitations and research gaps. It includes a review of both HMD-based and mobile VR interfaces. The document then presents two core research questions (RQs) to be pursued during the course of this Ph.D., and outlines the methodology to be followed in further studies.


2. Introduction to VR

Virtual Reality is a way for humans to visualize, manipulate, and interact with computers and extremely complex data (Isdale 1998). It is a simulation in which computer graphics are used to create a realistic-looking world. Moreover, the synthetic world is not static but responds to the user's inputs (such as gestures, verbal commands, etc.). This is a key feature of virtual reality: real-time interactivity. The term real time here means that the computer is able to detect a user's input and modify the virtual world instantaneously. People like to see things change on the screen in response to their commands and become captivated by the simulation (Burdea 2003).

Virtual reality (VR) is an emerging technology with a variety of potential benefits for many aspects of rehabilitation assessment, treatment, and research (Schultheis 2001). Continuing advances in VR technology, along with system cost reductions, have supported the development of more usable, useful, and accessible VR systems that can uniquely target the treatment of a wide range of psychological disorders (Rizzo 2005). VR exposure has been used in the treatment of Post-Traumatic Stress Disorder (PTSD) patients and has shown positive outcomes (Pair 2006). VR has rapidly evolved into a technology that today offers a cost-effective means of supporting the development of human skills in all manner of applications, from automotive engineering to defense, surgery to education, retail, petrochemical exploration, and heritage to micro-robotics, by means of interactive training (Stone 2001). Apart from the usage of VR in the medical and training fields, the introduction of low-cost head-mounted devices such as Google Cardboard has brought the immersive virtual experience into the homes of consumers and into the hands of players.

Researchers in 3D VEs have often categorized functionalities into four basic interactions: selection, manipulation, navigation, and data input (Mine, 1995; Bowman et al., 2004). Selection allows users to specify an object with which they wish to manipulate or interact. Manipulation has three parts: selection, spatial transformation, and release; the spatial transformations can be divided into translation, rotation, and scaling (Mendes 2016). Navigation refers to the movement of the user from one place to another in the VE (Bowman et al. 1998) and is further subdivided into two categories: (i) wayfinding and (ii) travel. Data input in virtual reality is considered an important task, with a wide range of use cases, from system commands to annotations in collaborative work situations (D. A. Bowman, Rhoton, & Pinho, 2002).

3. Navigation in VR

3.1. Definition of Navigation

Navigation is one of the most important 3D interaction tasks in immersive VEs. Bowman et al. (1997) described navigation as "movement in and around a virtual environment" and noted that it is possible to subdivide navigation into two components: wayfinding and travel. Wayfinding, the cognitive component of navigation, involves higher-level processes such as path planning and decision making. Thus, wayfinding amounts to orienting oneself in the environment and determining the path to the desired location, and may rely on natural and artificial aids such as landmarks, maps, and signs. Travel, the motor component of navigation, involves lower-level actions such as controlling the orientation and position of the virtual viewpoint and the movement velocity. Examples of the travel component in the real world are the physical act of walking or the manipulation of the steering wheel and throttle of a vehicle.

3.2. Importance of navigation in VE

Navigation is an important task, accompanied by selection or manipulation, in a VE. The need for context-appropriate navigation design is critical for three main reasons: many applications require the user to navigate large VEs, navigation in VEs is difficult, and disorientation is upsetting (Lawton 1994). Natural navigation is a multisensory activity, and several sources provide the user with information about the surrounding environment as well as the act of travel itself. This sensory information is broadly classified as optical flow, vestibular feedback, and proprioceptive feedback. The information originating from the vestibular system supports various oculomotor and postural reflexes, and it plays a central role in the spatial updating of the user (Waller and Hodgson 2013). Spatial updating refers to the ability of a person to determine his or her true body position, motion, and attitude relative to the surroundings. Proprioceptive cues refer to the ability to sense the orientation of our body in the environment. Any mismatch between these sensory cues and the sensation of self-motion causes motion sickness, which in turn affects the user's immersive experience. Hence there is a need to enable unconstrained, natural locomotion in virtual worlds that are larger than the tracked physical space, and to provide users with appropriate multisensory stimuli in response to their interactions with the VE, in order to improve presence and the overall user experience.

3.3. Taxonomy of virtual travel techniques

3.3.1. Bowman Taxonomy of virtual travel techniques

In order to understand travel techniques and their effects more deeply, Bowman et al. (1997) categorized them and broke them down into their lower-level components. The taxonomy splits a technique into three components: direction/target selection, velocity/acceleration selection, and conditions of input.

Figure 1: Taxonomy of Virtual Travel Techniques


Direction/Target Selection refers to the method by which the direction or object of travel is specified. Depending on whether control of direction is continuous or not, the user may either "steer" (choose a direction) or simply choose a target object. Gaze-directed steering, in which the user moves in the direction she is looking, and pointing, where the user points in the direction she wants to go, are two popular steering techniques.

Velocity/Acceleration Selection techniques allow the user to vary the speed of travel. Many VE applications dispense with this entirely and use a constant travel velocity. However, several techniques have been proposed, including continuous gestures to select velocity, the use of props such as foot pedals, and adaptive system-controlled speed.

Conditions of Input refers to the input required by the system in order to begin, continue, and end travel. The user may be in constant motion, in which case no input may be required. Alternatively, the system may require continuous input to determine the user's state, or simple inputs at the beginning and/or end of a movement.
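As an illustration, the three components of the taxonomy can be made concrete in a few lines of code. The sketch below is hypothetical (function and parameter names are our own, not from Bowman et al.) and implements gaze-directed steering with a constant velocity and a simple start/stop input condition:

```python
import numpy as np

def gaze_directed_step(position, gaze_dir, speed, dt, trigger_held):
    """One travel update, decomposed along Bowman et al.'s taxonomy:
    direction selection  -> the normalized gaze vector,
    velocity selection   -> a constant speed (m/s),
    condition of input   -> move only while a trigger is held."""
    if not trigger_held:                       # condition of input: stop
        return np.asarray(position, dtype=float)
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)                  # direction from head gaze
    return np.asarray(position, dtype=float) + d * speed * dt
```

Swapping the gaze vector for a hand-pointing vector turns the same skeleton into the pointing technique, and replacing the constant speed with a function of some input signal changes only the velocity-selection component.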

3.3.2. Nilsson, Serafin, and Nordahl's (2016b) taxonomy of virtual travel techniques

While this decomposition provides a useful lens through which to view individual travel techniques, it does not provide a broader picture of how virtual travel may be accomplished. Inspired by existing categorizations of travel techniques (Bowman et al. 2004; Slater and Usoh 1994; Suma et al. 2012; Wendt 2010), Nilsson et al. (2016b) divide existing travel techniques into dichotomous categories. Specifically, they distinguish between travel techniques based on whether the user is stationary or moving, whether the techniques involve virtual vehicles or not, and whether the techniques qualify as mundane or magical. Figure 2 visualizes the taxonomy, which is described in more detail in the following.

Figure 2 : Nilsson, Serafin, and Nordahl’s (2016b) taxonomy of virtual travel techniques

Mobile and Stationary Travel Techniques

Travel techniques are categorized based on the presence or absence of physical movement (Bowman et al. 1999; Wendt 2010). This distinction is important because some commercially available VR systems do not include positional tracking. Specifically, this is the case for VR systems powered by mobile phones (e.g., Samsung Gear VR, Google Daydream, and Google Cardboard), which only track the orientation of the user's head.

Mundane and Magical Travel Techniques

Any technique that allows the user to travel in a manner that cannot be accomplished in the real world qualifies as magical, and techniques that mimic real travel are considered mundane. Magical travel techniques may allow users to travel great distances within the VE without requiring any physical movement. Mundane travel techniques are sometimes easier to use because they are based on a familiar type of interaction, and the scenario itself may demand a technique that is possible in real life, for example in training scenarios.

Vehicular and Body-Centric Travel Techniques

Techniques that simulate travel by means of a virtual vehicle are called vehicular travel techniques, and techniques that simulate movement generated by using the body to exert forces on the environment (e.g., walking, running, or swimming) are called body-centric travel techniques.
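The three dichotomies lend themselves to a compact encoding. The sketch below is a hypothetical illustration; the example classifications reflect our reading of the taxonomy, not labels taken from the original paper:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TravelTechnique:
    """A travel technique placed along Nilsson et al.'s three dichotomies."""
    name: str
    mobile: bool      # involves physical movement (vs. stationary)
    vehicular: bool   # simulates a vehicle (vs. body-centric)
    magical: bool     # impossible in the real world (vs. mundane)

# Example placements in the taxonomy (our reading):
redirected_walking = TravelTechnique("redirected walking", True, False, False)
teleportation      = TravelTechnique("teleportation",      False, False, True)
walking_in_place   = TravelTechnique("walking in place",   False, False, False)
```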

3.4. Quality Factors of effective travel techniques

Bowman et al. proposed a list of quality factors which represent specific attributes of effectiveness for virtual travel techniques. An effective travel technique promotes:

1. Speed - appropriate velocity
2. Accuracy - proximity to the desired target
3. Spatial Awareness - the user's implicit knowledge of his or her position and orientation within the environment during and after travel
4. Ease of Learning - the ability of a novice user to use the technique
5. Ease of Use - the complexity or cognitive load of the technique from the user's point of view
6. Information Gathering - the user's ability to actively obtain information from the environment during travel
7. Presence - the user's sense of immersion or "being within" the environment

The quality factors allow a level of indirection in mapping specific travel techniques to particular VE applications. Application developers can specify what level of each quality factor is important for their application, and choose the technique which comes closest to that specification. For example, in an architectural walkthrough, high levels of spatial awareness, ease of use, and presence might be required, whereas high speed might be unimportant. On the other hand, in an action game, one might want to maximize speed, accuracy, and ease of use, with little attention to information gathering.
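The level of indirection described above can be sketched as a simple nearest-match search over quality-factor profiles. Everything in the sketch below is hypothetical: the 0-1 factor scores are illustrative placeholders, not measured values from the literature:

```python
def best_technique(desired, techniques):
    """Pick the technique whose quality-factor profile is closest
    (in summed absolute difference) to the application's desired levels."""
    def dist(profile):
        return sum(abs(profile.get(f, 0.0) - lvl) for f, lvl in desired.items())
    return min(techniques, key=lambda name: dist(techniques[name]))

# Illustrative (made-up) 0-1 scores for two techniques.
techniques = {
    "teleportation":    {"speed": 0.9, "spatial_awareness": 0.4,
                         "ease_of_use": 0.8, "presence": 0.5},
    "walking_in_place": {"speed": 0.4, "spatial_awareness": 0.8,
                         "ease_of_use": 0.6, "presence": 0.8},
}
# An architectural walkthrough prioritizes awareness, ease of use, presence.
walkthrough = {"spatial_awareness": 1.0, "ease_of_use": 1.0, "presence": 1.0}
```

With these placeholder scores, `best_technique(walkthrough, techniques)` selects walking in place, matching the intuition in the text, while a priority vector weighted toward speed and ease of use would instead select teleportation.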

4. Literature on Categories of Travel Techniques

Humans navigate their surroundings in the real world by foot, and they generally do so with relative ease, without assigning much explicit attention to the performed movements or to the sensory stimuli produced as a result of these movements. The literature on travel techniques is broadly divided into two categories: (i) artificial locomotion techniques and (ii) natural walking techniques. They are discussed below.

4.1. Artificial Locomotion Techniques

Currently, to navigate beyond the confines of the available tracking space, many VR applications require users to switch to an artificial locomotion technique (ALT) to reach destinations that they cannot walk to. ALTs are activated by the user with a controller and include popular techniques such as teleportation, full locomotion (e.g., directional movement), vehicle movement, and viewpoint translations. ALTs suffer from a number of limitations regarding presence, VR sickness, and efficiency. Because we use our legs for locomotion in the real world, transitioning from natural walking to a handheld, controller-activated locomotion method may break presence (Tregillus and Folmer 2016). Overloading the hands with navigation functionality is not only unnatural but also increases cognitive load and decreases efficiency (LaViola Jr et al. 2001), especially when the controller is also used for performing time-critical actions such as shooting and interacting with objects.

VR sickness is a major concern for the mass adoption of VR (Nelson 2015) and is, among other factors, caused by a sensory mismatch between the visual and the proprioceptive/vestibular systems. Specifically, vection, i.e., the visually induced perception of self-motion in the absence of real movement, is a significant cause of VR sickness (Bonato et al. 2009). Natural walking gives rise to vestibular and proprioceptive afferents, which provide powerful cues about self-motion (Bent et al. 2004). The absence of such feedback in controller-activated ALTs, however, has been found to increase the occurrence of VR sickness (Bowman et al. 2004; Usoh et al. 1999). Despite these limitations, ALTs do have major advantages depending on the application: techniques such as teleportation or full locomotion require significantly less physical effort than natural walking, allow a larger space to be traversed, and consequently allow users to navigate much faster. Some of the controllers used for ALTs are:

a) Gamepad
b) Speed Pad
c) HTC Vive controllers
d) Oculus Rift controllers

Figure 3 : Artificial Locomotion Techniques

4.2. Natural Walking in Virtual Environments

In the literature, numerous solutions to the problem of allowing users to naturally traverse large VEs have been proposed. Generally, these techniques fall into one of three categories: repositioning systems, redirected walking, and proxy gestures.

Figure 4 Three categories of natural walking techniques

4.2.1. Repositioning Systems

Repositioning systems essentially counteract the forward movement of the user and thereby ensure that he or she remains in a relatively fixed position; they thus offer stationary virtual travel. Repositioning systems are further classified into (i) active and (ii) passive repositioning systems. Active repositioning often relies on elaborate mechanical setups in order to cancel the user's forward movement. One of the simplest examples of an active repositioning system is the traditional linear treadmill (Feasel et al. 2011). An inherent disadvantage of such treadmills is that the user can only walk forward; if the application requires turning, this has to be done in an indirect manner, such as with a joystick. Natural walking has also been facilitated using omnidirectional treadmills that allow the user to freely walk in any direction. A potential limitation of such techniques is that the motion of the treadmill may cause the user to lose his or her balance during turns and sidesteps. Other examples of repositioning systems include motorized floor tiles that move in the direction opposite to the walker's (Iwata et al. 2005), cancellation of the walker's steps through strings attached to his or her shoes (Iwata et al. 2007), and a human-sized hamster ball (Medina et al. 2008). Three examples of active repositioning systems can be seen in Figures 5(a) to (c). Passive repositioning offers a simpler and less expensive alternative to active repositioning. Generally, passive repositioning systems rely on friction-free platforms that prevent the forces generated during each step from moving the user forward. The Virtuix Omni, Cyberith's Virtualizer, and KatVR's Kat Walk are commercial versions of this approach. An example of a friction-free platform can be seen in Figure 5(d).

Figure 5 Four examples of repositioning systems: (a) a traditional linear treadmill, (b) motorized floor tiles, (c) a human-sized hamster ball, and (d) a friction-free platform.

4.2.2. Redirected Walking

Redirected walking is an interactive locomotion technique for VEs that captures the benefits of real walking while extending the possible size of the VE. Real walking, although natural and producing a high subjective sense of presence, limits VEs to the size of the tracked space. Redirected walking addresses this limitation by interactively and imperceptibly rotating the virtual scene about the user (Razzaque et al. 2001). Redirected walking involves physical walking and therefore qualifies as a mobile travel technique in the taxonomy of Nilsson et al. (2016b) discussed above. Generally speaking, redirected walking refers to a collection of approaches that make it possible to control the user's path through the physical environment by manipulating the stimuli used to represent the VE (Suma et al. 2012). It is possible to distinguish between techniques that accomplish redirection based on either perspective manipulation or environmental manipulation.

Perspective Manipulation - Redirection techniques relying on perspective manipulation apply changes to the user's virtual point of view. Common ways of redirecting users with this approach include the application of imperceptible translation, curvature, rotation, and bending gains to the movement of the virtual camera.

Environmental Manipulation - Redirection techniques relying on environmental manipulation accomplish the redirection by changing the properties of the VE. Suma et al. (2011a) devised an approach that redirects users through subtle manipulation of the virtual architecture, inspired by change blindness (i.e., the inability of an individual to detect changes in the environment (Matlin 2009)). Moreover, Suma et al. (2012) proposed a technique dubbed Impossible Spaces that makes it possible to compress virtual interior environments into comparatively smaller physical spaces by means of self-overlapping architecture. Finally, if the aim is not to replicate the spatial layout of a real space, the Flexible Spaces technique can provide unrestricted walking within a dynamically generated interior VE (Vasylevska and Kaufmann 2017b).

Figure 6 The four types of gains used for perspective manipulation: (a) translation gain, (b) rotation gain,

(c) curvature gain, and (d) bending gain. Purple and blue lines indicate the real and virtual transformations,

respectively
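At their core, the gains used for perspective manipulation are small multiplicative corrections applied every tracking frame. The sketch below is hypothetical (names and values are illustrative; deployed systems keep the gains below perceptual detection thresholds) and shows how rotation and curvature gains update the yaw of the virtual camera:

```python
def redirect_yaw(virtual_yaw, real_yaw_delta, step_length,
                 rotation_gain=1.0, curvature_gain=0.0):
    """One frame of perspective manipulation (angles in radians).

    rotation_gain  scales the user's real head rotation; a gain != 1
                   turns the scene slightly faster or slower than the head.
    curvature_gain adds rotation proportional to distance walked (rad/m),
                   so a straight virtual path maps to a real-world arc.
    """
    virtual_yaw += rotation_gain * real_yaw_delta   # rotation gain
    virtual_yaw += curvature_gain * step_length     # curvature gain
    return virtual_yaw
```

For example, a curvature gain of 0.045 rad/m bends a straight virtual walk into a real-world arc of radius 1/0.045, roughly 22 m; translation gain works analogously on the translational component of the camera.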

4.2.3. Proxy Gestures

Locomotion based on proxy gestures requires the user to perform gestures that serve as a proxy for actual steps. It is possible to distinguish between different subcategories depending on what part of the body is used to perform the proxy gesture. Because the aim is to produce a walking experience that resembles real walking, proxy gestures often rely on lower-body movement. The most common approach is the so-called walking-in-place (WIP) technique. When traveling through virtual worlds using such techniques, the user performs stepping-like movements on the spot.


Figure 7 Three examples of proxy gestures: (a) the traditional WIP gesture, (b) tapping

in place, and (c) arm-swinging. The purple arrows illustrate the movement of the body parts used to perform the gesture

WIP techniques are convenient and inexpensive, can provide some proprioceptive feedback, and are comparable to real walking in terms of user performance on simple spatial tasks. WIP techniques have been implemented using commercially available hardware such as Microsoft's Kinect, Nintendo's Wii Balance Board, and the inertial sensors of a mobile HMD. Most WIP techniques involve a gesture reminiscent of marching on the spot, walking up a flight of stairs, tapping the heels, etc. Proxy gestures are discussed further in the following chapters.
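A minimal WIP detector can be built from nothing more than the vertical position of a tracked foot (or the head-bob signal of a mobile HMD). The sketch below is a deliberately simplified, hypothetical discrete gait-event detector; full WIP systems in the literature add timing, smoothing, and velocity models:

```python
def count_wip_steps(heights, baseline=0.0, threshold=0.05):
    """Count walking-in-place steps from sampled vertical positions (m).

    A step is registered each time the tracked limb rises more than
    `threshold` above `baseline` (lift event) and then returns
    below it (strike event).
    """
    steps, lifted = 0, False
    for h in heights:
        if not lifted and h - baseline > threshold:
            lifted = True          # lift event: foot/heel left the ground
        elif lifted and h - baseline <= threshold:
            lifted = False
            steps += 1             # strike event: completes one step
    return steps
```

Each counted step would then drive a fixed or cadence-dependent virtual displacement, playing the role that a real step plays in natural walking.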

Selection of Proxy Gestures for my Research

All three approaches may serve as potential solutions to the problem of traveling large VEs. That said, they are not equally feasible outside a laboratory setting, where spatial and technological constraints are even more prominent, i.e., in the household of an average consumer. Recent technological developments such as the Microsoft Kinect and the Oculus Rift mean that immersive VEs no longer have to be confined to the laboratories of public and private institutions. However, the limited space available to many consumers effectively renders redirection techniques ineffective, since these are contingent upon some degree of physical movement on behalf of the user. At worst, this movement can be dangerous, since immersed users may be unaware of real-world obstacles and collide with them; a user wearing an HMD therefore needs to be monitored by another person who can intervene when the user reaches the boundaries of the tracked space. Similarly, repositioning techniques require large mechanical setups, such as omnidirectional treadmills, and cannot be considered a feasible alternative for home use within the foreseeable future. For the moment, this leaves proxy gesture techniques as the most promising of the three approaches. While proxy gesture techniques in their current forms cannot compete with real walking in terms of simplicity, straightforwardness, and naturalness, studies do suggest that they may elicit a stronger sensation of presence than techniques where users push a button to propel themselves forward. Although proxy gestures do not impart vestibular feedback, their inherent safety, convenience, and cost effectiveness are the factors motivating this research on the naturalness of proxy gestures for traveling in large VEs. We first discuss the literature on proxy gestures for travel in VR in a standing position, their advantages, and the identified research gaps. We then discuss proxy gestures for travel in VR in a sitting position and identify the research gaps.


5. Literature review on Proxy Gestures for Travel in VR

Travel refers to the movement of the user from one place to another in the VE. This section presents a literature review of proxy gestures used in (a) a standing position and (b) a sitting position. It includes techniques from all technology platforms: desktop-based VR, projection-based VR, HMD-based VR, and mobile-based VR interfaces. The objective of including all technology platforms is to cover all potential travel techniques used in the past two decades.

5.1. Standing Based Natural Method of Travel in VR

In the standing based natural method of travel, the user performs the proxy gestures in a standing position. A number of gestural inputs have been developed based on the biomechanical gait cycle of human walking. It is possible to distinguish between different subcategories depending on what part of the body is used to perform the proxy gesture; these gestures can be classified as upper-body and lower-body gestures. The different gestures studied in the literature are elaborated below.

Gestural Input for Locomotion

Walking-in-place (WIP) techniques are the most common lower body gestural inputs for travel. When traveling through virtual worlds using such techniques, the user performs stepping-like movements on the spot. The steps in place may be registered based on a physical interface detecting discrete gait events (e.g., Bouguila et al. (2005, 2003)) or using motion tracking systems enabling continuous detection of the position and velocity of limbs (e.g., Bruno et al. (2013), Feasel et al. (2008), Slater et al. (1993), and Wendt et al. (2010)).

Figure 8 Gestures used for WIP (a) Marching (b) Wiping (c) Tapping

The steps in place registered by motion tracking systems include the marching gesture, where the user alternately lifts each foot off the ground by raising the thighs in front of the body. This gesture produces proprioceptive feedback resembling real walking but consumes more energy. Nilsson et al. 2013a proposed the wiping and tapping gestures for WIP. In the wiping gesture, the user alternately bends each knee, moving one lower leg backward; in the tapping gesture, the user in turn lifts each heel without breaking contact with the ground. The gestures were evaluated in terms of perceived naturalness, presence, and real-world positional drift. The tapping gesture was perceived as significantly more natural than the wiping gesture, was experienced as significantly less strenuous than the other two techniques, and resulted in significantly less positional drift. Later, Nilsson et al. 2013b studied the perceived naturalness of virtual locomotion methods devoid of explicit leg movements. Two gestures were studied. The first was hip movement, in which the user alternately swings the hips right and left while keeping both feet grounded to produce virtual movement; this method generated proprioceptive feedback and reduced unintended positional drift (UPD). The second was arm swinging, where the user produces virtual movement solely by swinging both arms back and forth; while this method also generates proprioceptive feedback, it may limit the user's ability to use the arms and hands during locomotion (e.g., manipulating virtual objects or gesturing as part of social interactions). Williams et al. 2013 developed a simple walking-in-place (WIP) method using the Microsoft Kinect to explore a VE with a head-mounted display (HMD); as part of their analysis, they compared spatial orientation under gaze-directed versus torso-directed locomotion. Harris et al. 2014 describe a method of physically leaning on a Nintendo Wii Fit Balance Board to explore a VE (figure 9), an inexpensive way of exploring a large immersive VE.

Figure 9 Demonstration of Wii Lean Method

McCullough et al. 2015 implemented a simple arm-swinging algorithm using an inexpensive wearable device, the Myo armband, that allows a user to freely explore an HMD-based VE. To use the arm-swinging method, subjects wear the Myo armband on the thickest part of the forearm, just below the elbow, as shown in figure 10(a). Wilson et al. 2016 implemented walking in place using the simple, inexpensive accelerometer in the Myo band worn on the foot, as shown in figure 10(b).


Figure 10 (a) User using Myo Arm Band for Armswing (b) User Using the Myo Arm Band on foot for exploring the VE


Pai et al. 2017 proposed a method where the user simply swings their arms naturally, holding the controllers, to navigate in the direction of the swing, without any foot or head movement. The user performing the arm swing is shown in figure 11.

Figure 11 User using the controllers of HTC Vive for arm swinging

All of the above methods work with head-mounted displays such as the Oculus Rift and HTC Vive; however, these gestures are difficult to implement in mobile contexts, as they typically rely on bulky controllers or an external camera, which are unavailable on low-cost head-mounted devices. Tregillus and Folmer 2016 proposed VR-STEP, a WIP implementation that uses a real-time pedometer to implement virtual locomotion for mobile VR. VR-STEP requires no instrumentation beyond a smartphone's inertial sensors. A user study comparing VR-STEP with a commonly used auto-walk navigation method found no significant difference in performance or reliability, though VR-STEP was found to be more immersive and intuitive.
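As a rough illustration of how a pedometer-style WIP detector can work from inertial data alone, the sketch below counts a step whenever the acceleration magnitude exceeds a threshold above gravity, with a refractory period to suppress jitter. The thresholds and the detection rule are illustrative assumptions, not VR-STEP's published algorithm.

```python
import math

# Hypothetical step detector sketching the idea behind pedometer-based WIP:
# count a step each time the acceleration magnitude rises well above gravity,
# with a refractory period so one footfall is not counted twice.
GRAVITY = 9.81           # m/s^2
STEP_THRESHOLD = 2.0     # m/s^2 above gravity (tunable assumption)
MIN_STEP_INTERVAL = 0.3  # seconds between steps (debounce, assumption)

class StepDetector:
    def __init__(self):
        self.last_step_time = -1.0
        self.steps = 0

    def update(self, accel_xyz, t):
        """Feed one accelerometer sample (m/s^2) at time t (seconds)."""
        magnitude = math.sqrt(sum(a * a for a in accel_xyz))
        if (magnitude - GRAVITY > STEP_THRESHOLD
                and t - self.last_step_time >= MIN_STEP_INTERVAL):
            self.steps += 1
            self.last_step_time = t
        return self.steps

detector = StepDetector()
# Simulated samples: quiet stance, a step spike, jitter inside the debounce
# window (ignored), then a later genuine step.
samples = [((0, 0, 9.81), 0.0), ((0, 0, 13.0), 0.1),
           ((0, 0, 13.0), 0.2), ((0, 0, 9.81), 0.3),
           ((0, 0, 13.5), 0.6)]
for accel, t in samples:
    steps = detector.update(accel, t)
```

In a WIP scheme of this kind, the detected step rate would then drive the virtual forward speed.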

5.1.1. Issues in Standing Position for Travel in VR

Overall, travel techniques that use proxy gestures in a standing posture have been studied extensively. A number of gestural inputs have been developed based on the biomechanical gait cycle of human walking. Variations of the WIP technique such as marching, wiping, tapping, WIP-K, and WIP-A have been discussed. Arm-swing gestures using controllers, the Myo armband, and controller-free tracking have also been elaborated, as have leaning-based methods such as the human joystick. Each of these techniques measures different quality parameters in a different VE built by its designers. The techniques have not been compared across VEs of different sizes, and their effects on users over longer usage have not been studied.

Many real-world applications of VR, such as architectural walkthroughs, tourism, and post-traumatic rehabilitation, require users to traverse a large VE, which takes more time and imposes more physical strain. Moreover, beyond typical users, groups such as the elderly and the physically disabled, who have motor constraints, would find it difficult to traverse such VEs in a standing position. Hence, to mitigate the fatigue of prolonged standing, we explore gestural inputs for travel in a sitting posture in the following section.


5.2. Sitting Based Natural Method of Travel in VR

Using proxy gestures in a standing position induces proprioceptive feedback for the user. However, performing these gestures to traverse large VEs causes fatigue and strain, as the gestures must be repeated for a long duration. The user would therefore experience less physical strain in a sitting position. Furthermore, proxy gestures in a sitting position can serve different user groups, such as the elderly and the physically disabled, in exploring the VE. Many of the methods performed in a standing position can also be performed while sitting. This section discusses the literature on sitting-based methods in three categories (leaning-based, controller gesture-based, and controller-less gesture-based), along with their advantages and research gaps.

Leaning Based Methods

One possible solution for both increasing ease of movement and mitigating motion sickness is user-powered motion cueing: small physical motions that indicate movement without having to actually travel the full distance, produced by leaning while seated or standing. User-powered motion cueing can help create a simple and relatively inexpensive locomotion system, whereas the classic approach of computer-driven, motorized platforms requires large and expensive setups to create motion similar to real-world motion. Moreover, interfaces that use user-powered motion cueing can be less dangerous and less technically complex than large motorized platforms, which often require safety harnesses (Minetti et al. 2003; Lichtenstein et al. 2007), because they rely on simple perceptual deceptions instead of physically moving users to give them the feeling of movement. For example, Beckhaus and Riecke both developed seated interfaces with active motion cueing so that users could navigate 3D environments without the space requirements of a larger physical system, or where walking was infeasible (Beckhaus 2005; Riecke 2006).

Beckhaus et al. 2005 designed ChairIO, a stool-based chair interface that supports 3D motion, both directional and rotational. ChairIO allows control along four axes: tilt left/right, tilt front/back, move up/down, and rotate left/right. To operate ChairIO, users sit on the device and shift their body weight to tilt and rotate the chair in any direction; this physical movement is mapped to viewpoint movement in the VE. The ChairIO interaction was augmented with a game-console gun to form a new interaction method for first-person shooters. Users found that ChairIO succeeds in navigation tasks, has a fast learning curve, is fun, and is natural to use.

Kitson et al. 2015 used NaviChair, an embodied interface evaluated with a pointing task for navigating a VE. They used a user-powered stool chair called the Swopper, whose seat is mounted on springs. The chair acts as an input device: the user tilts the seat forward/backward by shifting their weight to move in the desired direction, and controls simulated yaw rotations by rotating the chair slightly to the left/right away from the default forward direction. The degree of chair tilt was mapped to the simulated motion using velocity rate control; that is, the simulated rotation speed was proportional to the amount of chair rotation away from the default forward orientation.

MuvMan was another user-powered stool chair, similar to NaviChair, except that the chair itself is stiffer, has a small backrest, and has a default forward inclination that allows the user to sit upright and control the chair's motion with less movement. MuvMan was positioned in low-height mode instead of its normal high-stool mode because participants in pilot tests did not feel safe in the high-stool mode and had less control when pushing the chair seat backward. Although it has the same height as the NaviChair, there is a fundamental difference in the precision and smoothness of turning.

Another leaning-based method is the head-directed interface, which uses the head movement of the head-mounted display (HMD) to control travel: pitch the head forward/backward to translate forward/backward, and rotate the head around the yaw axis to turn. Riecke et al. 2017 used a swivel chair, an everyday office chair where users tilt the chair forward/backward to move in that direction and rotate the chair to turn. They chose this interface because it is very accessible (already present in the workplace), more affordable than the NaviChair and MuvMan, and feels very safe and comfortable with its high backrest.

Figure 12 Leaning based interfaces for virtual travel in VE
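The velocity rate control used by these chair interfaces can be sketched as follows: tilt beyond a small deadzone maps proportionally to translation speed, and chair rotation maps to yaw rate. The deadzone and gain constants are illustrative assumptions rather than values from the studies above.

```python
# Illustrative rate-control mapping in the spirit of the leaning chair
# interfaces: chair tilt (degrees) sets forward speed, chair rotation away
# from the default forward orientation sets yaw rate. All constants are
# assumptions for illustration.
DEADZONE_DEG = 2.0   # ignore small postural sway
SPEED_GAIN = 0.25    # (m/s) of translation per degree of tilt
TURN_GAIN = 6.0      # (deg/s) of yaw per degree of chair rotation
MAX_SPEED = 3.0      # m/s cap on translation speed

def rate_control(tilt_deg, rotation_deg):
    """Return (forward_speed in m/s, yaw_rate in deg/s) for one frame."""
    def apply(value, gain):
        # Deadzone, then a rate proportional to the deflection beyond it.
        if abs(value) < DEADZONE_DEG:
            return 0.0
        sign = 1.0 if value > 0 else -1.0
        return sign * (abs(value) - DEADZONE_DEG) * gain

    speed = max(-MAX_SPEED, min(MAX_SPEED, apply(tilt_deg, SPEED_GAIN)))
    yaw_rate = apply(rotation_deg, TURN_GAIN)
    return speed, yaw_rate
```

Because the output is a rate rather than a position, the user holds a small lean to keep moving and returns upright to stop, which is what makes the small physical motion stand in for large virtual travel.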

Controller Gesture Based

Pai et al. 2018 proposed PinchMove, a highly accurate navigation mechanic that uses pinch gestures and manipulation of the viewport, intended for confined environments where accurate movement is preferred. They ran a pilot study to first determine the degree of simulator sickness caused by the mechanic, and a comprehensive user study to evaluate its accuracy in a VE. They found that an 80° tunneling effect was suitable for reducing motion sickness in PinchMove. It is a navigation solution for accurate movement in near-field VEs; both unimanual and bimanual navigation were compared.

Figure 13 Pinchmove being used bimanually by the user

To close the gap between travel over long and short distances, Sarupuri et al. 2017 introduced TriggerWalking, a biomechanically inspired locomotion user interface for efficient, realistic virtual walking. The idea is to map the human's embodied ability for walking to a finger-based locomotion technique. Using the triggers of common VR controllers, the user can generate near-realistic virtual bipedal steps. The authors analyzed head oscillations of VR users walking with a head-mounted display and used the data to simulate realistic walking motions in response to the trigger pulls. They evaluated how the simulation of walking biomechanics affects task performance and spatial cognition. TriggerWalking can be used in both standing and sitting postures. Figure 14 below shows illustrations of different use cases of TriggerWalking.


Figure 14 Illustrations of different use cases of triggerwalking
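A minimal sketch of a TriggerWalking-style mapping is shown below, assuming a fixed step length per trigger pull and a sinusoidal head bob standing in for the measured head oscillations; both constants are assumptions, not values from the paper.

```python
import math

# Hypothetical sketch of a TriggerWalking-style mapping: each alternating
# trigger pull produces one virtual bipedal step of fixed length, and the
# camera height follows a sinusoidal bob approximating head oscillation
# during gait. STEP_LENGTH and BOB_AMPLITUDE are illustrative assumptions.
STEP_LENGTH = 0.7     # metres advanced per step
BOB_AMPLITUDE = 0.03  # metres of vertical head oscillation

class TriggerWalker:
    def __init__(self, eye_height=1.6):
        self.distance = 0.0
        self.eye_height = eye_height
        self.step_count = 0

    def pull_trigger(self):
        """One trigger pull = one virtual step forward."""
        self.step_count += 1
        self.distance += STEP_LENGTH

    def camera_height(self, phase):
        """phase in [0, 1): position within the current step cycle."""
        return self.eye_height + BOB_AMPLITUDE * math.sin(2 * math.pi * phase)

walker = TriggerWalker()
for _ in range(4):  # four alternating trigger pulls
    walker.pull_trigger()
```

The simulated bob is what distinguishes this family of techniques from plain joystick translation: the viewpoint oscillates as it would during real gait, which the authors link to more realistic walking perception.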

Controller-less Gesture Travel Techniques

The following techniques use gestures that do not require controllers. Cardoso 2017 developed LMTravel, based on hand gestures captured by a Leap Motion, which control locomotion as follows: (1) movement start/stop: opening both hands triggers the start of movement (figure 15-a), and closing both hands stops it (figure 15-b); (2) movement speed: the number of stretched fingers defines the speed, with one finger corresponding to the lowest speed and all five fingers to the highest (figure 15-c); (3) rotation: the tilt angle of the right hand is mapped to the camera's rotation. The user study for this technique revealed that, without any physical support, the required arm position was not comfortable, and prolonged use caused fatigue similar to the gorilla-arm effect. The technique can also be used without rotation control by the right hand, using the rotation of the headset instead (i.e., movement is directed by the user's gaze). In this case, once movement has started, the right hand can be lowered and speed can be controlled by the left hand (figure 15-d).

Figure 15 Leap Motion mounted on the Oculus Rift and hand poses for the LMTravel Technique
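The LMTravel hand-pose rules can be sketched as a small state machine. The finger-count rules follow the description above; the speed units and the choice of the left hand as the speed-controlling hand (as in the gaze-directed variant) are assumptions of this sketch.

```python
# Illustrative state machine for an LMTravel-style mapping. Hand data is
# assumed to come from a Leap Motion-like tracker as per-hand counts of
# stretched fingers (0 = fist, 5 = fully open). Opening both hands starts
# movement, closing both stops it, and the stretched-finger count of the
# left hand sets the speed level. SPEED_PER_FINGER is an assumption.
SPEED_PER_FINGER = 0.5  # m/s per stretched finger

class LMTravelState:
    def __init__(self):
        self.moving = False
        self.speed = 0.0

    def update(self, left_fingers, right_fingers):
        """Feed per-hand finger counts; returns (moving, speed m/s)."""
        if left_fingers == 5 and right_fingers == 5 and not self.moving:
            self.moving = True            # both hands open: start moving
        elif left_fingers == 0 and right_fingers == 0:
            self.moving = False           # both hands closed: stop
        if self.moving:
            # At least one finger is assumed stretched while moving.
            self.speed = max(left_fingers, 1) * SPEED_PER_FINGER
        else:
            self.speed = 0.0
        return self.moving, self.speed

state = LMTravelState()
m1, s1 = state.update(5, 5)  # open both hands: start at full speed
m2, s2 = state.update(2, 5)  # two fingers on the speed hand: slower
m3, s3 = state.update(0, 0)  # close both hands: stop
```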


Leaning input enables omnidirectional navigation but typically relies on bulky controllers, which are not feasible in mobile VR contexts. Tregillus et al. 2017 proposed the use of head tilt, implemented using inertial sensing, to allow hands-free omnidirectional VR navigation on mobile VR platforms. As a result of bipedalism, humans lean their body in the direction they walk, to stay aligned with the gravitational vertical; hence gaze-based navigation is augmented with head-tilt input to enable hands-free omnidirectional navigation.

Figure 16 Head Tilt is used to indicate the direction of travel
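One possible reading of the head-tilt mapping is sketched below: tilt away from the gravitational vertical, decomposed into pitch and roll, picks the travel direction on the ground plane, while tilt magnitude beyond a deadzone sets the speed. The constants and the exact mapping are assumptions for illustration, not the published implementation.

```python
import math

# Illustrative head-tilt steering sketch: the head's deviation from the
# gravity vertical (from inertial sensing) chooses an omnidirectional
# travel direction, and its magnitude beyond a deadzone sets speed.
# DEADZONE_DEG and SPEED_GAIN are illustrative assumptions.
DEADZONE_DEG = 5.0
SPEED_GAIN = 0.1  # m/s per degree of tilt beyond the deadzone

def head_tilt_velocity(pitch_deg, roll_deg):
    """pitch: forward(+)/backward(-); roll: right(+)/left(-).
    Returns (vx_right, vz_forward) in m/s on the ground plane."""
    magnitude = math.hypot(pitch_deg, roll_deg)
    if magnitude < DEADZONE_DEG:
        return (0.0, 0.0)  # small involuntary head motion is ignored
    speed = (magnitude - DEADZONE_DEG) * SPEED_GAIN
    # Scale the unit tilt direction by the speed.
    return (speed * roll_deg / magnitude, speed * pitch_deg / magnitude)
```

The deadzone is essential here: without it, the small head movements used for looking around would constantly trigger unintended travel.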

Ferracani et al. 2016 proposed two gestures for navigation: tap and push. The tap gesture is common in the real world, where people use the index finger to indicate a walking direction, as shown in figure 17(a). The push gesture consists of closing and opening the hand while translating it forward relative to the user's elbow; in the real world, it is the typical gesture for controlling locomotion machines by moving a lever, as shown in figure 17(b). The perceived naturalness and effectiveness of the locomotion methods were assessed through qualitative and quantitative measures. The results suggested that the tap method was the fastest and was less prone to collisions in almost every task.


Figure 17 User performing the (a) Tap and the (b) push gesture for locomotion

Caggianese et al. 2015 proposed the head-direction steering technique, in which an interface is presented to the user as shown in figure 18(a); interacting with that control, the user can travel only in the head direction. In contrast, their second technique, four-head-direction steering, relaxes this constraint by introducing three more visual widgets (figure 18(b)) and, consequently, the same number of additional travel directions. In this case, the user can move in the four cardinal directions relative to the viewpoint orientation. Therefore, although both techniques exploit head direction to choose the travel direction, they differ in terms of travel constraints.



Figure 18 (a) Head Direction Steering Technique (b) Four Head Direction Steering Technique

Zhang et al. 2017 proposed a double-hand gesture interaction technique. The main objective of the study was to design simple, comfortable, low-fatigue double-hand gestures to control an avatar's movement (first-person view), allowing the user to become fully immersed in the VR environment. The left hand was used for moving and the right hand for turning. Figure 19(a) shows the left-hand gestures for straight walking/running: (a) walking forward, (b) running forward, (c) walking backward, (d) running backward, and (e) stopping. Figure 19(b) shows the right-hand gestures for turning: (a) turning left, (b) turning right, (c) stopping a turn, and (d) the combined gestures for turning left while walking forward.

Figure 19(a) Hand gestures in left hand

Figure 19(b) Hand gestures in right hand

Bozgeyikli 2016 proposed the point-and-teleport technique, in which users point to wherever they want to be in the virtual world and the virtual viewpoint is teleported to that position. In this design, to trigger the teleportation, users must point at the same place or its close vicinity for two seconds; the teleportation is then triggered and the virtual avatar instantaneously moves to that position. There are two variants of this method: (a) point and teleport, and (b) point and teleport with direction specification, as shown in figure 20.

Figure 20 (a) Point and Teleport (b) Point and Teleport with direction
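The two-second dwell trigger described above can be sketched as follows. The dwell time follows the technique's description, while the "close vicinity" radius is an assumption of this sketch.

```python
# Hypothetical dwell-trigger sketch for point-and-teleport: the teleport
# fires only after the pointed target stays within a small radius for two
# seconds. VICINITY is an assumed radius for "close vicinity".
DWELL_TIME = 2.0  # seconds, per the technique's description
VICINITY = 0.5    # metres (assumption)

class DwellTeleport:
    def __init__(self):
        self.anchor = None        # first pointed spot of the current dwell
        self.anchor_time = None
        self.position = (0.0, 0.0)

    def update(self, target, t):
        """Feed the pointed ground position each frame; True when teleporting."""
        if self.anchor is None or self._dist(target, self.anchor) > VICINITY:
            self.anchor, self.anchor_time = target, t  # restart the dwell
            return False
        if t - self.anchor_time >= DWELL_TIME:
            self.position = target  # instantaneously move the viewpoint
            self.anchor = None
            return True
        return False

    @staticmethod
    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

tp = DwellTeleport()
# Point steadily at (3, 4) for 2.4 simulated seconds at 10 Hz.
fired = [tp.update((3.0, 4.0), t / 10) for t in range(0, 25)]
```

The dwell requirement is what separates deliberate teleport targets from transient pointing while the user looks around.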


Tomberlin et al. 2017 proposed Gauntlet, a travel technique for immersive environments that uses non-dominant-hand tracking and a fist gesture to translate and rotate the viewport, as shown in figure 21. The technique allows simultaneous use of the dominant hand for other spatial input tasks. Applications of Gauntlet include FPS games and other domains where navigation must be performed together with other tasks.

Figure 21 Travel task performed using the non dominant hand

5.2.1. Research Gap in Sitting Based Methods of Travel

The above literature indicates the extensive use of body gestures, both controller-based and controller-less, for locomotion in VEs. This includes finger-based gestures such as TriggerWalking, full-hand gestures as in PinchMove, head gestures such as Shake-Your-Head and head tilt, and bare-hand gestures using the Leap Motion. Most of the work discussed was designed to perform only the locomotion task, measuring some of the quality factors discussed in section 3.4 by comparison against artificial locomotion techniques. However, people commonly perform various tasks while changing the position and rotation of their head, that is, their viewport, and in many VR applications simultaneous object manipulation and navigation is an essential requirement. Most of the methods do not accommodate gestures for other functionalities in VR. Hence there is a clear need to study how these gestures perform in realistic, multitask scenarios.

Although the use of body gestures increases the naturalness of interaction, it may be tiresome if the gestures are of high intensity and used for a long duration. Beyond physical tiredness, there is also a cognitive cost: a gesture may be hard to associate with its command, requiring users to keep recalling what to do instead of simply doing it. These problems should also be taken into account when proposing locomotion gestures. Moreover, each of the techniques discussed above uses a different VE, of a different size and for a different duration. Some methods use a straight trajectory for navigation, whereas others use a trajectory with bends and turns. The presence or absence of landmarks in the environment also plays an important role in measuring the spatial orientation afforded by a technique. To the best of our knowledge, none of the methods has been sufficiently explored and tested across different VE sizes, path complexities, and landmark conditions, even though these factors play a major role in the performance of a locomotion technique.

This research aims to address this gap and study the use of gestures for locomotion on HMD-based VR interfaces, chosen for their extensive application areas.


6. Research Questions

1. What body gestures should be designed for effective locomotion in HMD-VR in a sitting position, in a multitasking environment involving locomotion and selection?

2. What body gestures should be designed for effective locomotion in HMD-VR in a sitting position, in a multitasking environment involving locomotion and manipulation?

3. What is the effect of virtual environment size, path complexity, and the presence or absence of landmarks on the designed locomotion technique for the above two conditions?

7. Methodology to Design Experiments for the Presented Research Questions

The aim of this research is to find body gestures that give the user a way of turning and moving the virtual camera for movement in the virtual world. Moreover, the user must be able to move long distances in the VE without trespassing real-world boundaries and without becoming fatigued. We aim to find gestures that make interaction natural and user-friendly for navigation in small to large VEs. Since the locomotion task is accompanied by a manipulation task in realistic scenarios, we would like to explore gestures that are natural for travel in a sitting position and that can work simultaneously with the well-defined manipulation gestures available in the literature.

We will perform a first study to explore candidate gestures using a gesture elicitation study, and select gestures based on frequency, appropriateness to the sitting position, and other ergonomic considerations. Based on these gestures, a new locomotion method will be developed, and its performance will be evaluated in a multitasking environment (locomotion and selection; locomotion and manipulation) by comparison with the most commonly used locomotion methods. Based on the results and user feedback, we will refine the technique if some of the observed parameters require improvement.

This new locomotion technique will then be tested against an established locomotion method across VEs of different sizes, different path complexities, and the presence and absence of landmarks. The experiments will measure task completion time, accuracy, cognitive load, ease of use, ease of learning, presence, and perceived naturalness. Finally, based on these results, a series of design guidelines or a design framework will be proposed as the core contribution of this thesis. The guidelines or framework will enable practitioners and researchers to use the new locomotion technique in a sitting position and in a multitasking environment, taking into account the size of the VE, the path complexity, and the presence or absence of landmarks for different groups of users.


8. References 1. Burdea, G. C., & Coiffet, P. (2003). Virtual reality technology. John Wiley & Sons. 2. Isdale, J. (1998). What is virtual reality. Virtual Reality Information Resources http://www. isx.

com/~ jisdale/WhatIsVr. html, 4.

3. Mendes, D. F. M. T. (2016). Manipulation of 3D Objects in Immersive Virtual Environments. Future,

2017(36), 37.

4. Bowman, D. A., Koller, D., & Hodges, L. F. (1998). A methodology for the evaluation of travel

techniques for immersive virtual environments. Virtual reality, 3(2), 120-131.

5. Bowman, D. A., Rhoton, C. J., & Pinho, M. S. (2002, September). Text input techniques for

immersive virtual environments: An empirical comparison. In Proceedings of the Human Factors

and Ergonomics Society Annual Meeting (Vol. 46, No. 26, pp. 2154-2158). Sage CA: Los Angeles, CA:

SAGE Publications.

6. Lawton, C. A. (1994). Gender differences in way-finding strategies: Relationship to spatial ability

and spatial anxiety. Sex roles, 30(11-12), 765-779.

7. Chase, W. G. Visual information processing, in Handbook of perception and human performance,

Vol II: Cognitive processes and performance, K.R. Boff, L. Kaufman, & J.P. Thomas (Eds.), John Wiley

and Sons, 1986, pp. 28-l to 28-71.

8. Vinson, N. G. (1999, May). Design guidelines for landmarks to support navigation in virtual

environments. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems

(pp. 278-285). ACM.

9. Schultheis, M. T., & Rizzo, A. A. (2001). The application of virtual reality technology in rehabilitation.

Rehabilitation psychology, 46(3), 296.

10. Rizzo, A.A. & Kim, G. (2005). A SWOT analysis of the field of Virtual Rehabilitation and Therapy.

Presence: Teleoperators and Virtual Environments. 14(2): 1-28.

11. Pair, J., Allen, B., Dautricourt, M., Treskunov, A., Liewer, M., Graap, K., & Reger, G. (2006, March). A

virtual reality exposure therapy application for Iraq war post traumatic stress disorder. In Virtual

Reality Conference, 2006 (pp. 67-72). IEEE.

12. Stone, R. (2001). Virtual reality for interactive training: an industrial practitioner's viewpoint.

International Journal of Human-Computer Studies, 55(4), 699-711.

13. Niels C. Nilsson, Stefania Serafin, and Rolf Nordahl. 2016b. Walking in place through virtual worlds.

In International Conference on Human-Computer Interaction (HCII’16). Springer, Cham, Toronto,

Canada, 37–48.

14. Doug A. Bowman, Ernst Kruijff, Joseph J. LaViola Jr., and Ivan Poupyrev. 2004. 3D User Interfaces:

Theory and Practice. Addison-Wesley Professional, Redwood City, CA.

15. Mel Slater and Martin Usoh. 1994. Body centred interaction in immersive virtual environments.

Artificial Life and Virtual Reality 1 (1994), 125–148.

16. Evan A. Suma, Gerd Bruder, Frank Steinicke, David M. Krum, and Mark Bolas. 2012. A taxonomy for

deploying redirection techniques in immersive virtual environments. In 2012 IEEE Virtual Reality

Short Papers and Posters (VRW’12). IEEE, Costa Mesa, CA, 43–46.

17. Jeremy D. Wendt. 2010. Real-Walking Models Improve Walking-in-Place Systems. Ph.D.

Dissertation. University of North Carolina at Chapel Hill.

Page 24: A Study of Body Gestures Based Input Interaction for ...embeddedinteractions.com/Files/SOAS_Report_Priya_Vinod.pdf · Report: State of the Art Seminar A Study of Body Gestures Based

24

18. Doug A. Bowman, Elizabeth T. Davis, Larry F. Hodges, and Albert N. Badre. 1999. Maintaining

spatial orientation during travel in an immersive virtual environment. Presence: Teleoperators and

Virtual Environments 8, 6 (1999), 618–631.

19. Jeremy D.Wendt, Mary C. Whitton, and Frederick P. Brooks. 2010. GUDWIP: Gait-understanding-

driven walking-in-place. In Proceedings of the 2010 IEEE Virtual Reality (VR’10). IEEE, 51–58.

20. Joseph J LaViola Jr, Daniel Acevedo Feliz, Daniel F Keefe, and Robert C Zeleznik. 2001. Hands-free multi-scale navigation in virtual environments. In Proceedings of the 2001 symposium on Interactive 3D graphics. ACM, 9–15.

21. Noah Nelson. 2015. NPR: All Tech Considered. Virtual Reality’s Next Hurdle: Overcoming ’Sim Sickness’ http://www.npr.org/sections/alltechconsidered/2014/08/05/ 338015854/virtual-realitys-next-hurdle-overcoming-sim-sickness. (August 2015).

22. Frederick Bonato, Andrea Bubka, and Stephen Palmisano. 2009. Combined pitch and roll and cybersickness in a virtual environment. Aviation, space, and environmental medicine 80, 11 (2009), 941–945.

23. Leah R Bent, J Timothy Inglis, and Bradford J McFadyen. 2004. When is vestibular information important during walking? J. of neurophysiology 92, 3 (2004), 1269–1275.

24. Doug A Bowman, Ernst Kruijff, Joseph J LaViola Jr, and Ivan Poupyrev. 2004. 3D user interfaces: theory & practice. Addison-Wesley.

25. Sam Tregillus and Eelke Folmer. 2016. VR-STEP: Walking-in-Place using Inertial Sensing for Hands Free Navigation in Mobile VR Environments. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, 1250–1255.

26. Martin Usoh, Kevin Arthur, Mary C Whitton, Rui Bastos, Anthony Steed, Mel Slater, and Frederick P Brooks Jr. 1999. Walking> walking-in-place> flying, in virtual environments. In Proc. of the 26th conference on Computer graphics and interactive techniques. 359–364.

27. Jeff Feasel, Mary C. Whitton, Laura Kassler, Frederick P. Brooks, and Michael D. Lewek. 2011. The integrated virtual environment rehabilitation treadmill system. IEEE Transactions on Neural Systems and Rehabilitation Engineering 19, 3(2011), 290–297.

28. Hiroo Iwata, Hiroaki Yano, Hiroyuki Fukushima, and Haruo Noma. 2005. CirculaFloor. IEEE Computer Graphics and Applications 25, 1 (2005), 64–67.

29. Hiroo Iwata, Hiroaki Yano, and Masaki Tomiyoshi. 2007. String walker. In Proceedings of SIGGRAPH 2007. ACM, 20.

30. Razzaque, S., Kohn, Z., & Whitton, M. C. (2001, September). Redirected walking. In Proceedings of EUROGRAPHICS (Vol. 9, pp. 105-106).

31. Evan A. Suma, Seth Clark, Samantha Finkelstein, Zachary Warte, David Krum, and M. Bolas. 2011a. Leveraging change blindness for redirection in virtual environments. In 2011 IEEE Virtual Reality Conference (VR’11). IEEE, 159–166.

32. Margaret W. Matlin. 2009. Cognition. Seventh Edition. New York: John Wiley & Sons, Inc., Hoboken, NJ.

33. Khrystyna Vasylevska and Hannes Kaufmann. 2017b. Towards efficient spatial compression in self-overlapping virtual environments. In 2017 IEEE Symposium on 3D User Interfaces (3DUI’17). IEEE, 12–21.

34. Laroussi Bouguila, Evequoz Florian, Michele Courant, and Beat Hirsbrunner. 2005. Active walking interface for humanscale virtual environment. In 11th International Conference on Human-Computer Interaction (HCII’05), Vol. 5. ACM, 22–27.

35. Laroussi Bouguila, Masaru Iwashita, Beat Hirsbrunner, and Makoto Sato. 2003. Virtual locomotion interface with ground surface simulation. In Proceedings of the International Conference on Artificial Reality and Telexistence (ICAT’03).

Page 25: A Study of Body Gestures Based Input Interaction for ...embeddedinteractions.com/Files/SOAS_Report_Priya_Vinod.pdf · Report: State of the Art Seminar A Study of Body Gestures Based

25

36. Luis Bruno, Joao Pereira, and Joaquim Jorge. 2013. A new approach to walking in place. In Human-Computer Interaction (INTERACT’13). 370–387

37. Jeff Feasel, Mary C. Whitton, and Jeremy D. Wendt. 2008. LLCM-WIP: Low-latency, continuous-motion walking-in-place. In Proceedings of the 2008 IEEE Symposium on 3D User Interfaces (3DUI’08). IEEE, 97–104.

38. Mel Slater, Anthony Steed, and Martin Usoh. 1993. The virtual treadmill: A naturalistic metaphor for navigation in immersive virtual environments. In First Eurographics Workshop on Virtual Reality, M. Goebel (Ed.). Springer, Vienna, Barcelona, Spain, 71–86.

39. Nilsson, N. C., Serafin, S., Laursen, M. H., Pedersen, K. S., Sikstrom, E., & Nordahl, R. (2013, March). Tapping-in-place: Increasing the naturalness of immersive walking-in-place locomotion through novel gestural input. In 2013 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 31-38). IEEE.

40. Nilsson, N. C., Serafin, S., & Nordahl, R. (2013, November). The perceived naturalness of virtual locomotion methods devoid of explicit leg movements. In Proceedings of Motion on Games (pp. 155-164). ACM.

41. Williams, B., McCaleb, M., Strachan, C., & Zheng, Y. (2013, August). Torso versus gaze direction to navigate a VE by walking in place. In Proceedings of the ACM Symposium on Applied Perception (pp. 67-70). ACM.

42. Harris, A., Nguyen, K., Wilson, P. T., Jackoski, M., & Williams, B. (2014, November). Human joystick: Wii-leaning to translate in large virtual environments. In Proceedings of the 13th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry (pp. 231-234). ACM.

43. McCullough, M., Xu, H., Michelson, J., Jackoski, M., Pease, W., Cobb, W., ... & Williams, B. (2015, September). Myo arm: swinging to explore a VE. In Proceedings of the ACM SIGGRAPH Symposium on Applied Perception (pp. 107-113). ACM.

44. Wilson, P. T., Kalescky, W., MacLaughlin, A., & Williams, B. (2016, December). VR locomotion: walking > walking in place > arm swinging. In Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry (Vol. 1, pp. 243-249). ACM.

45. Pai, Y. S., & Kunze, K. (2017, November). Armswing: using arm swings for accessible and immersive navigation in AR/VR spaces. In Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia (pp. 189-198). ACM.

46. Minetti, A. E., Boldrini, L., Brusamolin, L., Zamparo, P., & McKee, T. (2003, August). A feedback-controlled treadmill (treadmill-on-demand) and the spontaneous speed of walking and running in humans. Journal of Applied Physiology, 95(2), 838-843.

47. Lichtenstein, L., Barabas, J., Woods, R. L., & Peli, E. (2007, January). A feedback-controlled interface for treadmill locomotion in virtual environments. ACM Transactions on Applied Perception, 4(1).

48. Riecke, B. E. (2006). Simple user-generated motion cueing can enhance self-motion perception (vection) in virtual reality. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (pp. 104-107). Limassol, Cyprus: ACM.

49. Beckhaus, S., Blom, K. J., & Haringer, M. (2005). A new gaming device and interaction method for a First-Person-Shooter. Proc. Comput. Sci. Magic, 2005.

50. Beckhaus, S., Blom, K. J., & Haringer, M. (2005). Intuitive, hands-free travel interfaces for virtual environments. In New Directions in 3D User Interfaces Workshop of IEEE VR 2005 (pp. 57-60).

51. Kitson, A., Riecke, B. E., Hashemian, A. M., & Neustaedter, C. (2015). NaviChair: Evaluating an embodied interface using a pointing task to navigate virtual reality. In Proceedings of the 3rd ACM Symposium on Spatial User Interaction (pp. 123-126). New York, NY: ACM.

52. Kitson, A., Hashemian, A. M., Stepanova, E. R., Kruijff, E., & Riecke, B. E. (2017, March). Comparing leaning-based motion cueing interfaces for virtual reality locomotion. In 2017 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 73-82). IEEE.

53. Pai, Y. S., Chen, Z., Chan, L., Isogai, M., Kimata, H., & Kunze, K. (2018, September). Pinchmove: improved accuracy of user mobility for near-field navigation in virtual environments. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services (p. 7). ACM.

54. Sarupuri, B., Hoermann, S., Steinicke, F., & Lindeman, R. W. (2017, October). Triggerwalking: a biomechanically-inspired locomotion user interface for efficient realistic virtual walking. In Proceedings of the 5th Symposium on Spatial User Interaction (pp. 138-147). ACM.

55. Cardoso, J. (2016, November). Comparison of gesture, gamepad, and gaze-based locomotion for VR worlds. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology (pp. 319-320). ACM.

56. Tregillus, S., Al Zayer, M., & Folmer, E. (2017, May). Handsfree omnidirectional VR navigation using head tilt. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 4063-4068). ACM.

57. Ferracani, A., Pezzatini, D., Bianchini, J., Biscini, G., & Del Bimbo, A. (2016, October). Locomotion by natural gestures for immersive virtual environments. In Proceedings of the 1st International Workshop on Multimedia Alternate Realities (pp. 21-24). ACM.

58. Caggianese, G., Gallo, L., & Neroni, P. (2015, August). Design and preliminary evaluation of free-hand travel techniques for wearable immersive virtual reality systems with egocentric sensing. In International Conference on Augmented and Virtual Reality (pp. 399-408). Springer, Cham.

59. Bozgeyikli, E., Raij, A., Katkoori, S., & Dubey, R. (2016, October). Point & teleport locomotion technique for virtual reality. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play (pp. 205-216). ACM.

60. Zhang, F., Chu, S., Pan, R., Ji, N., & Xi, L. (2017, May). Double hand-gesture interaction for walk-through in VR environment. In 2017 IEEE/ACIS 16th International Conference on Computer and Information Science (ICIS) (pp. 539-544). IEEE.