


VR-SMART - A VIRTUAL REALITY SYSTEM FOR SMART

HOMES

By

VINAY MISHRA

A thesis submitted in partial fulfillment of the requirements for the degree of

MASTER OF SCIENCE IN MECHANICAL ENGINEERING

WASHINGTON STATE UNIVERSITY

School of Mechanical and Materials Engineering

August 2010



To the Faculty of Washington State University:

The members of the Committee appointed to examine the thesis of VINAY

MISHRA find it satisfactory and recommend that it be accepted.

____________________________

Sankar Jayaram, Ph.D., Chair

____________________________

Uma Jayaram, Ph.D.

____________________________

Diane J. Cook, Ph.D.



ACKNOWLEDGMENT

I would like to thank my advisor Dr. Sankar Jayaram for his valuable guidance

and support during this thesis work. I would also like to express my gratitude to Dr. Uma

Jayaram for her constant advice and to Dr. Diane J. Cook for serving on my committee. I

am especially thankful to OkJoon Kim from whom I have learnt so much during my stay

in the VRCIM lab. I appreciate the support and help from the other members of the lab: Lijuan,

Nathan, Matthew, James and Smruti.

I am thankful to my wife Ekta for her endless love, encouragement and support. I

am grateful to my parents for their trust and confidence in me. I would like to thank all of

my friends in India and Pullman.



VR-SMART - A VIRTUAL REALITY SYSTEM FOR SMART

HOMES

Abstract

By Vinay Mishra, M.S. Washington State University

August 2010

Chair: Sankar Jayaram

This thesis work presents a novel approach to assist the inhabitants of a home

using virtual reality technology. The need for this work arises from the growing elderly population in the U.S. and the physical disabilities that accompany old age. It is

estimated that 12.4% of the population in the U.S. is 65 years of age or older, and by 2030

almost 1 in 5 persons will be 65 or older (1). Further, 28.6% of persons of age 65 years

and older have a physical disability (2). This thesis focuses on integrating CAD and

virtual reality technologies to assist people with disabilities in participating in the design

of a smart home, training for living in the smart home and using these technologies to

assist with the actual living in the smart home.

We have developed an integrated CAD and virtual reality system which can be

used in the design, training, and actual use phases of a smart home. Using the work

presented in this thesis, a room modeled in a CAD system can be imported into a virtual

environment (also created as part of this work) to give the inhabitant an immersive experience. The HMD (on the user's head) and the room objects are fitted with sensors that are part of a six-DOF tracking system. Methods have been created to allow the inhabitant to move objects around in the room and later issue an audio query for the location of an object. The system generates an audio response giving the object's position relative to the person's current position and orientation.

The conceptual room is divided into 12 polar segments, and directional guidance is provided using these segments. Testing showed that although the overall system shows promise, its success depends on the accuracy and calibration of the tracking system. Testing also showed that, deployed simply as an assistive system without the display component, this system could significantly help people with disabilities find objects in the room.
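As a rough illustration of the segment-based guidance described above, the following sketch maps an object's position, relative to the person's position and heading, onto one of 12 clock-face polar segments. The coordinate conventions, segment numbering, and function names are assumptions for illustration only, not the code developed in this work.

```cpp
#include <cmath>

// Illustrative sketch (not the thesis implementation): the space around the
// person is divided into 12 polar segments of 30 degrees each, numbered
// like a clock face: 12 = straight ahead, 3 = right, 6 = behind, 9 = left.
struct Pose { double x, y, headingDeg; };  // heading measured clockwise from +y

const double kPi = 3.14159265358979323846;

int polarSegment(const Pose& person, double objX, double objY) {
    // World bearing of the object, measured clockwise from the +y axis
    // so that it uses the same convention as headingDeg.
    double worldBearing =
        std::atan2(objX - person.x, objY - person.y) * 180.0 / kPi;
    // Bearing relative to the person's facing direction, normalized to [0, 360).
    double rel = std::fmod(worldBearing - person.headingDeg + 720.0, 360.0);
    // Segment 12 is centered on 0 degrees (straight ahead); each segment
    // spans 30 degrees, so shift by half a segment before dividing.
    int segment = static_cast<int>(std::floor((rel + 15.0) / 30.0)) % 12;
    return segment == 0 ? 12 : segment;
}
```

For example, with the person at the origin facing the +y direction, an object directly ahead falls in segment 12 and an object to the person's right falls in segment 3; spoken guidance can then reference these clock positions.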



TABLE OF CONTENTS

ACKNOWLEDGMENT ............................................................................................ III

ABSTRACT ................................................................................................................ IV

TABLE OF CONTENTS ........................................................................................... VI

LIST OF FIGURES.................................................................................................... IX

LIST OF TABLES ....................................................................................................... X

CHAPTER ONE ............................................................................................................ 1

INTRODUCTION ....................................................................................................... 1

CHAPTER TWO........................................................................................................... 4

BACKGROUND AND LITERATURE REVIEW ........................................................ 4

2.1 SMART ENVIRONMENT ........................................................................................ 4

2.2 VIRTUAL REALITY SYSTEMS FOR SMART ENVIRONMENTS .................................... 6

2.3 MOTION CAPTURE SYSTEMS: AN OVERVIEW........................................................ 8

2.4 SMART HOME PROJECT AT WASHINGTON STATE UNIVERSITY ............................ 10

CHAPTER THREE .................................................................................................... 11

PROBLEM STATEMENT AND PROPOSED SOLUTION ....................................... 11

3.1 PROBLEM STATEMENT ....................................................................................... 11

3.2 PROPOSED SOLUTION ........................................................................................ 12

3.2.1 Design, Evaluation, and Training ................................................................ 12

3.2.2 Integration with the Use of the Smart Home ................................................. 13

3.2.3 Scope of Work .............................................................................................. 13

CHAPTER FOUR ....................................................................................................... 16

SYSTEM REQUIREMENT ANALYSIS ................................................................... 16

4.1 FUNCTIONAL REQUIREMENTS OF MOTION CAPTURE SYSTEM COMPONENT ......... 16

4.2 FUNCTIONAL REQUIREMENTS OF AUDIO QUERY/RESPONSE SYSTEM COMPONENT ......... 16

4.3 FUNCTIONAL REQUIREMENTS OF VIRTUAL REALITY SYSTEM ............................. 17

CHAPTER FIVE ......................................................................................................... 18

DESIGN OF SYSTEM ARCHITECTURE ................................................................. 18

5.1 VIRTUAL REALITY (VR) SYSTEM ....................................................................... 19

5.1.1 CAD System ................................................................................................. 19

5.1.2 CAD data extraction .................................................................................... 20

5.1.3 OGRE Graphics Engine ............................................................................... 21

5.2 MOTION CAPTURE SYSTEM ................................................................................ 22

5.3 AUDIO QUERY/RESPONSE SYSTEM ..................................................................... 23

CHAPTER SIX............................................................................................................ 25

OBJECT ORIENTED DESIGN AND ANALYSIS ..................................................... 25

6.1 INTRODUCTION .................................................................................................. 25



6.2 CLASS DESIGN OF ROOM SCENE ......................................................................... 25

6.2.1 Objectives and Responsibilities .................................................................... 26

6.2.2 Implementation of Classes ........................................................................... 27

6.2.3 Sequence diagrams for room objects ............................................................ 30

6.3 CLASS DESIGN OF OGRE APPLICATION .............................................................. 32

6.3.1 Objectives and Responsibilities .................................................................... 33

6.3.2 Implementation of Classes ........................................................................... 34

6.4 IMPLEMENTATION OF AUDIO QUERY/RESPONSE SYSTEM ..................................... 38

CHAPTER SEVEN ..................................................................................................... 40

IMPLEMENTATION .................................................................................................. 40

7.1 INTRODUCTION .................................................................................................. 40

7.2 IMPLEMENTATION OF THE CAD ROOM SCENE .................................................... 40

7.3 IMPLEMENTATION OF FLOCK-OF-BIRDS SYSTEM: ................................................ 41

7.3.1 Birds System ................................................................................................ 42

7.3.2 Head Mounted Device ................................................................................. 44

7.4 IMPLEMENTATION OF THE VIRTUAL ROOM SCENE .............................................. 45

CHAPTER EIGHT ..................................................................................................... 48

SYSTEM RESPONSE TEST STUDY ........................................................................ 48

8.1 INTRODUCTION .................................................................................................. 48

8.2 CATEGORIZATION OF ROOM OBJECTS ................................................................ 48

8.3 DEVELOPMENT OF TEST CASES .......................................................................... 49

8.4 RESPONSES OF THE TEST SUBJECTS ..................................................................... 51

8.5 SYSTEM RESPONSE DEVELOPMENT ..................................................................... 52

CHAPTER NINE ........................................................................................................ 55

SYSTEM TESTING................................................................................................... 55

9.1 INTRODUCTION .................................................................................................. 55

9.2 TEST SCENARIO DEVELOPMENT ......................................................................... 55

9.3 TEST SCENARIO 1 .............................................................................................. 56

9.4 TEST SCENARIO 2 .............................................................................................. 57

9.5 TEST SCENARIO 3 .............................................................................................. 58

9.6 TEST SCENARIO 4 .............................................................................................. 58

9.7 TEST SCENARIO 5 .............................................................................................. 59

9.8 TEST SCENARIO 6 .............................................................................................. 59

9.9 TEST SCENARIO 7 .............................................................................................. 61

9.10 TEST SCENARIO 8 .............................................................................................. 61

9.11 TEST SCENARIO 9 .............................................................................................. 61

9.12 TEST SCENARIO 10 ............................................................................................ 62

9.13 TEST RESULTS ................................................................................................... 63

CHAPTER TEN .......................................................................................................... 66

CONCLUSION .......................................................................................................... 66

10.1 INTRODUCTION .................................................................................................. 66

10.2 RESULT ANALYSIS ............................................................................................ 66

10.3 FUTURE WORK .................................................................................................. 67



REFERENCES ............................................................................................................ 69

APPENDIX A .............................................................................................................. 74



LIST OF FIGURES

Figure 1: ICF model as applied to aging with disability ..................................................... 7
Figure 2: VR-Smart System Components ........................................................................ 18
Figure 3: Functional Architecture of VR-Smart System .................................................. 19
Figure 4: Assembly hierarchy of CAD model .................................................................. 20
Figure 5: Virtual Room Scene Data Flow ........................................................................ 22
Figure 6: Flock-of-Birds architecture ............................................................................... 23
Figure 7: SAPI architecture .............................................................................................. 24
Figure 8: Class architecture of Room objects ................................................................... 25
Figure 9: Object Types ..................................................................................................... 26
Figure 10: Mediator design pattern structure ................................................................... 29
Figure 11: Sequence Diagram for "Find the Chair" ......................................................... 30
Figure 12: Sequence Diagram for "Find the Key" ............................................................ 31
Figure 13: Sequence Diagram for External System Interface .......................................... 32
Figure 14: Class Architecture of OGRE application ........................................................ 33
Figure 15: Pro/Engineer Room Scene 1 ........................................................................... 40
Figure 16: Prototype room scene in Pro/Engineer ........................................................... 41
Figure 17: Flock of Birds System ..................................................................................... 42
Figure 18: Room prototype ............................................................................................... 43
Figure 19: A box connected with sensor .......................................................................... 43
Figure 20: Head Mounted Device ..................................................................................... 44
Figure 21: Virtual Room Scene ........................................................................................ 47
Figure 22: Room Partitioning for Test Study ................................................................... 50
Figure 23: Initial Orientation of the Person in Room ....................................................... 56
Figure 24: Person's Orientation after Test Scenario 1 ...................................................... 57
Figure 25: Person's orientation after test scenario 4 ......................................................... 59
Figure 26: Person's orientation after test scenario 6 ......................................................... 60
Figure 27: Person's orientation for test scenario 9 ........................................................... 62
Figure 28: Person's orientation for test scenario 10 ......................................................... 63
Figure 29: Pie Chart for Test Scenario Success in Direction Data ................................... 64
Figure 30: Pie Chart for Test Scenario Success in Distance Data .................................... 65
Figure 31: Proportion of Successful Test Scenarios ......................................................... 65



LIST OF TABLES

Table 1: Comparison of Image Renderers ........................................................................ 21
Table 2: Data Structure for Position and Orientation ....................................................... 28
Table 3: Code snippet for stereo initialization .................................................................. 36
Table 4: Code Snippet for Camera movement .................................................................. 37
Table 5: Room Assembly Transformation Data ............................................................... 46
Table 6: Categorization of objects with respect to size .................................................... 48
Table 7: Room Object Properties ..................................................................................... 49
Table 8: Common Responses of Test Subjects ................................................................. 52
Table 9: System Response for a Scenario ......................................................................... 54
Table 10: Result Summary from Test Scenarios .............................................................. 64



CHAPTER ONE

INTRODUCTION

There has been a recent increase of activity in the field of smart environments. A smart

environment is able to acquire and apply knowledge about both itself and its inhabitants in order

to improve their experience in that environment (3). Researchers from various disciplines, e.g. artificial intelligence, pervasive computing, and sensor networks, are attempting to formulate innovative methods to assist residents' lives by acquiring and applying knowledge about residents and their physical surroundings.

As society and technology advance, there is a growing interest in adding intelligence to our

living and working environments. A major societal impact of smart environments is assisting

people with special health needs. To most people, home is a sanctuary, yet today those who need

special care often must leave home to meet medical needs. This problem spans all ages, but is

especially relevant for the quickly-growing elderly segment of the population, who are most

seriously affected by leaving a familiar environment. Nursing home costs are ~$40k/person/year

and family members give the equivalent of an additional $197 billion/year of their own care time

and resources (4). This does not always support the desire of the older adult: 9 out of 10

Americans over 60 want to live out their lives in familiar surroundings (5).

The number of individuals who live with cognitive or physical impairments is rising

significantly due to the aging of the population and better medical care. Smart environments can

aid in the diagnosis, treatment and management of disease, thereby improving quality-of-life for

cognitively and physically limited adults as well as reducing the emotional and financial burden

for caregivers and society.



One class of applications of smart environments provides care and a higher quality of life for the elderly and individuals with disabilities. These applications are developed to assist the inhabitants in leading an independent life with lower health care costs. An assistive application should meet the following goals (6):

Assurance: To ensure that individual is safe and performing routine activities.

Support: To help the individual compensate for impairment.

Assessment: To determine physical or cognitive status of the individual.

This thesis presents a novel approach to assist inhabitants in tracking regularly used objects and interacting with household appliances. Diseases related to memory loss, e.g. Alzheimer's, Parkinson's, and Huntington's, cause the elderly difficulty in performing familiar tasks.

This system provides a suitable and tested response to inhabitants who are attempting to locate

missing objects. The system can be integrated easily with the existing smart home environments

to utilize their technological capabilities.

This facility also lets inhabitants get the look and feel of smart home systems before they actually move into a real home. The person can visualize the virtual room using a head-mounted display, which provides a 3-D room scene. The audio query/response

system enables a person to issue a query for an object, and the system guides the person to the object. The motion capture system tracks the real-time location and orientation of the person and the objects. This thesis utilizes the Flock-of-Birds system for movement tracking. Based on the system response, the person orients and walks toward the target object.

We have conducted studies to generate a system response that is helpful to people of different ages and backgrounds. Thus, the virtual reality component makes the capabilities of the smart home environment available and customizes the environment to the needs of the inhabitant. The suggestions provided by the inhabitants will help improve existing and new assisted living facilities.
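As a hedged illustration of the audio query/response step described above, composing a spoken-style distance response from two tracked positions might look like the following. The structure names, the planar coordinates, and the foot-based phrasing are assumptions for illustration; the actual system builds its responses from Flock-of-Birds tracking data and an audio component.

```cpp
#include <cmath>
#include <string>

// Illustrative sketch only: build a spoken-style response giving the
// straight-line distance from the person to a queried object.
struct TrackedPoint { double x, y; };  // planar position, in feet (assumed)

std::string distanceResponse(const TrackedPoint& person,
                             const TrackedPoint& object) {
    double dx = object.x - person.x;
    double dy = object.y - person.y;
    // Round to the nearest foot for a natural spoken response.
    int feet = static_cast<int>(std::lround(std::hypot(dx, dy)));
    return "The object is about " + std::to_string(feet) + " feet away.";
}
```

In a full guidance loop this response would be recomputed as the tracker reports the person's new position, so the guidance stays current while the person walks toward the object.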

In the next chapter, we provide detailed background and review existing literature on smart home environments and on motion capture systems used in studies of human movement and behavior.



CHAPTER TWO

BACKGROUND AND LITERATURE REVIEW

This chapter provides background information on smart home environments, motion capture systems, and ongoing research related to assisted living facilities at Washington State University.

2.1 Smart Environment

A smart environment acquires and applies knowledge about the physical setting and its

residents, in order to improve their experience in that setting through human-computer

interaction. Computers that can automatically detect the inhabitant's behavior could provide

new context-aware services in the home. One such service that has motivated this work is

proactive care for the aging.

Medical professionals believe that one of the best ways to detect emerging medical

conditions before they become critical is to look for changes in the activities of daily living

(ADLs), instrumental ADLs (IADLs) (7), and enhanced ADLs (EADLs) (8). These activities

include eating, getting in and out of bed, using the toilet, bathing or showering, dressing, using

the telephone, shopping, preparing meals, housekeeping, doing laundry, and managing

medications. If it is possible to develop computational systems that recognize such activities,

researchers may be able to automatically detect changes in patterns of behavior of people at

home, indicating declines in health.

Home environments able to automatically monitor their occupants' activities can help extend independent, quality living and reduce healthcare costs (9) (10). Everyday activities in the home roughly break down into two categories: body movement and body-object interaction. Some activities require repetitive motion of the human body and are constrained, to a large extent, by the structure of the body. Examples are walking, running, scrubbing, and

exercising. These activities may be most easily recognized using sensors that are placed on the

body (11) (12). A second class of activities, however, may be more easily recognized not by

watching for patterns in how people move but instead by watching for patterns in how people

move things. For instance, the objects that someone touches or manipulates when performing

activities such as grooming, cooking, and socializing may exhibit more consistency than the way

the person moves the limbs.

Many applications have been developed to detect falls of the elderly (13) (14) by utilizing

acceleration sensors worn by users or cameras. Although some accelerometers collect fall data to

predict the user's personal fall risks for the purpose of fall prevention, there is no prevention

either against falls during the data collection period or against irregular falls afterwards. Some

wearable devices provide prompt protection such as an airbag or an overhead tether when

sensing a fall, but require the user to wear them all the time.

Medical researchers and practitioners have long sought the ability to continuously and

automatically monitor individuals suffering from diabetes or obesity. As an example, the prevalence of overweight and obesity has increased at an alarming rate: over 50%

of all adults in the US are overweight or obese. Excess body weight contributes to diabetes and

the resulting $132 billion annual cost to the US national economy. Researchers have performed

studies to identify predictors of physical activity in working women, to compare traits of

physically active and inactive individuals, and to identify barriers and facilitators for physical

activity counseling (15).



2.2 Virtual Reality Systems for Smart Environments

Virtual Reality (VR) has been generally defined as “a way for humans to visualize, manipulate, and interact with computers and extremely complex data” (16). More specifically, VR can be viewed as an advanced form of human-computer interface that allows the user to interact with and become immersed in computer-generated information in a naturalistic fashion (17).

The key benefit that VR introduces is providing the user a more naturalistic or real-life

environment. The experience of being immersed within a virtual environment allows users to

forget that they are in a testing situation. This may subsequently allow for assessment of

behaviors under more natural conditions and provide insight into an individual's typical behavior.

This contrasts with the presentation of more artifactually laden test-taking behaviors that may

influence results obtained in traditional testing environments (18).

Another advantage of VR is increased generalization of learning. As formulated in “identical elements” theory, better generalization of learning occurs with increased similarity between

training tasks and criterion targets (19). VR offers two key features for increased generalization.

First, because VR allows the creation of individualized environments, training of a task can be

conducted in a simulated version of the individual's home or vocational setting. Second,

practicing or training of a task within a virtual environment allows the user to receive feedback

immediately from his/her responses or behaviors while allowing the evaluator to assess the

user's incorporation of that feedback (20). VR technology offers the potential to develop human performance testing environments that could supplement traditional neuropsychological assessment procedures and offer improved reliability and validity, producing better detection, diagnosis, and mapping of assets and limitations that occur with different forms of central nervous system dysfunction. VR has been used within the areas of psychology and



cognitive neuroscience, because it allows precise stimulus and environmental control, enables

accurate performance and behavioral measurements to be recorded and allows researchers to

present environments and situations that would be impossible to present in traditional settings

(21), (22), (23).

The framework defined by the International Classification of Functioning, Disability and Health (ICF) categorizes three disablement domains, shown in Figure 1 (24). One way VR and gaming technologies can maximize function and participation for those aging with and into a disability is through the incorporation of outcome measures across these three domains.

Figure 1: ICF model as applied to aging with disability.

There is a compelling and ethical motivation to address the needs of individuals who are

aging with disabilities by promoting home-based access to low-cost, interactive VR systems

designed to engage and motivate individuals to participate with game driven physical activities

and rehabilitation programming. The creation of such systems can serve to enhance, maintain,


8

and rehabilitate the motor processes that underlie the integrated functional behaviors that are

needed to maximize independence and quality of life beyond what exists with currently

available, labor-intensive, underutilized, and more costly approaches (25).

The potential of VR game-based applications, as they pertain to those aging with and into disability, lies in the integration of cognitive and physical tasks within an engaging and meaningful environment in which it is safe to make mistakes (or in which, in some cases, the system prevents the user from making them). The operating premise is that the cognitive demand of daily functional behaviors increases with age-related physical decline. The approach is to enhance

physical function in meaningful activities by harnessing advances in science and technology

(especially immersive technologies) that increase sensorimotor capacity and unload cognitive

demands in those aging with and into disability. By taking advantage of VR technologies for

facilitating focused task-specific practice and gaming for enjoyment and adherence, it is

anticipated that the core processes at the body function level and activities level as represented in

the ICF model will be affected, which can enable active participation and enhance quality of life

for the intended beneficiaries.

2.3 Motion Capture Systems: An Overview

Motion capture (MoCap) refers to the process of capturing the motion of a human body or body part at some resolution. Potential applications of human motion capture are the

driving force of system development, and the major application areas are: smart surveillance,

identification, control, perceptual interface, character animation, virtual reality, view

interpolation, and motion analysis (26), (27). Motion capture is an important method for studies

in biomechanics and has traditionally been used for the diagnosis of the pathomechanics related

to musculoskeletal diseases (28), (29). Recently it has also been used in the development and

evaluation of rehabilitative treatments and preventive interventions for musculoskeletal diseases

(30).

The works of two contemporaries, Marey (1873) and Muybridge (1878), were among the

first to quantify patterns of human movement using photographic techniques (31), (32). Over the

past two decades, the field of registering human body motion using computer vision has grown

substantially, and a great variety of vision-based systems have been proposed for tracking human

motion. These systems vary in the number of cameras used, camera configuration, representation

of captured data, types of algorithms, use of various models, and the application to specific body

regions or the whole body. The different MoCap devices are either based on active sensing or

passive sensing.

Active sensing allows for simpler processing and is widely used when the applications are

situated in well-controlled environments. The sensors used in active sensing devices are:

mechanical sensors (33), accelerometers (34), electromagnetic sensors (35), acoustic sensors

(36), and optic fibers (37). The major problem with these devices is that the subject's movement is restricted by the devices mounted on the body.

Passive sensing is based on “natural” signal sources, e.g., visual light or other

electromagnetic wavelengths, and does not require wearable devices. An exception is when

markers are attached to the subject to ease the motion capture process. Markers are not as

intrusive as the devices used in active sensing. Passive sensing is mainly used in surveillance and

some control applications where mounting devices on the subject is not an option. Passive

systems are advantageous as they only rely on capturing images and thus provide an ideal

framework for capturing subjects in their natural environment (38).

The need for markerless motion capture arises from the desire to avoid the risk of an artificial stimulus producing unwanted artifacts that could mask the natural patterns of motion. The feasibility of accurately and precisely measuring 3D human body kinematics for the lower limbs using a markerless motion capture system based on visual hulls is demonstrated in (39).

2.4 Smart Home Project at Washington State University

A system that recognizes activities in the home setting is most useful if it performs in

real-time. The CASAS smart home project at WSU has done significant work in activity

recognition with sensor data to learn patterns of resident behaviors, recognize current activities in

multi-resident settings, identify the individual currently in the environment, monitor their well-being, and automate their interactions with the environment. By utilizing the methodologies of

machine learning and pervasive computing, an unsupervised method of tracking and discovering

activities in a smart environment has been developed (40). New methods have also been developed to perform real-time recognition of ADLs when activities are interrupted and interleaved (41). The data were collected in real time from participants performing activities in an

on-campus smart apartment test bed.

CHAPTER THREE

PROBLEM STATEMENT AND PROPOSED SOLUTION

3.1 Problem Statement

Smart homes can provide a range of benefits to improve the quality of life for people with

disabilities and older people. A smart home can assist a person in daily routine activities to

perform tasks which were thought to be impossible without personal assistance. However, there is still a large section of older and disabled persons who feel excluded from the development of smart environments. The need for smart home customization arises from the different effects of disabilities on people; e.g., a person with dementia would need different technological features in a home than a person in a wheelchair.

During the last decade, there has been fast-paced technological advancement in different

walks of life. The user interfaces of devices such as cell phones and household appliances have

become easier to use; likewise, the technological development of integrating many different

devices with a single interface is already in progress. But many older people find these devices too “sci-fi” and would rather rely on their own sensory perceptions than live in a house filled with confusing gadgets. Therefore, we need to develop an environment where a person feels comfortable and ready to adopt the facilities of assisted living.

In this thesis we have considered the following problems related to the smart home environment:

i. How to customize the smart home environment for older persons and persons with

disabilities of different types?

ii. What are the system architectures that would provide efficient ways for the interaction

between the smart home facility and the inhabitant?

iii. How to integrate CAD and virtual reality capabilities with an existing smart environment setup? What are the benefits and difficulties of this integration?

iv. How can virtual reality technologies integrated with smart home technologies assist a

person?

3.2 Proposed Solution

The approach to the problems stated in the previous section begins with identifying the key subsystems and showing how their integration solves the core problems of the smart environment.

3.2.1 Design, Evaluation, and Training

CAD systems are excellent tools for designing the retrofitting of a living space for

assisted living and smart homes. However, CAD systems are complex to use and navigate, and are not generally suitable for use by the home's occupant to provide any kind of feedback.

However, visualization systems such as virtual reality systems can use these CAD models to

create a virtual environment where the home occupant can get a feel for the smart home before

it is built and use it for the following purposes:

a) Get a feel for the smart home environment and understand what to expect at the end

b) Provide feedback on the design in terms of capabilities, assistance methods, etc.

c) Use a fully immersive virtual smart home for complete training of the occupant prior

to occupation

These uses of an integrated CAD and VR system with a smart home design process will

allow the occupant to engage with and provide feedback during the design and evaluation of

the smart home and also get trained in the use of the smart home (thereby providing further

input to the implementation of the smart home).

3.2.2 Integration with the Use of the Smart Home

Most smart homes that track the movements of the occupant use motion sensors that

report back the general area the occupant is in when the occupant is moving. Virtual reality

tracking systems on the other hand can provide complete six degree-of-freedom information for

various objects in the room (including the occupant, the head of the occupant, the arms and

legs, movable furniture, etc.). This level of tracking can be used to provide a finer level of

assistance in smart homes. For example, tracking the user's arm can allow the smart home to infer what the person is reaching for and, knowing where the object is, provide precise guidance. With head tracking, the smart home knows what the person is looking at and the direction he/she is facing, and can give instructions relative to the user's position and orientation (front, back, left, right, etc.). Integrating this real-time tracking with the

CAD model will allow a virtual reality representation of the room to constantly keep track of

objects in the room (phone, chair, eyeglasses, etc.) all the time for use by the occupant or by a

remote assistance provider. Because sensors are used for this motion capture and full videos are

not recorded or transmitted, privacy concerns can be significantly alleviated.
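To illustrate how six degree-of-freedom tracking can support such directional instructions, the sketch below classifies an object's direction relative to the tracked person. This is a minimal illustration, not code from this work; the function name and the axis/sign conventions (y-up world, yaw in radians measured from +x toward +z, "left" on the +z side of the body frame) are assumptions.

```cpp
#include <cmath>
#include <string>

// Hypothetical helper: classify where an object lies relative to the
// occupant, given the occupant's position and heading (yaw). Assumes a
// y-up world; "left"/"right" follow the sign convention stated above.
std::string relativeDirection(float px, float pz, float yaw,
                              float ox, float oz)
{
    // Vector from person to object, expressed in the person's body frame.
    float dx = ox - px;
    float dz = oz - pz;
    float forward =  std::cos(yaw) * dx + std::sin(yaw) * dz;
    float side    = -std::sin(yaw) * dx + std::cos(yaw) * dz;

    if (std::fabs(forward) >= std::fabs(side))
        return forward >= 0.0f ? "front" : "back";
    return side >= 0.0f ? "left" : "right";
}
```

A routine of this kind would be evaluated against the tracked positions each time an instruction is spoken.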

3.2.3 Scope of Work

We intend to develop a virtual reality (VR) system integrated with CAD and a motion

capture setup. The objects and residing person need to be tracked by a 6 DOF tracking system,

which provides the real time positional and orientation data. CAD data of rooms and furniture

will be integrated with the VR system to create the virtual environment. Objects in the room

and the occupant will be tracked. The user can issue a query (voice command) to find an object

in the room, to which the system will respond (voice response) with the object's location, thereby guiding the person to the object. Additionally, the VR system will identify the objects'

interactions and provide a meaningful response to simplify the process. For example, if the

person is looking for the TV remote control, the system will identify whether the remote

control is on the floor, on the table, or inside the closet. Depending upon the scenario, the

person will be guided to the desired object. The system should compute an optimized collision-free path. While in motion, the person may be on a collision course with an object, e.g., a chair, in which case the system responds with a possible-collision event and provides a new path to avoid the collision. When the person comes within hand's reach of the object, the system should respond by telling the person to stop walking and to grab the object by hand. Furthermore,

the system will again offer guidance for any corrections needed in hand position and

orientation.
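The guidance sequence described above (walk, avoid a collision, stop and grab) can be sketched as a small decision routine. This is an illustrative sketch under assumed thresholds and names, not the thesis implementation.

```cpp
#include <cmath>
#include <string>

// Illustrative guidance sketch: the person walks along a straight segment
// toward the target; if an obstacle lies too close to that segment, a
// possible-collision event is reported. Thresholds are assumed values.
struct Point { float x, z; };

// Distance from point p to the segment a-b.
float distToSegment(Point p, Point a, Point b)
{
    float vx = b.x - a.x, vz = b.z - a.z;
    float wx = p.x - a.x, wz = p.z - a.z;
    float len2 = vx * vx + vz * vz;
    float t = len2 > 0.0f ? (wx * vx + wz * vz) / len2 : 0.0f;
    t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
    float dx = p.x - (a.x + t * vx), dz = p.z - (a.z + t * vz);
    return std::sqrt(dx * dx + dz * dz);
}

std::string guide(Point person, Point target, Point obstacle)
{
    const float kReach = 0.7f;      // assumed hand's-reach threshold (m)
    const float kClearance = 0.5f;  // assumed obstacle clearance (m)
    float dx = target.x - person.x, dz = target.z - person.z;
    float dist = std::sqrt(dx * dx + dz * dz);
    if (dist <= kReach)
        return "stop walking and grab the object";
    if (distToSegment(obstacle, person, target) < kClearance)
        return "possible collision: new path needed";
    return "walk straight toward the object";
}
```

In a full system this check would run every frame, and the "new path" branch would invoke the path planner rather than return a string.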

The virtual reality scene will provide training to a person who is anticipating a move to a

smart home. With the help of head-mounted devices, the person will get a realistic experience

of the room and can utilize the full functionalities of the developed system. The events

triggered by the peripheral system can be visualized in the virtual system and the person can

take the actions suggested by the system. Therefore, the benefit of the VR component in the

overall scheme is to introduce various sorts of scenarios the person is likely to encounter in the

smart house and train him accordingly, so that when the person moves into the smart house

setting, he can replicate the actions corresponding to the actual layout of the new house.

The following will be performed as a part of this thesis research:

1. Identify requirements for the integration of CAD and VR with smart homes to

achieve the overall objectives described above

2. Perform a full software analysis and design of the system

3. Create a CAD plug in to extract necessary information regarding room design and

export to the VR-Smart system

4. Create a VR system including motion tracking to monitor occupant, furniture and

other objects

5. Create algorithms and methods for assistance in finding objects

6. Implement a fully integrated system of all the components described above

7. Perform tests and provide insight to future development and integration with other

existing smart homes.

The next chapters will describe the system design implementation details for the proposed

solution. The system development process involves:

- The functional requirements of each sub-component.

- The design of the system architecture, which details the interaction of the different sub-components.

- The software development process, using object-oriented concepts and methodologies.

- The implementation details of the system hardware, which involves the motion capture system (Flock-of-Birds) and virtual reality devices.

- The system response test study for developing a universally accepted system response.

- The testing of the system in different use cases.

CHAPTER FOUR

SYSTEM REQUIREMENT ANALYSIS

This chapter discusses the overall system requirements of the virtual reality-motion capture system for smart home environments. The key components proposed in the previous chapter are: 1) the motion capture system, 2) the audio query/response system, and 3) the virtual reality system.

The requirements of each of these components are discussed in detail below.

4.1 Functional Requirements of Motion Capture System Component

The following requirements have been identified for the motion capture system based on the

overall system requirement analysis:

1. Real-time tracking of both human movements and object movements within the room.

2. Integration of movement data gathered from the motion capture system into the virtual reality

system.

3. Enabling the users to operate the system equipment.

4.2 Functional Requirements of Audio Query/Response System Component

The requirements for this component are:

1. The user can issue a query using a microphone and wait for an audio response.

2. The user can ask different types of queries, e.g., find an object, or which objects are within hand's reach of the user.

3. The user gets the audio response relative to the user‟s current position and orientation.

4. The audio response mentions the number of walking steps and the orientation of line of sight

towards the target object.

5. The audio response is delivered at a speed and volume that enable the user to follow the directions efficiently.
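As an illustration of requirements 3 and 4, a response can be phrased from the object's distance and bearing relative to the user. The step length (0.5 m), the 10-degree dead band, and the wording below are assumptions for the sketch only, not values from this work.

```cpp
#include <cmath>
#include <sstream>
#include <string>

// Hedged sketch: phrase an audio response as a line-of-sight turn plus a
// number of walking steps. Positive bearing is assumed to mean the target
// is to the user's right.
std::string phraseResponse(float distanceMeters, float bearingDegrees)
{
    const float kStepLength = 0.5f;  // assumed average step length (m)
    int steps = static_cast<int>(std::ceil(distanceMeters / kStepLength));
    std::ostringstream out;
    if (bearingDegrees > 10.0f)
        out << "turn right " << static_cast<int>(bearingDegrees) << " degrees, ";
    else if (bearingDegrees < -10.0f)
        out << "turn left " << static_cast<int>(-bearingDegrees) << " degrees, ";
    out << "walk " << steps << " steps";
    return out.str();
}
```

The resulting string would be handed to the speech synthesis engine described in section 5.3.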

4.3 Functional Requirements of Virtual Reality System

1. Provide a virtual reality room scene which is a replica of the actual room with the objects and

person.

2. Provide the coordination between the motion capture system data and the virtual reality

scene.

3. Maintain an appropriate viewer frame rate for the user who is visualizing the virtual

environment.

CHAPTER FIVE

DESIGN OF SYSTEM ARCHITECTURE

This chapter describes the details of the key components of the overall system. Each sub-component of the system is loosely coupled to the other sub-components through message passing.

The main components of the overall system, with their interactions, are depicted in Figure 2:

Figure 2: VR-Smart System Components

Figure 3 shows the functional architecture (IDEF0) (42) diagram of the VR-Smart system. The inputs to the overall system are the CAD model and the audio query issued by the user. The expected output from the system is the audio response which will guide the user to an object. The graphics

engine provides the angle and walking steps information of the object, relative to the user, to the

audio system.

Figure 3: Functional Architecture of VR-Smart System

5.1 Virtual Reality (VR) system

This system contains the actual room scene recreated in a virtual environment. There are

three components of this system:

5.1.1 CAD System

We used Pro Engineer as the CAD system to recreate an actual 3-D room scene. The

Virtual Reality and Computer Integrated Manufacturing Laboratory (VRCIM) room and the

objects inside it were measured in inches. The objects in the room are: a table, a cabinet, chairs, a cell phone, and keys. Each object in the room assembly is represented as a separate

subassembly. Figure 4 shows the CAD assembly hierarchy of the room model. Each subassembly has the part components which constitute an object; e.g., the cabinet subassembly has shelves, doors, and outer part components.

Figure 4: Assembly hierarchy of CAD model
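The hierarchy in Figure 4 can be modeled as a simple tree, sketched below. The structure and names are illustrative assumptions, not the Pro Engineer data model.

```cpp
#include <string>
#include <vector>

// Illustrative sketch of an assembly hierarchy: the room assembly holds
// subassemblies (one per object), each holding its part components.
struct AssemblyNode {
    std::string name;
    std::vector<AssemblyNode> children;
};

// Count the part components (leaf nodes) under a node.
int countParts(const AssemblyNode& n)
{
    if (n.children.empty()) return 1;
    int total = 0;
    for (const AssemblyNode& c : n.children)
        total += countParts(c);
    return total;
}
```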

5.1.2 CAD data extraction

Pro/VADE (43) is a Pro/Toolkit-based application developed at VRCIM, WSU. Pro/VADE is used for extracting the CAD model's tessellated information. It also generates the

assembly hierarchy, assembly constraints and assembly transformation information. In our

work, we used assembly transformation information to recreate the CAD room scene in a

virtual reality scene. The Pro Engineer room assembly is saved in a neutral OBJ format and exported to the OGRE graphics engine as described in the next section.

5.1.3 OGRE Graphics Engine

OGRE (Object-Oriented Graphics Rendering Engine) (44) is a scene-oriented, flexible

3D engine written in C++, designed to make it easier and more intuitive for developers to

produce applications utilizing hardware-accelerated 3D graphics.

The capabilities of the engine we selected are compared to two other leading open source

engines in Table 1.

Criteria        OGRE 3D                 CrystalSpace      OpenGL

Approach        Fully Object-Oriented   Object-Oriented   Native

Documentation   Extensive               Poor              Extensive

Feature Set     Good                    Moderate          Basic

Performance     Fast                    Moderate          Very Fast

Table 1: Comparison of Image Renderers

When choosing OGRE as our rendering platform, object-oriented programming and the readability, changeability, and maintainability of the source code – which are important in creating a solid framework – were considered important features. OGRE has also already

proven to be industrially, educationally, and commercially viable in hundreds of successful

projects.

There are several projects that use OGRE for 3D visualization, but there is no existing

native support for stereo rendering in the framework. In order to enable the stereo rendering

mode, support for the Direct3D9 or OpenGL rendering subsystems has been implemented in the application.

The data flow for virtual reality system is presented in Figure 5:

Figure 5: Virtual Room Scene Data Flow

5.2 Motion Capture system

Motion capture, motion tracking, or “MoCap” are all terms used to describe the process of

recording movement and translating that movement onto a digital model. It is used in military,

entertainment, sports, and medical applications. In this thesis we have used a Flock-of-Birds

system which utilizes the VRDevServer application (45) to send the data from the birds to the

system. The implementation details of this system are provided in section 7.3. Figure 6 shows the architecture of the Birds system. The machine running the application is connected to a master bird through a serial port connection (RS-232). The other birds act as secondary devices to the master bird. Each bird device is connected to a sensor which provides the X, Y, Z positional coordinates and orientation angles. The benefits of using the bird system are:

- Unrestricted tracking without line-of-sight restrictions.

- Consistently fast measurements even with multiple sensors.

Figure 6: Flock-of-Birds architecture
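Because the VRDevServer wire format is not reproduced here, the following sketch assumes, purely for illustration, a text format of one whitespace-separated record per sensor ("x y z qw qx qy qz") and parses it into a position plus a quaternion orientation.

```cpp
#include <sstream>
#include <string>

// Sketch only: the record layout below is an assumption, not the actual
// VRDevServer protocol. Each bird sensor yields a position and an
// orientation quaternion, matching the data the VR scene consumes.
struct BirdSample {
    float x, y, z;           // position
    float qw, qx, qy, qz;    // orientation quaternion
};

bool parseBirdLine(const std::string& line, BirdSample& out)
{
    std::istringstream in(line);
    // Extraction fails (returns false) if any of the seven fields is
    // missing or malformed.
    return static_cast<bool>(in >> out.x >> out.y >> out.z
                                >> out.qw >> out.qx >> out.qy >> out.qz);
}
```

Each successfully parsed sample would then update the corresponding tracked object in the virtual scene.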

5.3 Audio Query/Response system

We identified an audio query/response system as an effective means to interact with our

system. A user can ask queries about the location of objects in the room and the system will

provide a suitable response to guide the user to the objects in the actual room. The audio system

utilizes the speech recognition capabilities provided by Microsoft Speech Application

Programming Interface (SAPI) (46). SAPI allows the use of speech recognition and speech

synthesis within windows applications. The synthesis engine processes the text strings and files

into spoken audio using synthetic voices. Speech recognition engine convert human spoken

audio into readable text strings and files. Figure 7 shows the SAPI architecture.

Figure 7: SAPI architecture

SAPI is a component object model (COM)-based API for native Windows applications. It

includes dozens of objects and interfaces that can be used by applications to listen for speech,

recognize content, process spoken commands, and speak text.

CHAPTER SIX

OBJECT ORIENTED DESIGN AND ANALYSIS

6.1 Introduction

This chapter describes the object-oriented design and analysis of the overall system. UML diagrams are presented for the class hierarchy, as well as sequence diagrams depicting object interactions and a state chart diagram showing the objects' states at different stages of transition.

The OGRE graphics engine is connected to the motion capture system and updates the virtual

scene in real time.

6.2 Class design of Room Scene

The class architecture of the room scene is shown in Figure 8 below:

Figure 8: Class architecture of Room objects

6.2.1 Objectives and Responsibilities

The key objectives of the room scene classes are:

- Instantiation of the objects modeled in the CAD software: The objects are instantiated by reading the assembly information file generated by Pro/VADE. The object type is set for each object. The object types are shown in Figure 9 below:

Figure 9: Object Types

- Maintaining object states and interactions: The object states are maintained to determine whether an object is being tracked by the motion capture system. These states are also changed depending upon signals received from external systems. The objects can interact with each other through the EventManager class, which keeps all the instantiated objects.

- Algorithms related to finding paths and collisions: Algorithms specific to finding the path in the room and detecting collisions between the objects have been developed. The objects keep an updated state and position, which

leads to finding the path of an object relative to the person's position and orientation.

6.2.2 Implementation of Classes

The class architecture of the room scene provides four-way interfaces for the motion

capture system, OGRE engine, CAD system, and external systems. The objects interact with

each other and react to certain external events. Following is a description of a few important classes in the class architecture of the room scene:

Objects class: The Objects class is the top-level class from which all other room objects, and the Person class, are derived. This class has the common attributes and member functions shared by the derived classes, e.g., set/get functions for object type, position, orientation, and state. The Objects class has a reference to the

EventManager class which is used for instantiating the room objects. This class has

the location structure which maintains the current position and orientation of room

objects. The Position structure has three values for the x, y, and z directions. The Orientation structure keeps the four components of a quaternion. The Location structure is

defined as follows in Table 2:

typedef struct Orientation
{
    float w;
    float x;
    float y;
    float z;
} Ori;

typedef struct Position
{
    float x;
    float y;
    float z;
} Pos;

typedef struct Location
{
    Ori ori;
    Pos pos;
} Loc;

Table 2: Data Structure for Position and Orientation
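As a companion sketch (not thesis code), the person's heading can be recovered from the quaternion held in Ori; the structure is restated below so the fragment is self-contained, and a y-up axis convention is assumed.

```cpp
#include <cmath>

// Restated from Table 2 for self-containment.
typedef struct Orientation { float w, x, y, z; } Ori;

// Extract the heading (yaw) in radians from a unit quaternion, assuming
// y is the vertical axis. A heading like this is what the response logic
// needs to phrase directions relative to the person.
float yawFromQuaternion(const Ori& q)
{
    return std::atan2(2.0f * (q.w * q.y + q.x * q.z),
                      1.0f - 2.0f * (q.y * q.y + q.x * q.x));
}
```

For a pure rotation of angle θ about the vertical axis, q = (cos θ/2, 0, sin θ/2, 0), and the expression above recovers θ.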

ModelManager class: The ModelManager class is responsible for extracting the

names of the objects in the room from the CAD data file. These names are passed to

the EventManager class to properly instantiate the room objects.

EventManager class: The EventManager class manages the room objects by

keeping a reference to the array of type Objects. The EventManager class

implements the Mediator design pattern. The mediator object encapsulates all

interconnections, acts as the hub of communication, is responsible for controlling

and coordinating the interactions of its clients, and promotes loose coupling by

keeping objects from referring to each other explicitly. The Mediator design pattern

structure is described in Figure 10.

All the object interactions are passed through the EventManager object. The

ModelManager class passes the names of the room objects (to be tracked) to the

EventManager object which instantiates all the objects in the name list. The

instantiation of the derived objects happens in the constructor of the EventManager

class. A static variable, num_objects, keeps track of the number of instantiated

objects. The graphics engine class VRApplication interacts with the EventManager

object to pass and receive the data about the objects and person within the room, for

implementing path finding in the room and collision checking between objects.

Figure 10: Mediator design pattern structure
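A minimal sketch of this pattern, loosely following the class names above (the method bodies are illustrative assumptions, not the thesis implementation):

```cpp
#include <string>
#include <vector>

// Mediator sketch: colleagues (room objects) talk only to the mediator
// (EventManager), never to each other directly.
class EventManager;  // mediator, forward-declared

class Objects {      // colleague base class
public:
    Objects(EventManager* m, std::string name)
        : mediator_(m), name_(std::move(name)) {}
    virtual ~Objects() = default;
    const std::string& name() const { return name_; }
protected:
    EventManager* mediator_;  // the only link a colleague holds
private:
    std::string name_;
};

class EventManager {
public:
    void add(Objects* o) { objects_.push_back(o); }
    // Route a "find" query: the mediator does the lookup itself, so
    // colleagues never hold references to one another.
    Objects* find(const std::string& name) {
        for (Objects* o : objects_)
            if (o->name() == name) return o;
        return nullptr;
    }
private:
    std::vector<Objects*> objects_;  // all instantiated room objects
};
```

This keeps the coupling loose: adding a new room object type touches only the EventManager registration, not every other class.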

Person class: The Person class manages the data about the person‟s location and

orientation. The Person object updates its data by receiving real-time input from the motion capture system. It also provides the interface for the audio system to send

and receive responses to/from the EventManager.

Furniture class: The Furniture class maintains two types of room objects: Static and

Movable. The static objects include the table, cabinet, shelves, etc., and the movable furniture objects include chairs, mini desks, etc. The EventManager initializes these objects according to their type.

Appliances class: The Appliances class has two categories: Handheld and Electrical. The handheld appliances are: cell phone, TV remote, etc., and the electrical appliances are: microwave oven, washing machine, coffee machine, etc. An electrical appliance can be activated according to the signal received from an external application. Another difference between these two categories is in shape and movement: the handheld appliances are movable, so a person may issue a query for their position.

6.2.3 Sequence diagrams for room objects

In this section, sequence diagrams are presented for different event scenarios. The scenarios are categorized according to the objects' size and movement.

Find the chair

The chair is a MovableFurniture object, and it is bigger than the other movable objects, e.g., the phone, keys, etc. The interaction between the objects when the query “Find the chair” is issued to the system is shown below in Figure 11. The system generates a response giving the chair's distance and orientation relative to the person.

Figure 11: Sequence Diagram for "Find the Chair"

Find the key

The key is a small movable object, similar to other regularly used objects, e.g., the cell phone, television remote, wrist watch, etc. Thus, the system-generated response not only includes the walking steps and orientation information but also provides the position relative to the hand. Figure 12 shows the message interaction between different

objects.

Figure 12: Sequence Diagram for "Find the Key"

External signal interface

External systems can provide signals pertaining to the electrical appliances. The person in the room can act accordingly when notified. Figure 13 provides the details of the signal interface and its interactions with the other room objects.

Figure 13: Sequence Diagram for External System Interface

6.3 Class design of OGRE application

The class architecture of the OGRE application is shown in Figure 14. We have used the OGRE 1.7.1 SDK release for this development. The two classes FrameListener and WindowEventListener are defined in the OGRE package. The OgreApplication and VRApplication classes perform graphics engine initialization and scene creation. The next two subsections describe the key responsibilities and implementation details of the OGRE application.


Figure 14: Class Architecture of OGRE application

6.3.1 Objectives and Responsibilities

The key objectives of the OGRE application are to create a virtual scene based on the CAD model and to interact with the audio query/response system and the room object module.

Virtual room scene creation

The CAD room model provides an exported OGRE mesh for each object. These meshes are attached to the OGRE scene manager as OGRE entities. Each OGRE entity is then attached to an OGRE scene node. A scene node is a type of OGRE Node used to organize objects in a scene. It has the same hierarchical transformation properties as the generic Node class, but adds the ability to attach world objects to the node, and it stores hierarchical bounding volumes of the nodes in the tree. Child nodes are contained within the bounds of the parent, and so on down


the tree, allowing for fast culling. The rotation, translation, and scaling transformations are applied to each scene node using the matrices extracted from Pro/VADE. Thus, each scene node is placed exactly as modeled in Pro/Engineer.

Event Handling

The virtual room scene handles events initiated by the keyboard and mouse. Mouse movement changes the camera position and orientation. Key-press events are handled for camera movement in a given direction and for audio input/output.

Stereoscopic viewing

The room scene can be visualized in 3D using two polarized projectors or a head-mounted display. The OGRE application creates two viewports, one for each eye, and the scene is rendered for each eye with an offset.
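The offset can be illustrated with a small standalone sketch (the names here are illustrative, not from the thesis code; in our application the offset is handled by the StereoManager instance): each eye's camera position is shifted by half the eye spacing along the camera's right vector.

```cpp
#include <array>

// Minimal 3-D vector for the sketch.
struct Vec3 { double x, y, z; };

Vec3 scale(const Vec3& v, double s) { return {v.x * s, v.y * s, v.z * s}; }
Vec3 add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

// Offset the mono camera position along its right vector to obtain the
// left- and right-eye positions used by the two viewports.
std::array<Vec3, 2> eyePositions(const Vec3& camPos, const Vec3& right,
                                 double eyeSpacing) {
    return { add(camPos, scale(right, -eyeSpacing / 2.0)),   // left eye
             add(camPos, scale(right,  eyeSpacing / 2.0)) }; // right eye
}
```

With an eye spacing of 0.065 m, for example, the two cameras end up 0.0325 m to either side of the mono camera position.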

6.3.2 Implementation of Classes

As described in the previous section, the two classes which manage the virtual scene are OgreApplication and VRApplication. The implementation details of these classes follow:

OgreApplication class: This class is inherited from the Ogre package classes

FrameListener and WindowEventListener.

FrameListener is an interface class defining a listener which receives notification of frame events. OgreApplication overrides the method frameRenderingQueued(), which is called after all render targets have had their rendering commands issued, but before render windows have been asked to flip their buffers. The frames are queued for the GPU to process because once the request to 'flip buffers' happens, the requesting thread will block until the GPU is ready,


which can waste CPU cycles. Therefore, it is often a good idea to use this callback to perform per-frame processing. Because the frame's rendering commands have already been issued, any changes made will only take effect from the next frame, but in most cases that is not noticeable. The OgreApplication instance is passed to Root::addFrameListener() to register the event listener. Frame events only occur when Ogre is in continuous rendering mode, i.e., after Root::startRendering() is called.

WindowEventListener is a callback class used to send window events to the client application. The OgreApplication class overrides the methods windowResized() and windowClosed(), which receive an Ogre::RenderWindow reference. The OgreApplication::initialise() method initializes the various components of the graphics engine. Here are a few important methods executed by initialise():

initOgreCore(): Ogre has native support for the Direct3D9 and OpenGL rendering subsystems. This method presents a configuration dialog box for the user to select the rendering subsystem of their choice and loads the library RenderSystem_GL_d.dll or RenderSystem_Direct3D9_d.dll. A user can also load the configuration settings of the subsystem from 'ogre.cfg'. We have provided support for both renderers in our application.

createSceneManager(): This method creates a scene manager instance of a given type. The Ogre::SceneManager class keeps track of the objects placed in the scene.

addResourceLocations(): This method loads 'resources.cfg', which contains a list of directories that Ogre scans for resources. The resources include scripts, meshes, textures, Cg programs, etc.


createCamera(): This method attaches a camera to the scene manager. Three separate scene nodes are created for the camera: cameraYawNode, cameraPitchNode, and cameraRollNode, supporting yaw, pitch, and roll rotations of the camera.

createViewPorts(): This method creates two rendering windows and attaches two viewports to each window for stereoscopic viewing. Both viewports (for the left and right eye) use the same camera created in the previous step.

createFrameListener(): This method registers the frame event listener by providing an instance of OgreApplication.

VRApplication class: This class inherits from OgreApplication. It handles events related to the keyboard, mouse, and audio query/response. The key methods implemented in this class are the following:

createScene(): This method initializes a scene node for each OGRE mesh representing an object in the CAD model. It also sets the lighting, materials, and textures. Each object's position and orientation is passed to the EventManager object by invoking EventManager::setCurrentLocation(). A StereoManager class instance, mStereoManager, initializes the Ogre compositors and listeners for each viewport. The initial stereo settings, e.g., eye spacing, screen distance, and stereo mode, are saved in the configuration file 'stereo.cfg'. Table 3 provides the code snippet for stereo viewing initialization:

mStereoManager.init(gLeftViewport, gRightViewport,
                    StereoManager::SM_DUALOUTPUT);
mStereoManager.setEyesSpacing(EYES_SPACING);
mStereoManager.setFocalLength(SCREEN_DIST);
mStereoManager.saveConfig("stereo.cfg");

Table 3: Code snippet for stereo initialization


moveCamera(): This method moves the camera by applying the translation and rotation events captured by processUnbufferedMouseInput(). The code snippet is provided in Table 4.

void VRApplication::moveCamera()
{
    Real pitchAngle;
    Real pitchAngleSign;

    cameraYawNode->yaw(mRotX);
    cameraPitchNode->pitch(mRotY);
    cameraNode->translate(cameraYawNode->getOrientation() *
                          cameraPitchNode->getOrientation() *
                          mTranslateVector,
                          SceneNode::TS_LOCAL);

    // Angle of rotation about the X-axis.
    pitchAngle = (2 * Degree(Math::ACos(
        cameraPitchNode->getOrientation().w)).valueDegrees());
    pitchAngleSign = cameraPitchNode->getOrientation().x;

    // Limit the pitch to +/- 90 degrees.
    if (pitchAngle > 90.0f)
    {
        if (pitchAngleSign > 0)
            // Set orientation to 90 degrees on X-axis.
            cameraPitchNode->setOrientation(Quaternion(
                Math::Sqrt(0.5f), Math::Sqrt(0.5f), 0, 0));
        else if (pitchAngleSign < 0)
            // Set orientation to -90 degrees on X-axis.
            cameraPitchNode->setOrientation(Quaternion(
                Math::Sqrt(0.5f), -Math::Sqrt(0.5f), 0, 0));
    }
}

Table 4: Code Snippet for Camera movement
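The clamping logic in Table 4 relies on the fact that, for a rotation purely about the X-axis, the quaternion component w is the cosine of half the pitch angle. The same check can be sketched independently of the Ogre types (illustrative standalone C++):

```cpp
#include <cmath>

const double PI = 3.14159265358979323846;

// Pitch angle in degrees recovered from a unit quaternion (w, x, y, z)
// representing a rotation purely about the X-axis: angle = 2 * acos(w).
double pitchAngleDeg(double w) {
    return 2.0 * std::acos(w) * 180.0 / PI;
}

// Clamp the pitch to +/-90 degrees, mirroring the moveCamera() logic:
// rewrites the (w, x) pair of the quaternion in place when needed.
void clampPitch(double& w, double& x) {
    if (pitchAngleDeg(w) > 90.0) {
        double s = std::sqrt(0.5);
        w = s;
        x = (x > 0) ? s : -s;   // +/-90 degrees about the X-axis
    }
}
```

A 120-degree pitch (w = cos 60°, x = sin 60°) exceeds the limit and is snapped back to exactly 90 degrees.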

collisionDetection(): OGRE maintains an intersection list of the objects registered with the scene manager. collisionDetection() is called for the object whose intersecting objects are to be retrieved, e.g., to


determine whether the cell phone is on the chair, the table, or the floor. When an object is in contact with more than one object, the immediate neighbor with the lowest height above the floor is selected.
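This selection rule can be sketched independently of the Ogre scene manager (the types and names below are illustrative, not from the thesis code):

```cpp
#include <string>
#include <vector>

// A candidate object that the query item is currently intersecting.
struct RoomObject {
    std::string name;
    double height;   // height of the supporting surface above the floor
};

// When the query object (e.g. a cell phone) touches several objects,
// pick the immediate supporting one: the contact with the lowest
// height above the floor. With no contacts, the object rests on the floor.
std::string supportingObject(const std::vector<RoomObject>& contacts) {
    if (contacts.empty()) return "floor";
    const RoomObject* best = &contacts[0];
    for (const auto& c : contacts)
        if (c.height < best->height) best = &c;
    return best->name;
}
```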

6.4 Implementation of Audio query/response system

As described in section 5.3, we have selected the Microsoft Speech Recognition Engine for the audio query/response system. The sapi.lib library provides support for the voice recognition and TTS (Text-To-Speech) engines. Here are the implementation details of these two modules:

Voice Recognition module

The COM call CoInitialize() initializes the COM library.

The recognizer object, CComPtr&lt;ISpRecognizer&gt;, is created with the in-process option. An in-process instance allows only one application to control the resources, including the microphone and the speech recognition engine.

The recognition context, CComPtr&lt;ISpRecoContext&gt;, is created for the engine. A context is any single area of the application that needs to process speech. In our application, only one recognition context is used. ISpRecoContext is an important interface and the primary means for recognition. Through this interface, the application can load and unload grammars as well as receive and respond to events.

The last major part of the startup sequence is loading the grammar. A grammar specifies what the speech recognizer will recognize. Essentially, there are two types of grammars: one for dictation and the other for command and control. The VRApplication uses the dictation grammar because the text generated from speech is sufficient to identify the queried object.


When a person issues an audio query to the system, the recognition context captures the speech and stores the result in a CComPtr&lt;ISpRecoResult&gt;. The text is acquired by calling the GetText() method on the recognition result object.

Text to Speech module:

Once COM is running, the next step is to create the voice. A voice is simply a COM object. Additionally, SAPI uses intelligent defaults: during initialization of the object, SAPI assigns most values automatically so that the object may be used immediately afterward. The defaults are retrieved from the Speech properties in the Control Panel and include such information as the voice and the language (English, Japanese, etc.). The CoCreateInstance() method initializes the ISpVoice instance. The Speak() method of the ISpVoice instance converts a text string into audio.

The core code for the two modules mentioned above is provided in Appendix A.


CHAPTER SEVEN

IMPLEMENTATION

7.1 Introduction

This chapter provides the implementation details of the system, which include setting up the Flock-of-Birds system in an actual room scene and integrating it with the virtual room scene.

7.2 Implementation of the CAD Room Scene

In this thesis, we have implemented two CAD room scenes corresponding to two separate rooms in the VRCIM lab at WSU. The first room is modeled at actual size with the objects at their proper locations. Figure 15 shows the room scene created using Pro/Engineer.

Figure 15: Pro/Engineer Room Scene 1

This room scene contains objects of varying size and mobility:


Heavy sized objects: Cabinet, Table.

Medium sized objects: Chair.

Small sized objects: Cell Phone, Medicine box, Keys.

The other room scene has been created as a prototype room on a wooden platform. It consists of fewer objects, e.g., cell phone, cabinet, table, chair, and medicine box. Figure 16 shows the prototype room scene created in Pro/Engineer.

Figure 16: Prototype room scene in Pro/Engineer

7.3 Implementation of the Flock-of-Birds System

We have utilized the Flock-of-Birds system for motion capture of both the human's and the objects' movements within the prototype room. The sensors are attached to the surfaces of the objects and to the head-mounted device worn by the person. These sensors relay real-time position and orientation data to the birds system, which sends the data to the machine running


the application. Here are the key components of the Flock-of-Birds system and their implementation details:

7.3.1 Birds System

The Flock can be configured to track from one to four sensors simultaneously over one or more RS-232 interfaces to a host computer. The six-degree-of-freedom trackers work in the magnetic field generated by a transmitter box and capture the position and orientation of the slave birds in real time, relative to the origin of the coordinate system centered at the box. Figure 17 shows the bird system in place. Each box is connected to a sensor (bird). One bird box acts as the master and is connected through a serial port to the machine running the application.

Figure 17: Flock of Birds System

In this study, we have chosen a platform space of 147" × 95" which contains the room objects, e.g., chair, cabinet, cell phone, table, and medicine box. Figure 18 shows the room prototype created in our laboratory.


Figure 18: Room prototype

Figure 19: A box connected with sensor


Some of the objects are connected to sensors. When an object is moved in the actual room, the representative object in the virtual room scene moves accordingly. Figure 19 shows a box connected to the sensor that corresponds to the medicine box in the virtual scene.

The sensor data is affected by the presence of metallic objects, which interfere with the magnetic field emitted by the bird system. We have therefore introduced calibration factors corresponding to the translation of objects at different points in the room. These calibration factors modify the sensor data so that motion in the actual room is replicated accurately in the virtual scene.
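The thesis does not detail the form of these calibration factors; one simple possibility consistent with the description is a per-axis linear correction fitted from reference measurements taken at known points on the platform (entirely illustrative):

```cpp
// Per-axis linear calibration: corrected = scale * raw + offset.
// The scale and offset would be fitted from reference measurements
// at known positions on the platform (illustrative assumption).
struct AxisCal { double scale, offset; };

double calibrate(double raw, const AxisCal& cal) {
    return cal.scale * raw + cal.offset;
}
```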

7.3.2 Head Mounted Device

The virtual scene is visualized through a head-mounted device (HMD). Figure 20 shows the device used in the setup.

Figure 20: Head Mounted Device


This device provides VGA resolution (640 × 480 pixels). The HMD dons quickly and comfortably using rear and top ratchets and a spring-loaded forehead rest. The interpupillary adjustment doubles as an eye-relief adjustment to accommodate glasses. Inputs and outputs for audio, video, and power are handled through an external control box. Red LEDs indicate the Power On and Stereo modes. The control box input for each eye is fed from the machine running the application, which generates the two viewport images.

The HMD is equipped with a mounting bracket on top which accommodates the bird that senses the real-time position and head orientation of the person. The yaw, pitch, and roll angles produced by head movement are captured by the sensor and fed to the application running the virtual scene. This motion capture gives the person the feeling of being immersed in the virtual scene.

7.4 Implementation of the Virtual Room Scene

The virtual room scene is generated through the OGRE graphics engine as described in chapter 5. The CAD models are exported as OGRE meshes, and the Pro/VADE toolkit application, which executes on the Pro/Engineer room assembly, generates the room assembly transformation matrix file used to place each mesh in its proper place inside the virtual scene. Table 5 shows the data file generated by Pro/VADE.

FLOOR.PRT
1.000000 0.000000 0.000000 0.000000
0.000000 1.000000 0.000000 0.000000
0.000000 0.000000 1.000000 0.000000
73.500000 0.000000 -49.000000 1.000000

CHAIR.ASM
1.000000 0.000000 0.000000 0.000000
0.000000 -1.000000 0.000000 0.000000
0.000000 -0.000000 -1.000000 0.000000
81.064257 20.041076 -58.433132 1.000000

CABINET.ASM
0.000000 1.000000 0.000000 0.000000
0.999983 0.000000 -0.005751 0.000000
-0.005751 0.000000 -0.999983 0.000000
58.801702 19.635583 -45.526854 1.000000

CELLPHONE.ASM
1.000000 -0.000000 -0.000047 0.000000
-0.000000 -1.000000 -0.000000 0.000000
-0.000047 0.000000 -1.000000 0.000000
57.855371 33.100000 -38.124405 1.000000

TABLE.ASM
-0.000000 0.000000 1.000000 0.000000
-0.000000 -1.000000 0.000000 0.000000
1.000000 -0.000000 0.000000 0.000000
110.861112 31.000000 -36.971448 1.000000

MEDICINEBOX.ASM
1.000000 -0.000000 0.000000 0.000000
-0.000000 -1.000000 0.000000 0.000000
0.000000 -0.000000 -1.000000 0.000000
110.940398 35.500000 -37.118804 1.000000

WALL1.PRT
1.000000 0.000000 0.000000 0.000000
0.000000 -1.000000 0.000000 0.000000
0.000000 -0.000000 -1.000000 0.000000
73.628466 99.959267 2.000000 1.000000

WALL2.PRT
1.000000 0.000000 0.000000 0.000000
0.000000 1.000000 0.000000 0.000000
0.000000 0.000000 1.000000 0.000000
73.821737 100.549068 -98.000000 1.000000

WALL3.PRT
1.000000 0.000000 -0.000000 0.000000
0.000000 0.000000 1.000000 0.000000
0.000000 -1.000000 0.000000 0.000000
145.087743 102.000000 -48.830847 1.000000

WALL4.PRT
1.000000 0.000000 -0.000000 0.000000
0.000000 0.000000 1.000000 0.000000
0.000000 -1.000000 0.000000 0.000000
0.256228 102.000000 -48.130604 1.000000

Table 5: Room Assembly Transformation Data

The OGRE graphics engine creates a SceneNode for each of the assembly components and applies the transformation to each node by reading the matrix data file. Figure 21 shows the virtual room scene, which is the same as the actual room prototype. Proper texturing is applied to the objects to provide a better visual appearance.


Figure 21: Virtual Room Scene


CHAPTER EIGHT

SYSTEM RESPONSE TEST STUDY

8.1 Introduction

In this chapter, we present the approach developed to enhance the system response in different scenarios for smart home inhabitants. The next subsections present the methodologies adopted to develop the test cases, collect responses from test subjects, and derive a generalized system response applicable to inhabitants of different ages and disabilities.

8.2 Categorization of Room Objects

The test case scenarios were developed by considering the difficulties faced by older persons and persons with disabilities in reaching for different types of objects in a home setting. Three representative objects of different sizes and natures of movement were selected. The size of an object is considered in terms of the person's capability to grasp it completely with the hands. Table 6 below shows the categorization of room objects in terms of their size:

Small size objects: Cell Phone, Television remote, Keys, Medicine Box
Medium size objects: Chair, Table, Computer, Television
Large size objects: Cabinet, Book Shelf, Bed, Couch

Table 6: Categorization of objects with respect to size


The categorization in Table 6 helped us select a representative object from each group. The other important factors to consider for the test study are whether an object is movable and its distance relative to the person. An object can be moved during use, so it must be tracked at all times. If an object does not change its position frequently because of its size or weight, then its position and orientation can be recorded once and used repeatedly with respect to the person's position. The distance of the object relative to the person is classified as near or far, depending on whether the object is within hand's reach or walking is needed to reach it. Table 7 below shows the properties of the representative objects considered in this study:

Object       Size     Nature of movement   Distance relative to the person
Cell Phone   Small    Movable              Near/Far
Chair        Medium   Movable              Near/Far
Cabinet      Large    Fixed                Near/Far

Table 7: Room Object Properties

8.3 Development of Test Cases

We have modeled the room scene of the VRCIM lab at WSU for the virtual system. The test study was conducted in the same room to analyze the test subjects' responses. We divided the room into sectors to collect a precise response about the orientation of the object relative to the person. Each sector spans a 30-degree view angle with respect to the person. Figure 22 depicts the partitioning of the sectors, numbered from 1 to 12. The object is placed in each of these sectors, near and far from the person, while the person is


positioned in one direction only. The red arrow in Figure 22 shows the person's viewing direction.

Figure 22: Room Partitioning for Test Study
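Given this partitioning, the sector index follows directly from the bearing of the object relative to the person's viewing direction. The sketch below assumes sector 1 begins just to the right of the view axis and the numbers increase clockwise, which is consistent with the responses reported in section 8.5; the authoritative numbering is the one in Figure 22.

```cpp
#include <cmath>

// Map the bearing of an object, measured clockwise in degrees from the
// person's viewing direction, to a sector number 1-12 (30 degrees each).
// Convention (assumed): sector 1 starts just right of the view axis and
// the numbers increase clockwise.
int sectorOf(double bearingDeg) {
    double a = std::fmod(bearingDeg, 360.0);
    if (a < 0) a += 360.0;
    return static_cast<int>(a / 30.0) + 1;
}
```

For example, a bearing of 75 degrees (to the right, slightly in front) falls in sector 3, and 195 degrees (behind, slightly to the left) falls in sector 7.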

Five persons of different age groups and backgrounds were selected for the study. The following scenarios were developed for each person:

Cell Phone is on the floor near to the person.

Cell Phone is on the floor at some distance to the person.


Cell Phone is on the table near to the person.

Cell Phone is on the table at some distance to the person.

Chair is near to the person.

Chair is at some distance to the person.

Cabinet is near to the person.

Cabinet is at some distance to the person.

Cell Phone is inside the Cabinet in different shelf positions.

The object is placed in different sectors for each scenario, and the responses of each person are collected. The next section describes the test queries and the responses of each person.

8.4 Responses of the test subjects

The following three queries were asked of each person for the different scenarios described in the previous section:

Where is the object?

Are you able to reach the object, given your own response for a scenario?

Are you able to reach the object, given the response collected from another person for a scenario?

Thus, we collect each person's response from his/her own perspective and, through cross-verification, test the responses of the other persons as well. The commonalities between the persons' responses for the test scenarios are shown in Table 8.

Differences in the responses appear when the object is in sectors 2-5 and sectors 8-11. The general agreement between the persons for these sectors is:

o When the object is in sectors 2, 3, 10, and 11, it is in front of the person.

o When the object is in sectors 4, 5, 8, and 9, it is behind the person.


Case: Cell Phone/Chair/Cabinet is near to the person.
Response: Cell Phone/Chair/Cabinet is within hand's reach.

Case: Cell Phone is on the floor.
Response: Cell Phone is at my feet.

Case: Cell Phone/Chair/Cabinet is in sector 1 or 12.
Response: Cell Phone/Chair/Cabinet is in front of me, slightly to my left/right.

Case: Cell Phone/Chair/Cabinet is in sector 6 or 7.
Response: Cell Phone/Chair/Cabinet is behind me, slightly to my left/right.

Case: Cell Phone is inside the Cabinet at the lower shelf.
Response: Cell Phone is near my feet.

Case: Cell Phone is inside the Cabinet at the middle shelves.
Response: Cell Phone is at knee height, waist height, chest height, or head height.

Case: Cell Phone is inside the Cabinet at the upper shelf.
Response: Cell Phone is above my head.

Table 8: Common Responses of Test Subjects

8.5 System response development

The responses of the test subjects were analyzed, and the set of system responses was designed. These system responses were verified with the persons, and all the test scenarios were found conclusive with a 96% success rate. Table 9 presents the responses for a case in which the cell phone is on the table, near and far from the person. The distance of the object from the person in the far case is 3 walking steps. Similar responses are generated by the system for the other test case scenarios.
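Each generated response follows a fixed template: a direction phrase selected by sector, the supporting object, and either "within hand's reach" or a walking-step count. A sketch of that assembly (phrases abridged from Table 9; the function and parameter names are illustrative):

```cpp
#include <string>

// Direction phrase for each of the 12 sectors (abridged from Table 9).
static const char* kDirection[12] = {
    "almost in front, and slightly to your right",
    "in front of you to the right",
    "straight right to you, slightly in front",
    "straight right to you, slightly behind",
    "behind you to the right",
    "straight behind you, slightly to your right",
    "straight behind you, slightly to your left",
    "behind you to the left",
    "straight left to you, slightly behind",
    "straight left to you, slightly in front",
    "in front of you to the left",
    "almost in front, and slightly to your left",
};

// Assemble a system response from the sector, the supporting object,
// and the walking distance (0 steps means within hand's reach).
std::string makeResponse(const std::string& object, int sector,
                         const std::string& surface, int steps) {
    std::string r = object + " is " + kDirection[sector - 1] +
                    ", on the " + surface;
    if (steps == 0)
        return r + ", within hand's reach";
    return r + ". Please walk for " + std::to_string(steps) +
           " steps to reach the " + object;
}
```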

(Near: Cell Phone, near to the Person. Far: Cell Phone, at a distance to the Person.)

Sector 1
Near: Cell Phone is almost in front, and slightly to your right, on the Table, within hand's reach.
Far: Cell Phone is almost in front and slightly to your right, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 2
Near: Cell Phone is in front of you to the right, on the Table, within hand's reach.
Far: Cell Phone is in front of you to the right, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 3
Near: Cell Phone is straight right to you, slightly in front, on the Table, within hand's reach.
Far: Cell Phone is straight right to you, slightly in front, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 4
Near: Cell Phone is straight right to you, slightly behind, on the Table, within hand's reach.
Far: Cell Phone is straight right, slightly behind to you, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 5
Near: Cell Phone is behind you to the right, or in the south-east direction, on the Table, within hand's reach.
Far: Cell Phone is behind you to the right, or in the south-east direction, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 6
Near: Cell Phone is straight behind you, slightly to your right, on the Table, within hand's reach.
Far: Cell Phone is straight behind you, slightly to your right, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 7
Near: Cell Phone is straight behind you, slightly to your left, on the Table, within hand's reach.
Far: Cell Phone is straight behind you, slightly to your left, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 8
Near: Cell Phone is behind you to the left, or in the south-west direction, on the Table, within hand's reach.
Far: Cell Phone is behind you to the left, or in the south-west direction, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 9
Near: Cell Phone is straight left to you, slightly behind, on the Table, within hand's reach.
Far: Cell Phone is straight left, slightly behind to you, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 10
Near: Cell Phone is straight left to you, slightly in front, on the Table, within hand's reach.
Far: Cell Phone is straight left to you, slightly in front, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 11
Near: Cell Phone is in front of you to the left, on the Table, within hand's reach.
Far: Cell Phone is in front of you to the left, on the Table. Please walk for 3 steps to reach the Cell Phone.

Sector 12
Near: Cell Phone is almost in front, and slightly to your left, on the Table, within hand's reach.
Far: Cell Phone is almost in front and slightly to your left, on the Table. Please walk for 3 steps to reach the Cell Phone.

Table 9: System Response for a Scenario

This concludes our discussion of the system response study. The next chapter presents the test cases conducted on two test subjects to evaluate the effectiveness of the system.


CHAPTER NINE

SYSTEM TESTING

9.1 Introduction

This chapter details the testing of the system based on different scenarios. The responses of the test subjects were compared with the ideal responses. The ideal response in each scenario is the one with which the person successfully finds the object in the room.

9.2 Test Scenario Development

The test scenarios were developed based on the difficulty of finding and reaching the object. The object may either be within hand's reach of the person, or some walking and reorientation by the person may be required. The ideal responses of the application were generated by the test case study described in chapter 8.

For the application testing we developed 10 test cases, which were assigned to each of the

test subjects. Initially the person is oriented in a certain direction and is asked a series of queries

about the objects within the room; each query posed to the person is a test scenario. The person

wears the head-mounted device, immersing him/her in the virtual environment, and based on the

system response he/she tries to reach the object in the virtual scene. Since the virtual scene is an

exact replica of the actual room prototype, when the person is able to reach the object in the

virtual scene, the object in the actual room should also be within the person's reach. If this

sequence of actions is performed successfully, the test scenario is marked as successful; if not,

the error is noted for that test case. Figure 23 shows the top view of the objects in the room and

the person's initial position and orientation (indicated by the yellow arrow). The test cases used

in this study, and the responses provided by the test subjects, follow.
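The pass/fail bookkeeping described above can be sketched in code as follows. This is a minimal illustration only; the `ScenarioOutcome` struct and its field names are hypothetical and do not appear in the VR-SMART implementation:

```cpp
#include <string>

// Hypothetical record of one test scenario: the direction and step count
// stated in the system response, and the ones reported by the test subject.
struct ScenarioOutcome {
    std::string systemDirection;   // e.g. "front-left"
    std::string subjectDirection;  // direction the subject reported
    int systemSteps;               // steps the system asked the person to walk
    int subjectSteps;              // steps the subject actually walked
};

// The direction data is correct when the subject's report matches the response.
inline bool directionCorrect(const ScenarioOutcome& o)
{
    return o.systemDirection == o.subjectDirection;
}

// The distance data is correct when the walked step count matches the response.
inline bool distanceCorrect(const ScenarioOutcome& o)
{
    return o.systemSteps == o.subjectSteps;
}

// A scenario is marked successful only when both direction and distance match.
inline bool scenarioSuccessful(const ScenarioOutcome& o)
{
    return directionCorrect(o) && distanceCorrect(o);
}
```

With this bookkeeping, a scenario such as Test Scenario 1 (correct direction, one extra walking step) would register a distance error and therefore an unsuccessful scenario.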


Figure 23: Initial Orientation of the Person in Room

9.3 Test Scenario 1

Query: Find the cell phone.

System Response: Cell phone is in front of you to the left inside the cabinet. Please walk for 2 steps to reach to the Cell Phone.

Subject1 Response: Cell Phone is in front of me to the left inside the cabinet. I reached the cell phone after walking 3 steps.

Subject2 Response: Cell Phone is in front of me to the left inside the cabinet. I reached the cell phone after walking 2 steps.

Result: Error in walking steps for Subject 1. The orientation of the object relative to the person is correct.


Figure 24 shows the orientation of the person after completion of test scenario 1.

Figure 24: Person's Orientation after Test Scenario 1

9.4 Test Scenario 2

Query: Find the cell phone.

System Response: Cell phone is almost in front of you to the right inside the cabinet within hand's reach.

Subject1 Response: Cell Phone is directly in front of me to the left inside the cabinet.

Subject2 Response: Cell Phone is directly in front of me to the left inside the cabinet.

Result: Test scenario successful. The subjects are able to find the object within hand's reach.


9.5 Test Scenario 3

The cell phone is moved inside the cabinet to the lower shelf, to the left of the person.

Query: Find the cell phone.

System Response: Cell phone is in front of you to the left inside the cabinet. Please walk for 2 steps to reach to the Cell Phone.

Subject1 Response: Cell Phone is almost in front of me to the left inside the cabinet within hand's reach.

Subject2 Response: Cell Phone is almost in front of me to the left inside the cabinet within hand's reach.

Result: Error in orientation and distance of the object relative to the person.

9.6 Test Scenario 4

The cell phone is moved inside the cabinet to the lower shelf, to the right of the person.

Query: Find the cell phone.

System Response: Cell phone is in front of you to the right inside the cabinet. Please walk for 2 steps to reach to the Cell Phone.

Subject1 Response: Cell Phone is in front of me to the right inside the cabinet, 2 steps ahead.

Subject2 Response: Cell Phone is in front of me to the right inside the cabinet, 1 step ahead.

Result: Error in walking steps for Subject 2. Orientation is correct.

Figure 25 shows the person's orientation after test scenario 4.


Figure 25: Person's orientation after test scenario 4

9.7 Test Scenario 5

Query: Find the Chair.

System Response: Chair is behind you to the right, OR in south east direction, on the Floor within hand's reach.

Subject1 Response: Chair is straight to my right within hand's reach.

Subject2 Response: Chair is directly to my right within hand's reach.

Result: Error in the chair's orientation relative to the person. Distance information is correct.

9.8 Test Scenario 6

The person is sitting on the chair.

Query: Find the Cabinet.

System Response: Cabinet is in front of you to the right, on the Floor. Please walk for 2 steps to reach to the Cabinet.

Subject1 Response: Cabinet is in front of me to the right, on the Floor. I can reach the cabinet in 2 steps.

Subject2 Response: Cabinet is in front of me to the right, on the Floor. I can reach the cabinet in 2 steps.

Result: Test case successful.

Figure 26 shows the person's orientation after test scenario 6.

Figure 26: Person's orientation after test scenario 6


9.9 Test Scenario 7

The person is sitting on the chair.

Query: Find the Table.

System Response: Table is in front of you to the left, on the Floor. Please walk for 2 steps to reach to the Cabinet.

Subject1 Response: Table is in front of me to the left, on the Floor within hand's reach.

Subject2 Response: Table is in front of me to the left, on the Floor within hand's reach.

Result: Orientation of the table is correct. Distance to the table is erroneous.

9.10 Test Scenario 8

The person is standing near the chair. The orientation of the person is the same as in test scenario 7.

Query: Find the Medicine Box.

System Response: Medicine Box is in front of you to the left, on the Table. Please walk for 2 steps to reach to the Medicine Box.

Subject1 Response: Medicine Box is in front of me to the left, on the Table, 1 step ahead.

Subject2 Response: Medicine Box is straight left of me, on the Table, 1 step ahead.

Result: Error in orientation and distance of the medicine box relative to the person.

9.11 Test Scenario 9

The person is standing near the table. Figure 27 shows the person's orientation.

Query: Find the Medicine Box.

System Response: Medicine Box is behind you to the left, OR in south west direction, on the Table. Please walk for 2 steps to reach to the Medicine Box.

Subject1 Response: Medicine Box is slightly behind me to my left, on the Table, 1 step ahead.

Subject2 Response: Medicine Box is slightly behind me to my left, on the Table, 2 steps ahead.

Result: Error in orientation and distance of the medicine box relative to the person.

Figure 27: Person's orientation for test scenario 9

9.12 Test Scenario 10

The person is standing near the cabinet. Figure 28 shows the person's orientation.

Query: Find the Medicine Box.

System Response: Medicine Box is behind you to the right, OR in south east direction, on the Table. Please walk for 2 steps to reach to the Medicine Box.

Subject1 Response: Medicine Box is behind me to my right, on the Table, 2 steps ahead.

Subject2 Response: Medicine Box is behind me to my right, on the Table, 2 steps ahead.

Result: Test scenario successful.

Figure 28: Person's orientation for test scenario 10

9.13 Test Results

The following table summarizes the results of each test scenario:

Test Scenario   Error in Direction   Error in Walking Distance   Test Scenario Successful?

 1              NO                   YES                         NO
 2              NO                   NO                          YES
 3              YES                  YES                         NO
 4              NO                   YES                         NO
 5              YES                  NO                          NO
 6              NO                   NO                          YES
 7              NO                   YES                         NO
 8              YES                  YES                         NO
 9              YES                  YES                         NO
10              NO                   NO                          YES

Table 10: Result Summary from Test Scenarios
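The success percentages shown in the pie charts can be recomputed directly from Table 10. The sketch below is illustrative only; the `ScenarioResult` type and function names are ours, not part of the VR-SMART code:

```cpp
#include <array>

// Per-scenario error flags transcribed from Table 10
// (true = the system response contained that kind of error).
struct ScenarioResult {
    bool directionError;
    bool distanceError;
};

// Percentage of the 10 scenarios whose direction (or distance) data was correct.
inline int successPercent(const std::array<ScenarioResult, 10>& results,
                          bool checkDistance)
{
    int correct = 0;
    for (const ScenarioResult& s : results)
        if (!(checkDistance ? s.distanceError : s.directionError))
            ++correct;
    return correct * 10; // 10 scenarios, so each one is worth 10%
}

// A scenario counts as an overall success only when both the direction
// and the walking distance in the system response were correct.
inline int overallSuccessPercent(const std::array<ScenarioResult, 10>& results)
{
    int correct = 0;
    for (const ScenarioResult& s : results)
        if (!s.directionError && !s.distanceError)
            ++correct;
    return correct * 10;
}
```

Feeding in the ten rows of Table 10 reproduces the 60% direction, 40% distance, and 30% overall success figures plotted in Figures 29 to 31.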

Figure 29 presents a pie chart for test scenario success in the system-generated response related to direction data.

Figure 29: Pie Chart for Test Scenario Success in Direction Data (Success 60%, Failure 40%)


Figure 30 shows the pie chart for test scenario success in the system-generated response related to distance data.

Figure 30: Pie Chart for Test Scenario Success in Distance Data (Success 40%, Failure 60%)

Figure 31 shows the pie chart for overall successful test scenarios.

Figure 31: Proportion of Successful Test Scenarios (Success 30%, Failure 70%)


CHAPTER TEN

CONCLUSION

10.1 Introduction

This chapter summarizes the results gathered from the application testing and suggests

ways to improve system performance.

10.2 Result Analysis

The previous chapter presented the results of the test scenarios. We found that the overall

system success rate was 30%, while the success rates for the system-generated direction and

distance data were 60% and 40% respectively.

The first major reason for the low success rate is erroneous bird/sensor data. Since the

person's distance and orientation relative to the object rely on the sensor data, sensor error

produces wrong input to the system, which in turn generates the wrong response. We have

introduced some calibration factors for the sensor data, but this is not sufficient. As the

calibration of the bird data improves, so does the accuracy of the system response in evaluating

an object's location and orientation with respect to the person.

The second reason affecting the system performance is that walking step length differs

from person to person. We used an average walking step length for every person, but each

person's normal step length varies. This causes a difference of +/- 1 walking step between the

system response and the person following it when the response specifies up to 4 walking steps.

If we used each person's own walking step length, we could alleviate the errors in the distance

data of the object relative to the person.
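As a sketch of the suggested fix, the step count in the system response could be derived from a per-person calibrated step length rather than a fixed average. The function below is illustrative only; the 0.76 m default is an assumed average step length, not a value taken from this work:

```cpp
#include <cmath>

// Hypothetical helper: convert a straight-line distance (in meters) to the
// number of walking steps the system should announce. stepLengthMeters would
// come from a per-person calibration instead of the fixed average currently used.
inline int distanceToSteps(double distanceMeters, double stepLengthMeters = 0.76)
{
    if (distanceMeters <= 0.0 || stepLengthMeters <= 0.0)
        return 0;
    // Round up: a partial step still requires the person to take a step.
    return static_cast<int>(std::ceil(distanceMeters / stepLengthMeters));
}
```

Two people standing the same distance from an object would then receive different step counts matched to their own gait, removing the +/- 1 step discrepancy observed in the tests.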

The speech recognition technology used in our work depends on a person's audio profile

created on the machine running the application. Since the test subjects had not created their

audio profiles on the machine, there were problems when they issued audio queries to the

system. We suggest that each person train his/her voice for the speech recognition module so

that the system can understand the audio query more reliably.

The VR-SMART system brings the capabilities of CAD and VR to smart environment

development. The system incorporates a resident's feedback on the design of the smart

environment and its assistive methods; thus, it puts the resident in the loop of smart environment

development and addresses the customization demands or concerns of an individual. This work

utilizes a six-degree-of-freedom tracking system which records the movement of the resident

and of objects without any privacy intrusion. An individual can get a feel for the smart home

through the immersive environment before moving into the actual home, and the system

provides training in the various smart home capabilities. In this way, a resident feels more

confident in using the smart home functionalities.

It is easy to model the virtual reality scene for any room because the CAD designer only

needs to model each object in the CAD system once, based on the actual measurements of the

room. We therefore have a fast and reliable method to customize the application for different

room settings. A person enjoying the benefits of a smart home would find it easy to interact

with the system through audio queries.

10.3 Future Work

The VR-SMART system has the capability to integrate with existing smart home systems.

We have provided an interface through which other smart home systems can send signals/data

related to electrical appliance states or health monitoring systems. These states can be visualized

in the virtual environment when an event is triggered by the external systems.


The VR-SMART system can be extended to incorporate gesture recognition technology for

persons with speech disorders, helping them perform daily tasks such as switching on the light

or controlling household appliances. Likewise, if a person with a walking disability asks a

query, a wheelchair could take the system's response directly and move the person from one

place to another safely.


APPENDIX A

Core Code in VR-SMART system


Voice Recognition System

/*******COM pointers, Speech SDK specific. **********/

/*The recognition context, CComPtr<ISpRecoContext> is created for the engine.

A context is any single area of the application needing to process speech.

In our application, only one recognition context is used.*/

CComPtr<ISpRecoContext> g_cpRecoCtxt;

/*A grammar specifies what the speech recognizer will recognize. Essentially

there are two types of grammars. One is for dictation and the other is command

and control. Our Application uses the dictation command because the text

generated from speech is sufficient to identify the query object.*/

CComPtr<ISpRecoGrammar> g_cpCmdGrammar;

/* The recognizer object, CComPtr<ISpRecognizer> is created with an

in-process option. g_cpEngine, is a Pointer to our recognition engine instance*/

CComPtr<ISpRecognizer> g_cpEngine;

/******Converting Audio to Text************/

int VRApplication::getQuery()

{

HRESULT hr;

CComPtr<ISpRecoResult> cpResult;

//The loop is iterated till BlockForResult() receives any audio event

//successfully. cpResult contains the audio stream.

while (SUCCEEDED(hr = BlockForResult(g_cpRecoCtxt, &cpResult)))

{

CSpDynamicString dstrText; //Output wide char text string.

if (SUCCEEDED(cpResult->GetText(SP_GETWHOLEPHRASE,

SP_GETWHOLEPHRASE, TRUE, &dstrText, NULL)))

{

USES_CONVERSION;

question = new char[120];

strcpy(question, W2A(dstrText));

break;

}

}

return (hr==S_OK);

}


HRESULT VRApplication::BlockForResult(ISpRecoContext * pRecoCtxt, ISpRecoResult **

ppResult)

{

HRESULT hr = S_OK;

CSpEvent event;

//The loop is iterated till any audio event is recognized.

//The audio stream is stored in the recognition context.

while (SUCCEEDED(hr) && SUCCEEDED(hr = event.GetFrom(g_cpRecoCtxt)) &&

hr == S_FALSE)

{

hr = g_cpRecoCtxt->WaitForNotifyEvent(INFINITE);

}

//store the audio stream in result variable.

*ppResult = event.RecoResult();

if (*ppResult)

{

(*ppResult)->AddRef();

}

return hr;

}

Text-To-Speech System


/*****************Initialization for Audio Commands*******************/

void VRApplication::convertTextToAudio(char* response)

{

ISpVoice *pVoice = NULL;//Pointer for handling audio stream.

size_t inputsize = strlen(response) + 1;

const size_t newsize = 200;


size_t convertedChars = 0;

wchar_t wcstring[newsize]; //The input wide char text string to be

//converted into audio stream.

//converting a char string into a wide char string.

mbstowcs_s(&convertedChars, wcstring, inputsize, response, _TRUNCATE);

//Initializing the ISpVoice pointer for managing the audio stream.

HRESULT hr = CoCreateInstance(CLSID_SpVoice, NULL, CLSCTX_ALL,

IID_ISpVoice, (void **)&pVoice);

if( SUCCEEDED( hr ) )

{

pVoice->SetRate(-4); //Speed at which the text is spoken.

pVoice->SetVolume(50); //Audio volume.

//Speaking the input text string.

hr = pVoice->Speak(wcstring, SPF_DEFAULT , NULL);

//Releasing the resources held by pointer.

pVoice->Release();

pVoice = NULL;

}

return;

}

Collision-Detection system

Ogre::String& VRApplication::collisionDetection(Ogre::String& objName, Ogre::String&

fName, Ogre::String& lName)

{

//Executing the intersection query initialized in createScene() function.

IntersectionSceneQueryResult& queryResult = intersectionQuery->execute();

//A list of pairs(object-object).

SceneQueryMovableIntersectionList iList = queryResult.movables2movables;

int listSize = iList.size();

//Iterator for visiting the pairs in the list.

SceneQueryMovableIntersectionList::iterator iter;

int objType;

//Getting the type of object. If object is table or the cabinet, it is on the floor.

mEventMgr.getType(objName.c_str(), &objType);

if(objType == _Table || objType == _Cabinet)


{

lName = "floor";

return lName;

}

for ( iter = iList.begin(); iter != iList.end(); iter++ )

{

fName = iter->first->getName();

lName = iter->second->getName();

//The target object name will be either in 'fName' or in 'lName'.

if(objName.compare(fName)==0 || objName.compare(lName)==0)

break;

}

if(objName.compare(fName)==0)

//If object name is found in 'fname' then it is in collision with the name stored in

//'lname'.

return lName;

else if(objName.compare(lName)==0)

//If object name is found in 'lname' then it is in collision with the name

//stored in 'fname'

return fName;

else

{

lName = "floor"; //Otherwise, if the object is not colliding with other object then

//it is on the floor.

return lName;

}

}

OGRE objects creation

/*****Creating the OGRE objects corresponding to CAD objects.*******/

void VRApplication::createObjects()

{

ifstream infile; //for reading room_2.DAT file.

int loc=-1, i=0, found =0, idx=1;

string str, tmp;

char name[20];

float matrix[4][4];

char entity_name[20];

/***Setting up the path for reading the transformation file***/


char *path = "C:\\MODELS\\Output_vade_models\\room_2\\ASSEMBLY_INFO\\";

char *temp_path = new char[strlen(path)+strlen("room_2.DAT")+1];

strcpy(temp_path, path);

strcat(temp_path, "room_2.DAT");

infile.open(temp_path);

//A person entity created in virtual scene.

personEnt = mSceneMgr->createEntity("person", "ninja.mesh");

personNode = mSceneMgr->getRootSceneNode()->createChildSceneNode();

personNode->attachObject(personEnt);

personNode->translate(Vector3(73.5,1,-15));

personNode->scale(Vector3(0.01f, 0.01f, 0.01f));

Vector3 obj_pos;

Ogre::Quaternion obj_ori;

obj_pos = personNode->getPosition();

obj_ori = personNode->getOrientation();

//Setting up the p_object structure for the person node.

strcpy(p_object[0].object_name,"PERSON");

p_object[0].pos[0] = obj_pos[0];

p_object[0].pos[1] = obj_pos[1];

p_object[0].pos[2] = obj_pos[2];

p_object[0].ori[0] = obj_ori[0];

p_object[0].ori[1] = obj_ori[1];

p_object[0].ori[2] = obj_ori[2];

p_object[0].ori[3] = obj_ori[3];

while(!infile.eof())

{

getline(infile, str);

if(str.find("PRT")!=string::npos|| str.find("ASM")!=string::npos)

{

found = 0;

/*From the transformation file, only the transformation matrices of the ASM

components and of the FLOOR, WALL and CEILING part components are used.*/

if(str.find("FLOOR")!=string::npos || str.find("WALL")!=string::npos

|| str.find("CEILING")!=string::npos || str.find("ASM")!=string::npos)

{

loc = str.find('.');

tmp.assign(str,0,loc);

user_string_to_lower(tmp.c_str(), name);

strcpy(entity_name, name);

strcat(name, ".mesh");

i=0;

found = 1;


entity_num++; //entity_num maintains the number of entities in the virtual scene.

continue;

}

else

continue;

}

if(found)

{

sscanf(str.c_str(),"%f %f %f %f",&matrix[i][0], &matrix[i][1],

&matrix[i][2], &matrix[i][3]);

if(i==3)

{

Ogre::Matrix3 m(matrix[0][0],matrix[0][1],matrix[0][2],

matrix[1][0],matrix[1][1],matrix[1][2],

matrix[2][0],matrix[2][1],matrix[2][2]);

Ogre::Quaternion q(m);

p_ent[entity_num] = mSceneMgr->createEntity(entity_name,

name);

//A different texture is applied for each type of object.

if(strcmp(entity_name, "table")==0)

p_ent[entity_num]->setMaterialName("Examples/TableTexture");

else if( strcmp(entity_name, "cabinet")==0 )

p_ent[entity_num]->setMaterialName("Examples/ChairTexture");

else if( strcmp(entity_name, "cellphone")==0 )

p_ent[entity_num]->setMaterialName("Examples/PhoneTexture");

else if(strncmp(entity_name, "chair", 5)==0)

p_ent[entity_num]->setMaterialName("Examples/ChairTexture");

else if(strcmp(entity_name, "floor")==0 )

p_ent[entity_num]->setMaterialName("Examples/FloorTexture");

else if(strcmp(entity_name, "ceiling")==0 )

p_ent[entity_num]->setMaterialName("Examples/FloorTexture");

else

p_ent[entity_num]->setMaterialName("Examples/PhoneTexture");

//Each Ogre entity is attached to a scene node.

p_node[entity_num] = mSceneMgr->getRootSceneNode()->createChildSceneNode();

p_node[entity_num]->attachObject(p_ent[entity_num]);

//Transformation matrices are applied to each scene node

p_node[entity_num]->rotate(q);

p_node[entity_num]->translate(matrix[3][0], matrix[3][1],matrix[3][2]);


obj_pos = p_node[entity_num]->getPosition();

obj_ori = p_node[entity_num]->getOrientation();

idx = entity_num+1; //To maintain the index of objects for the EventManager.

//Setting up the data into p_object structure which will be passed to EventManager.

strcpy(p_object[idx].object_name,tmp.c_str());

p_object[idx].pos[0] = obj_pos[0];

p_object[idx].pos[1] = obj_pos[1];

p_object[idx].pos[2] = obj_pos[2];

p_object[idx].ori[0] = obj_ori[0];

p_object[idx].ori[1] = obj_ori[1];

p_object[idx].ori[2] = obj_ori[2];

p_object[idx].ori[3] = obj_ori[3];

}

i++;

}

}

/*Read the birds.dat file for objects-bird association*/

loadBirdConfigFile("birds.DAT");

//initialize the locations of the room objects based on the CAD model.

mEventMgr.setCurrentLocation(p_object, idx);

/*create the intersection query object to be utilized in collisionDetection() function*/

intersectionQuery = mSceneMgr->createIntersectionQuery();

//cleanup of temp_path

if(temp_path)

{

delete[] temp_path;

infile.close();

}

return;

}