A STEREOSCOPIC 3D VISUALIZATION ENVIRONMENT FOR AIR TRAFFIC CONTROL

AN ANALYSIS OF INTERACTION AND A PROPOSAL OF NEW INTERACTION TECHNIQUES

A Thesis Presented

by

Nguyen-Thong DANG

In Partial Fulfillment of the Requirements for the Degree

Doctor of Philosophy in Computer Science

Ecole Pratique des Hautes Etudes 2005

Dedicated to Xuân and Sao Mai for their love and support through the years

ACKNOWLEDGEMENTS

I would like to take this opportunity to thank Dr. Vu Duong, Head of the Innovative Research Area of the EUROCONTROL Experimental Centre, for giving me the chance to work with Virtual Environments and 3D User Interfaces, for always supporting and guiding me in my research, and for helping me to understand the meaning of research. This is the most valuable thing I was able to gain during my thesis.

I would like to thank Professor Marc Bui, who accepted to supervise this work. Thank you for your support and your precious advice throughout the thesis, for helping me improve my presentation skills, and for always being available for my work.

I would like to express my thanks to Dr. Wendy Mackay, Director of Research at INRIA (the French National Institute for Research in Computer Science and Control), and Professor Patrick Bellot from ENST (Ecole Nationale Supérieure des Télécommunications), who took an interest in my work and accepted to review my thesis.

I would like to thank Dr. Francois Jouen, Director of Studies at the Ecole Pratique des Hautes Etudes, and Professor Anders Ynnerman from Linköping University for their interest in my work and for accepting to be members of the committee for the defense of my thesis.

Special thanks to my teammate Monica Tavanti for spending hours reading my thesis, correcting my English, and providing detailed comments and useful advice. Monica, without your help I could not have finished this thesis. Thank you also for bringing to the team your knowledge of human factors and usability evaluation. It was a great chance for me to work in a team with you and with Ha Le-Hong. Thanks also to Ha for his jokes, which always brought an enjoyable atmosphere to the team.

Thanks to Marc Bourgois, Deputy Manager of the Innovative Research Area of the EUROCONTROL Experimental Centre, for devoting your time to reading my thesis and for giving me many precious comments that helped me finalize it.

Thanks to Martina Jurgen for helping me correct my English and for her daily support and help at the EUROCONTROL Experimental Centre.

Thanks to “Santa Claus” Horst HERING for his daily support and for his availability every time I needed help.

Thanks to all the EEC students - Peter CHOROBA, Antonia COKASOVA, Claus GWIGGNER, Frederic FERCHAUD, Nabil BELOUARDY, Elzbieta PINSKA, Sophie MAGNIER, Sonja STRAUSSBERGER, Magnus AXHOLT, Stephen PETERSON, and many others - who shared their knowledge in different fields with me during the Friday meetings and helped me improve my communication skills.

Thanks to the many volunteers who served as test subjects for devoting their time to the experiments and for providing me with much useful feedback.

Thanks also to my friends Linh, Minh, Cúc, Huy, Sĩ Hoàng, Liên, Công and many others who helped me through the difficult times.

Above all, my greatest thanks go to my family: my wife Xuân and my little girl Sao Mai. Xuân, thank you for always supporting me and taking care of me with your love, for the many brilliant ideas that helped me so much in my work, for always being with me through the difficult times, and for always making sure that I was never hungry while working. My little Sao Mai, your presence gives me the strength to go on and pursue the research.

I am deeply indebted to all of these people, and to others whom I am sure I have forgotten. I hope that they will pardon me.

ABSTRACT

This thesis is part of an investigation of the applicability of Virtual Environment-based three-dimensional (3D) user interfaces to air traffic control (ATC) (in short, a 3D-ATC environment), conducted through a multidisciplinary framework that takes the assessment of human factors as a central design principle for visualisation and interaction. The thesis examines the interaction features of a 3D-ATC environment from three angles: the main interaction tasks in such an environment, the interaction techniques that can be used to perform these tasks, and the interaction issues specific to a 3D-ATC environment.

To analyse interaction tasks and interaction techniques, the thesis considers a 3D-ATC environment as a combination of four components: Object, View, Interaction Task and Interaction Technique. The analysis identifies appropriate interaction techniques for two contexts: interaction tasks performed on the Object and interaction tasks performed on the View. It results in different combinations of interaction techniques per task, in which ray-casting is recognized as the key technique for the object selection and manipulation tasks, while the flying vehicle technique combined with wayfinding aids, as well as the World-In-Miniature technique, are good candidates for the navigation task.
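
The ray-casting metaphor referred to above can be summarised as pointing a virtual ray into the scene and selecting the first object the ray hits. The following minimal Python sketch illustrates that idea only; it is not the prototype code used in this work, and the object fields ("centre", "radius", i.e. a bounding-sphere approximation) are assumptions made for the example.

    import numpy as np

    def ray_sphere_hit(origin, direction, centre, radius):
        """Distance along the ray to the bounding sphere, or None if the ray misses it."""
        d = direction / np.linalg.norm(direction)
        oc = centre - origin
        t_closest = np.dot(oc, d)                   # projection of the sphere centre onto the ray
        if t_closest < 0:
            return None                             # the sphere lies behind the pointer
        dist_sq = np.dot(oc, oc) - t_closest ** 2   # squared ray-to-centre distance
        if dist_sq > radius ** 2:
            return None                             # the ray passes outside the sphere
        return t_closest - np.sqrt(radius ** 2 - dist_sq)

    def pick_by_ray(origin, direction, objects):
        """Return the object whose bounding sphere is hit first by the ray, or None."""
        hits = [(ray_sphere_hit(origin, direction, o["centre"], o["radius"]), o) for o in objects]
        hits = [(t, o) for t, o in hits if t is not None]
        return min(hits, key=lambda h: h[0])[1] if hits else None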

Early usability evaluations of the ray-casting technique revealed the problem of selecting occluded objects (in short, the problem of occlusion), which existing interaction techniques do not solve well. A new approach, Selection-By-Volume, which combines a geometric selection volume with a two-dimensional menu system for interacting with 3D objects, was proposed to address it. An empirical evaluation of the Transparent Sphere and the Transparent Cylinder, two interaction techniques derived from the Selection-By-Volume approach, was carried out. The experimental results show that these two techniques solve the problem of occlusion and can be used for interaction with 3D objects in general.
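
To make the Selection-By-Volume idea concrete, the sketch below (again illustrative Python under the same assumed object representation, not the implementation evaluated in this thesis) gathers every object contained in a spherical selection volume and exposes the candidates as 2D menu labels, so that an occluded object such as P09 can be selected by its label instead of by direct pointing.

    import numpy as np

    def objects_in_sphere(volume_centre, volume_radius, objects):
        """Objects whose centres fall inside the spherical selection volume."""
        return [o for o in objects
                if np.linalg.norm(o["centre"] - volume_centre) <= volume_radius]

    def build_menu(candidates):
        """Map menu labels to candidate objects; picking a label selects the object."""
        return {o["label"]: o for o in candidates}

    # Example: P09 is hidden behind P01 from the viewpoint, yet remains selectable via the menu.
    scene = [{"label": "P01", "centre": np.array([0.0, 0.0, 1.0])},
             {"label": "P09", "centre": np.array([0.0, 0.0, 1.4])}]
    menu = build_menu(objects_in_sphere(np.array([0.0, 0.0, 1.2]), 0.5, scene))
    selected = menu["P09"]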

Keywords: Air Traffic Control, Virtual Environment, 3D User Interfaces, Human-Computer Interaction, 3D Interaction, 3D Interaction Technique.

RÉSUMÉ

The overload of air traffic, with traffic growth of around 5% to 6% per year in Europe and the United States and of more than 10% in Asia, poses a difficult problem with regard to changing the work organization of air traffic controllers and the associated operational procedures. Several attempts have been made to improve controllers' working conditions, operational modes and air traffic control methods; however, significant results are still awaited. One solution is to change the interface of the controllers' information system to a three-dimensional (3D) environment, so that controllers can manage air traffic directly in 3D.

Instead of watching everything that happens on a two-dimensional radar screen, on which each aircraft is represented as a point and the airspace as a flat surface, controllers immersed in a 3D environment can observe and act on aircraft and airspace in a way that is closer to reality. They can act naturally, using gestures to touch and manipulate virtual representations of flights, trajectories, etc., in order to resolve conflicts between aircraft and to plan the traffic.

The aim is therefore to apply virtual reality techniques to the air traffic controllers' interface. This problem lies at the intersection of three domains: air traffic control (ATC), virtual reality and human-computer interaction. The goal is to exploit research and technological advances in virtual reality, building on human-computer interface design principles and a human-factors approach, in order to provide air traffic controllers with a new system better adapted to possible changes in the organization of their work and to future air traffic control systems.

A multidisciplinary approach was proposed to study the feasibility of such a three-dimensional interface. This approach comprises three components: three-dimensional interaction, three-dimensional information visualization, and human factors.

This thesis studies the three-dimensional interaction component. The work consists in analysing all the possible interaction tasks of controllers within a 3D interface, in identifying existing interaction techniques suited to interaction with each ATC object (airspace, sectors, flights, trajectories, etc.), and in finding solutions, that is, new interaction techniques, for the 3D interaction problems specific to the ATC domain.

The thesis develops two main contributions.

First, a detailed analysis of the interaction tasks and of the interaction techniques that can be used for interaction in the 3D interface for controllers is proposed. This analysis was carried out by considering the 3D interface as a combination of four components: Object, View, Interaction Task and Interaction Technique. Navigation is the interaction task performed on the View, whereas selection and manipulation are the interaction tasks performed on the Object. The manipulation task consists of three sub-tasks: object placement, modification of object attributes, and querying object information.

The analysis showed that the ray-casting interaction technique is appropriate for the selection and manipulation of ATC objects, and that the combination of the Flying Vehicle technique with wayfinding aids, as well as the World-In-Miniature technique, are suitable for navigation in the three-dimensional ATC space. More precisely, four options for object selection and manipulation, built around the ray-casting technique, were proposed:

• The ray-casting technique for the object selection and object placement tasks; the combination of ray-casting and a graphical menu for the object modification and informative query tasks.

• The ray-casting technique for the object selection task, combined with a graphical menu for the object modification and informative query tasks; the virtual hand interaction technique for object placement.

• Voice commands for the object selection task; the ray-casting technique for the object placement task; the combination of ray-casting and a graphical menu for the object modification and informative query tasks.

• Voice commands for the object selection task; the virtual hand interaction technique for object placement; the combination of ray-casting and a graphical menu for the object modification and informative query tasks.

Second, a new approach named "Selection-By-Volume" and two new interaction techniques named "Transparent Sphere" and "Transparent Cylinder" are proposed to solve the problem of occluded objects. This problem was identified through evaluations of the ray-casting technique for the selection and manipulation task. The two techniques were evaluated experimentally to compare their performance with existing ones. The experimental results showed that these two metaphors correctly address the problem of selecting occluded objects, with good performance, and can be used for object selection and manipulation in a virtual environment.

Keywords: Air Traffic Control, Virtual Environment, Three-Dimensional Interface, Human-Computer Interaction, Three-Dimensional Interaction, Three-Dimensional Interaction Technique.

TABLE OF CONTENTS

A STEREOSCOPIC 3D VISUALIZATION ENVIRONMENT FOR AIR TRAFFIC CONTROL ..... 1
ACKNOWLEDGEMENTS ..... 5
ABSTRACT ..... 7
RÉSUMÉ ..... 9
TABLE OF CONTENTS ..... 11
LIST OF TABLES ..... 15
LIST OF ILLUSTRATIONS ..... 17
CHAPTER 1 - INTRODUCTION ..... 23
1.1 CONTEXT OF RESEARCH ..... 23
1.2 MULTIDISCIPLINARY FRAMEWORK FOR EMPIRICAL ANALYSIS ..... 24
1.3 THESIS ..... 25
1.3.1 Research Hypothesis and Domains of Investigation ..... 25
1.3.2 Major Contributions ..... 27
1.4 STRUCTURE OF THE THESIS ..... 27
CHAPTER 2 - 3D USER INTERFACES AND AIR TRAFFIC CONTROL ..... 31
2.1 3D USER INTERFACES ..... 31
2.1.1 Depth Cues ..... 31
2.1.2 Shutter Glasses ..... 32
2.1.3 Hardware Technologies for 3D Interfaces ..... 32
2.2 AIR TRAFFIC CONTROL ..... 36
2.2.1 Air Traffic Controllers ..... 37
2.2.2 How can a flight be controlled by ATCOs? ..... 38
2.2.3 ATCOs activities ..... 39
2.2.4 Air Traffic Controller’s Interface ..... 41
2.3 3D USER INTERFACES FOR ATC – RELATED WORKS ..... 41
CHAPTER 3 - INTERACTION TASKS AND INTERACTION TECHNIQUES IN 3D USER INTERFACES ..... 45
3.1 INTRODUCTION ..... 45
3.2 INTERACTION TASKS IN 3D USER INTERFACES ..... 45
3.2.1 Object Selection ..... 45
3.2.2 Object Manipulation ..... 45
3.2.3 Navigation ..... 46
3.3 3D INTERACTION TECHNIQUES ..... 46
3.3.1 Selection and Manipulation ..... 46
3.3.2 Navigation ..... 55
3.4 CONCLUSION ..... 61
CHAPTER 4 - AN ANALYSIS OF INTERACTION TASK AND INTERACTION TECHNIQUES IN A 3D ENVIRONMENT FOR AIR TRAFFIC CONTROL ..... 65
4.1 INTRODUCTION ..... 65
4.1.1 Motivation ..... 65
4.1.2 The scope of the analysis ..... 65
4.2 ANALYSIS APPROACH ..... 65
4.3 OBJECT ..... 66
4.3.1 Airspace ..... 67
4.3.2 Air Traffic ..... 68
4.4 VIEW ..... 69
4.5 INTERACTION TASKS WITH THE OBJECT: SELECTION-MANIPULATION ..... 70
4.5.1 Object Selection ..... 70
4.5.2 Object Placement ..... 74
4.5.3 Object Modification ..... 77
4.5.4 Informative query ..... 78
4.5.5 Summary ..... 79
4.6 INTERACTION TASKS WITH THE VIEW: NAVIGATION ..... 82
4.6.1 Exploration task ..... 82
4.6.2 Search task ..... 84
4.6.3 Summary ..... 85
4.7 CONCLUSION ..... 86
CHAPTER 5 - SELECTION-BY-VOLUME APPROACH & NEW INTERACTION TECHNIQUES ..... 89
5.1 INTRODUCTION ..... 89
5.2 THE USABILITY EVALUATION ON THE RAY-CASTING TECHNIQUE ..... 89
5.2.1 Training sections ..... 89
5.2.2 The problem of occlusion ..... 90
5.3 PREVIOUS WORKS ON THE USE OF SELECTION VOLUME FOR THE OBJECT SELECTION TASK ..... 93
5.4 SELECTION-BY-VOLUME APPROACH ..... 93
5.4.1 Selection volume ..... 94
5.4.2 2D Menu System ..... 95
5.4.3 Interaction techniques derived from Selection-By-Volume approach ..... 97
5.5 TRANSPARENT SPHERE ..... 98
5.5.1 Description ..... 98
5.5.2 Design ..... 98
5.5.3 Selection ..... 99
5.5.4 Manipulation ..... 100
5.5.5 Undo ..... 100
5.6 TRANSPARENT CYLINDER ..... 100
5.6.1 Description ..... 100
5.6.2 Design ..... 101
5.6.3 Selection ..... 101
5.6.4 Manipulation ..... 102
5.6.5 Undo ..... 103
5.7 A USABILITY EVALUATION OF THE SELECTION-BY-VOLUME APPROACH USING THE COGNITIVE WALKTHROUGH METHOD ..... 103
5.7.1 The Cognitive Walkthrough ..... 103
5.7.2 Object Selection ..... 103
5.7.3 Scenario and Analysis ..... 103
5.7.4 Problems and Solutions ..... 106
5.8 DISCUSSION ..... 107
5.8.1 Advantages ..... 107
5.8.2 Disadvantages ..... 107
5.8.3 Problems to Consider in the Design of Interaction Techniques Using the Selection-By-Volume Approach ..... 107
5.9 CONCLUSION ..... 108
CHAPTER 6 - EVALUATION ..... 111
6.1 OBJECTIVES ..... 111
6.2 EXPERIMENTAL DESIGN ..... 111
6.3 MATERIALS ..... 111
6.4 PROCEDURE AND TASKS ..... 114
6.5 SUBJECTS ..... 115
6.6 EQUIPMENT ..... 115
6.7 EXPERIMENTAL RESULTS ..... 115
6.7.1 Occluded scenes ..... 116
6.7.2 Mixed scenes ..... 118
6.7.3 Non-occluded scenes ..... 120
6.7.4 Questionnaire ..... 122
6.8 DISCUSSION ..... 124
6.8.1 Occluded scenes ..... 124
6.8.2 Mixed scenes ..... 125
6.8.3 Non-occluded scenes ..... 125
6.8.4 Menu ..... 125
6.9 CONCLUSION ..... 125
CHAPTER 7 - CONCLUSIONS & FUTURE WORK ..... 129
7.1 CONCLUSIONS ..... 129
7.2 FUTURE WORK ..... 130
7.2.1 The analysis on Object, View, Interaction Task and Interaction Technique ..... 130
7.2.2 The Selection-By-Volume approach ..... 131
APPENDIX A – ACRONYMS ..... 135
APPENDIX B – QUESTIONNAIRE ..... 136
APPENDIX C – ORDERS OF THE SCENES ..... 139
APPENDIX D – SCENARIOS ..... 142
APPENDIX E – NUMERICAL RESULTS ..... 150
APPENDIX F – RESULTS FROM QUESTIONNAIRE ..... 153
APPENDIX G – RESPONSES TO THE QUESTIONNAIRE ..... 154
REFERENCES ..... 171

LIST OF TABLES

Table 1. Order of scenes in case of the ray-casting technique ..... 139
Table 2. Order of scenes in case of the Transparent Cylinder technique ..... 140
Table 3. Order of scenes in case of the Transparent Sphere technique ..... 141
Table 4. Mixed scenes ..... 150
Table 5. Occluded scenes ..... 151
Table 6. Non-occluded scenes ..... 152
Table 7. Results from the questionnaire ..... 153

LIST OF ILLUSTRATIONS

Figure 1-1. The multidisciplinary framework ..... 24
Figure 2-1. Shutter Glasses (Image courtesy of Stereographics [Stereographics 2005]) ..... 32
Figure 2-2. Components of a stereoscopic 3D display system ..... 32
Figure 2-3. CyberGlove (Photograph courtesy of The Hong Kong Polytechnic University, www.sd.polyu.edu.hk) ..... 34
Figure 2-4. An Intersense wand (Image courtesy of Intersense [Intersense 2005]) ..... 34
Figure 2-5. A workbench display of BARCO (Photograph courtesy of BARCO, www.barco.com) ..... 35
Figure 2-6. The air traffic control system (Figure adapted from Howstuffworks, www.howstuffworks.com) ..... 37
Figure 2-7. Flight profile ..... 38
Figure 2-8. An example of modern operational display system for air traffic controllers ..... 41
Figure 3-1. Ray-casting technique: The red box can be selected when the light rays (in red) emitted from hand-held input devices intersect it (Pol, Ribarsky et al. 1999 [Pol 1999], GVU Center, Georgia Tech) ..... 48
Figure 3-2. Flexible Pointer using two-hands pointing technique: the ray is controlled by two hand-held input devices so that it becomes a curve (in yellow); this curve can help to select the target object (the black box in this case) while avoiding the obscuring object in the foreground (Olwal and Feiner 2003 [Olwal 2003], UIST’03) ..... 48
Figure 3-3. The Aperture technique: The conic selection volume is described by the eye point (viewpoint) and aperture position (Forsberg et al. [Forsberg 1996]) ..... 49
Figure 3-4. The virtual hand is controlled by a CyberGlove™ (image courtesy of Virtual Reality Project at Ohio Supercomputer Center, www.osc.edu) ..... 50
Figure 3-5. The HOMER technique ..... 51
Figure 3-6. Voodoo Dolls interaction technique [Pierce 1999b] ..... 52
Figure 3-7. Creating a doll in the Voodoo Dolls technique [Pierce 1999b] ..... 52
Figure 3-8. The ray-casting technique is used for selecting the menu item ..... 53
Figure 3-9. Using the ray-casting technique and the graphical menu for querying information of objects ..... 54
Figure 3-10. Typical techniques for selection and manipulation ..... 55
Figure 3-11. An illustration of Scene-In-Hand travel technique: the scene is attached to the 6 DOF input device and the user can control the scene by moving or rotating the device (image courtesy of Data Visualization Research Lab [DVRL 2005]) ..... 56
Figure 3-12. An illustration of Camera-In-Hand travel technique (image courtesy of Data Visualization Research Lab [DVRL 2005]) ..... 57
Figure 3-13. An illustration of Flying Vehicle travel technique (image courtesy of Data Visualization Research Lab [DVRL 2005]) ..... 57
Figure 3-14. An illustration of Teleportation: The user can point to a location on the map and go directly to this location in the 3D scene (Image courtesy of Bowman et al. [Bowman 2000]) ..... 58
Figure 3-15. World-In-Miniature (image courtesy of Stoakley et al. [Stoakley 1995]) ..... 58
Figure 3-16. North-up map versus forward-up map (Image courtesy of the MOVES Institute [MOVES 2005], Naval Postgraduate School) ..... 59
Figure 3-17. The markers in different colours which help the user to remember some important locations are artificial landmarks in this context (Image courtesy of Darken and Peterson [Darken 2001], Naval Postgraduate School) ..... 60

Figure 3-18. In this example, the user moved backwards and is looking towards the start location. The footprints show the direction the user was looking at that point in time (Figure courtesy of Darken and Peterson [Darken 2001], Naval Postgraduate School) ..... 60

Figure 3-19. Typical techniques for navigation ..... 61
Figure 4-1. Interaction features as a combination of Object, View, Interaction Task and Interaction Technique ..... 66
Figure 4-2. Airspace ..... 67
Figure 4-3. Airspace and sector (image from ATM application developed by Linköping University) ..... 67
Figure 4-4. Flight and Flight route representation (image from ATM application developed by Linköping University) ..... 68
Figure 4-5. Example of a global view of flight routes (image from ATM application developed by Linköping University) ..... 69
Figure 4-6. Example of a detailed view within a sector (image from ATM application developed by Linköping University) ..... 70
Figure 4-7. ATC objects that are subjected to the object selection task ..... 71
Figure 4-8. The interaction techniques for the analysis on the object selection task ..... 71
Figure 4-9. The disambiguation in case of selection a single object in a set of closely spaced objects ..... 72
Figure 4-10. An example of the graphical query interface ..... 73
Figure 4-11. Two phases of the object placement task ..... 74
Figure 4-12. Using the ray-casting technique for the moving phase ..... 75
Figure 4-13. Using the virtual hand technique for the moving phase ..... 75
Figure 4-14. Using the ray-casting technique for the placement phase ..... 76
Figure 4-15. Using the virtual hand technique for the placement phase ..... 76
Figure 4-16. An example of the combination of the graphical menu and the ray-casting technique for the object modification task ..... 78
Figure 4-17. An example of the combination of the graphical menu and the ray-casting technique for the informative query task ..... 79
Figure 4-18. Proposed icons of the selection and manipulation techniques ..... 80
Figure 4-19. Proposed icons of the selection and manipulation task ..... 80
Figure 4-20. Combination A ..... 80
Figure 4-21. Combination B ..... 81
Figure 4-22. Combination C ..... 81
Figure 4-23. Combination D ..... 82
Figure 4-24. The interaction techniques for the analysis on the exploration task ..... 83
Figure 4-25. The interaction techniques for the analysis on the search task ..... 85
Figure 5-1. The green rectangular is occluded ..... 90
Figure 5-2. The full occlusion (the left figure): the blue box is fully occluded by the yellow one. The partial occlusion (the right figure): the blue box can be partially seen by the user ..... 90
Figure 5-3. An example of selecting objects using the wand as input device and the ray-casting as selection technique. Users try to select the blue triangle, but it is occluded by the yellow rectangle (from the current user’s viewpoint). As a result, the users have to move the wand until there is no occlusion between the blue triangle and other objects anymore ..... 91
Figure 5-4. An example of selecting objects using wand as the input device and ray-casting as the selection technique. Users change the scale of the scene (e.g. by zooming) to eliminate the occlusion state between the target object (the blue triangle) and other objects (from the current user’s viewpoint) ..... 91

Figure 5-5. An example of visual solution for solving the problem of occlusion. The object P2 is occluded in the current viewpoint. A label can be added to P2 in order to notice the existence of P2 and to facilitate the selection of P2 ..... 92
Figure 5-6. The object P2 change from the “non-occlusion” state to “occlusion” state when moving ..... 92
Figure 5-7. With the same scene, a large selection volume could contain more objects than a small one ..... 94
Figure 5-8. With the same selection volume size, more objects could locate within the selection volume in the high-density scene than in the low-density scene ..... 95
Figure 5-9. Three methods of menu placement ..... 96
Figure 5-10. A listing of choices for every feature of an interaction technique created from the Selection-By-Volume Approach ..... 97
Figure 5-11. Transparent Sphere Technique: the objects located within the transparent sphere are highlighted by a different colour (yellow) ..... 98
Figure 5-12. The object selection task using the Transparent Sphere ..... 99
Figure 5-13. In the scene, the ball P09 is occluded by the ball P01. P09 can be selected by pointing to its label on the menu ..... 100
Figure 5-14. Transparent Cylinder technique: the objects located within the cylinder are highlighted by a different colour (yellow) ..... 101
Figure 5-15. The object selection task using the Transparent Cylinder ..... 102
Figure 5-16. In the scene, the ball P01 is occluded by the ball P05. P01 can be selected by pointing to its label on the menu ..... 102
Figure 5-17. The hidden object P7 is displayed in the menu ..... 105
Figure 6-1. The Ray-casting technique a) The object intersected with the red ray is highlighted by yellow colour b) After being selected, the object of interest (P1) is turned red, attached to the ray for manipulation ..... 112
Figure 6-2. The Transparent Sphere technique a) The objects located within the transparent sphere are highlighted by yellow b) A ray is used to select a label (P3) on the pop-up menu c) After being selected, the object P3 is turned red, moved to the centre of the sphere and is ready for manipulation ..... 112
Figure 6-3. The Transparent Cylinder technique a) The objects located within the transparent cylinder are highlighted b) A ray is used to select a label (P1) on the pop-up menu c) After being selected, the object P1 is ready for manipulation task ..... 113
Figure 6-4. Snapshots of the test ..... 114
Figure 6-5. Results of the performed time with the occluded scenes ..... 116
Figure 6-6. Results of the selection time with the occluded scenes ..... 117
Figure 6-7. Results of the placement time with the occluded scenes ..... 118
Figure 6-8. Results of the performed time with the mixed scenes ..... 119
Figure 6-9. Results of the selection time with the mixed scenes ..... 119
Figure 6-10. Results of the placement time with the mixed scenes ..... 120
Figure 6-11. Results of the performed time with the non-occluded scenes ..... 121
Figure 6-12. Results of the selection time with the non-occluded scenes ..... 122
Figure 6-13. Results of the placement time with the non-occluded scenes ..... 122
Figure 6-14. Statistical results of the questionnaire ..... 123
Figure 6-15. The problem of focus in the Transparent Cylinder technique ..... 124
Figure 0-1. Scene 00 (trial) ..... 142
Figure 0-2. Scene 01 (trial) ..... 142
Figure 0-3. Scene 02 (trial) ..... 142

Figure 0-4. Scene 03 (trial) ..... 143
Figure 0-5. Scene 04 (trial) ..... 143
Figure 0-6. Scene 05 (trial) ..... 143
Figure 0-7. Scene 06 (test) ..... 144
Figure 0-8. Scene 07 (test) ..... 144
Figure 0-9. Scene 08 (test) ..... 144
Figure 0-10. Scene 09 (test) ..... 145
Figure 0-11. Scene 10 (test) ..... 145
Figure 0-12. Scene 11 (test) ..... 145
Figure 0-13. Scene 12 (test) ..... 146
Figure 0-14. Scene 13 (test) ..... 146
Figure 0-15. Scene 14 (test) ..... 146
Figure 0-16. Scene 15 (test) ..... 147
Figure 0-17. Scene 16 (test) ..... 147
Figure 0-18. Scene 17 (test) ..... 147
Figure 0-19. Scene 18 (test) ..... 148
Figure 0-20. Scene 19 (test) ..... 148
Figure 0-21. Scene 20 (test) ..... 148
Figure 0-22. Scene 21 (test) ..... 149

Chapter 1 - Introduction

1.1 Context of Research

Air Traffic Control (ATC) deals with organizing traffic so that safety is maintained despite traffic complexity, within a defined limit called capacity. Growth in air traffic demand, averaging 5-6% per year, requires that the theoretical capacity of each control centre be increased in order to accommodate more aircraft in the air without jeopardizing the safety of air traffic.

Consequently, controllers face the challenge of becoming more and more efficient, not only in maintaining flight safety but also in managing flight operations so as to reduce the complexity of encounters. More decision-support tools and information are needed to assist controllers in this new challenge. Considering the increasing amount of information made available, the traditional two-dimensional (2D) radar representation might become overloaded. A possible solution to this problem is the use of stereoscopic three-dimensional (3D) displays, that is, Virtual Environment-based 3D User Interfaces for Air Traffic Controllers (ATCOs). For simplicity, this solution will be referred to hereinafter as the 3D-ATC environment.

Because human-dependent, safety-critical domains such as ATC require careful and focused interface design (the way air traffic information is presented can heavily affect how controllers perform monitoring tasks, as well as their workload level), particular attention has to be paid to the human factors side of any technology-driven interface design. To support the investigation of the applicability of Virtual Environment (VE) technology to the ATCOs' interface, the strategy taken at the EUROCONTROL¹ Experimental Centre (EEC) has been to establish parallel technology-centred design and human-centred design investigations:

• The first direction of investigation has been conducted in collaboration with the Norrköping Visualisation and Interaction Studio (NVIS) of Linköping University. This technology-centred-design investigation aims at exploring all the features that 3D stereoscopic visualisation and multimodal interaction technologies can offer to Air Traffic Control. The resulting application, targeted at semi-immersive (workbench) displays, uses VE technology to provide an environment in which an air traffic controller can observe and monitor a large number of aircraft over a wide area, remains aware of the many complex factors about their planned routes that may affect future flight-path planning, and can use 3D interaction methods to select and re-route aircraft interactively as the data are updated in real time. Results obtained so far in this investigation can be found in [INO 2003] and are not discussed herein.

¹ EUROCONTROL is the European Organisation for the Safety of Air Navigation. This civil and military organisation, which currently numbers 34 Member States, has as its primary objective the development of a seamless, pan-European Air Traffic Management (ATM) system. Achieving this objective is key to meeting the present and future challenges facing the aviation community: coping with the forecast growth in air traffic while maintaining a high level of safety, reducing costs, and respecting the environment.

• The second, parallel direction of investigation has been conducted at the Innovative Research Lab of the EEC. This human-centred-design investigation aims at exploring all possible adaptations of 3D stereoscopic visualisation and interaction features to Air Traffic Controllers. Its ultimate goal has been to empirically uncover the main implications of adopting a 3D stereoscopic visualization environment for Air Traffic Control, with particular attention to its applicability, by deploying a multidisciplinary approach in which three different areas of study are investigated interdependently: 3D visualization, 3D interaction and human factors. The work presented in this dissertation² deals with 3D interaction.

Note that while technology-centred design focuses on the design of components, human-centred design focuses on the user. Component design emphasizes the internal architecture, which can be designed by VE software specialists, targeting the performance of an innovative yet intuitive interface for ATC; user-driven design emphasizes the external requirements for human-system adaptation, which must be tested through a multidisciplinary team approach targeting user validation prior to development. The complementarity of these two design approaches has indeed been the key strategic element of this research axis at the EEC.

1.2 Multidisciplinary Framework for Empirical Analysis

In a 3D-ATC environment, the issues of visual representation, human perception, and interaction are interrelated. The choice of which objects to display, and of the ways to display them and interact with them, certainly influences end users' performance. Integrating these different aspects in a single research framework therefore becomes a necessity.

Figure 1-1. The multidisciplinary framework

From this perspective, a multidisciplinary framework for the empirical analysis of the applicability of Virtual Environment-based 3D user interfaces to ATC was designed. This research framework takes the assessment of human factors as the central design principle for visualization and interaction, as illustrated in Figure 1-1.

² This work was performed at the Innovative Research Area, EUROCONTROL Experimental Centre.

• The central research component involves Human Factors, the domain that deals with end users' needs, abilities, and performance within the system as a whole. The goal of this component is to evaluate visualization and interaction solutions with quantitative and qualitative measures on real controllers. The expected results include empirical analyses indicating the most suitable design principles to be used in the design of visualization and interaction. Examples of evaluations include the human capacity to memorize the locations of objects in 3D environments [Dang 2003c; Tavanti 2003; Tavanti 2004a].

• The second research component deals with investigations of 3D Visualization. Its goal is to study and define how ATC components should be graphically represented in a Virtual Environment. Examples of such studies include the representation of domain knowledge and decision-support information such as flight plans, trajectories, and safety-related information. For each visualized object, not only the trade-off between the degree of realism of the 3D models and the associated computational costs must be evaluated, but also the pertinence of the representation with respect to controllers' behaviour in the execution of their tasks (e.g. use of colours, use of 2D elements such as text). Appropriateness of the representation is the major goal of this component, and it can only be reached through successive iterations with the human factors evaluations [Le 2005].

• The third research component concerns 3D Interaction, i.e. the investigation of the appropriateness of interaction methods and devices in a 3D-ATC environment. The goal of this research component is to define methods and techniques for 3D interaction that are compliant with controllers' behaviours, requirements and needs. The research described in this dissertation is a subset of this component: it concerns the analysis of interaction in a 3D-ATC environment with currently proposed technologies, and the proposal of a technique better suited to the selection of occluded objects.

In order to obtain the richest results, all these aspects were investigated in parallel. For every solution or new idea elaborated, an evaluation with end-users was conducted to measure its impact on human users. This approach ensures that progress in the study remains compliant with actual users' requirements, thus avoiding the implementation of a potentially unusable system at both the visualization and interaction levels.

Results have been published during the course of the research, either on the common framework [Dang 2003a; Dang 2003c] or on the newly proposed metaphor [Dang 2003d; Dang 2005a; Dang 2005b; Tavanti 2004c]. The research conducted in the human factors and visualisation components has also been published as PhD dissertations [Le 2005; Tavanti 2004b].

1.3 Thesis

1.3.1 Research Hypothesis and Domains of Investigation

The major research topic of this PhD thesis is the interaction between Air Traffic Controllers and the 3D-ATC environment. Two main questions have been expressed as key points of this research:


1. Can the interaction techniques and standards currently used in Virtual Environments be applied to Air Traffic Control without jeopardizing controllers' performance?

2. If so, what are the limits of their applicability? Otherwise, are there other techniques more appropriate to the specific domain of ATC?

Obviously, in order to establish the baselines determining the applicability of Virtual Environments to ATC, i.e. their appropriateness and adequacy, the domains concerned are not only those dealing with advanced computer graphics and Virtual Environment technologies, in particular 3D user interface technologies and 3D human-computer interaction techniques, but also the domain of Air Traffic Control, in particular the controllers' interaction with their working environment.

To further highlight the essence of the research performed for this thesis, it is worth clarifying some key terms that are tightly linked to the domains concerned.

• Human-Computer Interaction (HCI) – HCI concerns the process of communication between human users and computers (or technologies in general). In the 3D-ATC environment, controllers communicate actions, intents, goals, queries, etc. to computers. Computers, in turn, communicate to the controllers the information about the world, about their internal state, about the responses to user queries, etc. [Hix 1993]

• User Interface (UI) – The medium through which the communication between users and computers takes place is often referred to as the UI. The UI translates a user’s actions and states (inputs) into a representation that the computer can understand and act upon, and it translates the computer’s actions and states (outputs) into a representation that the human user can understand and act upon. [Hix 1993]

• Interaction Technique – An interaction technique is a method allowing the user to accomplish a task via the UI. An interaction technique includes both hardware (input/output devices) and software components. This software component is responsible for the mapping of the information from the input device (or devices) into some system actions, and for the mapping of the output of the system into a representation that can be displayed by the output device(s). [Bowman 2004]

• 3D Interaction – Human-computer interaction in which the user’s tasks are performed directly in a 3D spatial context. Interactive systems that display 3D graphics do not necessarily involve 3D interaction; for example, if a user tours a model of a building on her desktop computer by choosing viewpoints from a traditional menu, no 3D interaction has taken place. On the other hand, 3D interaction does not necessarily mean that 3D input devices are used; for example, in the same application, if the user clicks on a target object to navigate to that object, then the 2D mouse input has been directly translated into a 3D location, and thus 3D interaction has occurred. 3D interaction in ATC obviously implies the analysis of the controller’s tasks, and is therefore also a major domain of investigation [Bowman 2004].

• 3D User Interface (3D UI) – UI that involves 3D Interaction [Bowman 2004].

• Virtual Environment (VE) – A synthetic, spatial (usually 3D) world seen from a first-person point of view. The view in a VE is under the real-time control of the user. VE can be considered as a type of 3D UI. [Bowman 2004]


1.3.2 Major Contributions

This PhD research contributes to the global topic of 3D stereoscopic environments for ATC along two major axes:

1. First, the analysis of the interaction tasks and interaction techniques that can be used in a 3D-ATC environment. Previous investigations [Azuma 2000; Lange 2003; Zeltzer 1997], in their technology-driven developments of 3D-ATC environments, have overlooked the importance of interaction for the user’s performance.

2. Second, the proposal of a new approach to interaction, the Selection-By-Volume approach, which yields two interaction techniques, namely Transparent Sphere and Transparent Cylinder, to solve the problems of interaction with occluded objects in an ATC scene identified above. Experiments conducted with potential users have empirically demonstrated that these techniques solve the problem of interaction with occluded objects well. This topic more generally represents a contribution to the field of 3D UIs at large and was the subject of a research paper at the IEEE Virtual Reality Conference 2005 [Dang 2005a] and of an accepted communication at the HCI International Conference 2005 [Dang 2005b].

1.4 Structure of the thesis

The thesis is organized in seven chapters.

Chapter 1 outlines the scope, the objective and the contributions of the thesis.

Chapter 2 gives an overview of the state of the art of 3D user interface technologies before briefly introducing Air Traffic Control and the activities of Air Traffic Controllers. An overview of related work on applying 3D interfaces to Air Traffic Control is also included in this chapter.

Chapter 3 introduces the interaction tasks and the interaction techniques used in 3D User Interfaces. The content of this chapter is helpful for understanding the analysis of interaction tasks and interaction techniques in a 3D-ATC environment.

Chapter 4 presents an analysis of the interaction tasks that can take place in a 3D-ATC environment and of the 3D interaction techniques that can be used for these tasks. By considering the interaction features of a 3D-ATC environment through the relationships among four components (Object, View, Interaction Task and Interaction Technique), this analysis also identifies which interaction techniques are appropriate to which interaction tasks in a 3D-ATC environment.

Chapter 5 focuses on the problem of interaction with occluded objects, identified through a usability evaluation of the ray-casting technique, the key technique for the object selection and manipulation tasks recognized through the analysis in Chapter 4. A new approach to interaction with occluded objects, the Selection-By-Volume approach, is described in this chapter, together with the two interaction techniques derived from it, the Transparent Cylinder and the Transparent Sphere techniques.

Chapter 6 describes the empirical evaluation of the Transparent Cylinder and Transparent Sphere techniques compared to the ray-casting technique.

Chapter 7 summarizes the main contributions of the thesis and presents future work.


Chapter 2

3D User Interfaces and Air Traffic Control


Chapter 2 - 3D User Interfaces and Air Traffic Control

This chapter gives a global view of related technologies and techniques, structured into three domains of interest: 3D user interface technologies, Air Traffic Control, and related work on 3D user interfaces for Air Traffic Control.

2.1 3D User Interfaces

It is worth noting here that by 3D we mean stereoscopic 3D visualization, while the display of 3D information on a flat screen or monitor is only 2.5D. The difference is that 2.5D relies on monocular depth cues only, whereas 3D also exploits binocular depth cues.

Similar to the field of Human-Computer Interaction (HCI), the design of 3D User Interfaces (3D UIs) is an interdisciplinary field that draws on existing knowledge in perception, cognition, human factors, graphic design, and especially 3D computer graphics and Virtual Environment technology. This section therefore presents some basic notions of depth perception, followed by a short description of input and output devices.

2.1.1 Depth Cues

In order to allow for depth perception, 3D scenes can be provided with a number of cues that give a sense of depth and three-dimensionality. Depth cues can be monocular (or pictorial) or binocular.

Monocular depth cues refer to depth information that can be inferred from a static image viewed by a single eye. Important monocular depth cues include relative size (bigger objects are perceived as closer than smaller ones), relative height (an object closer to the horizon is perceived as farther away, and an object further from the horizon as closer), occlusion (when a first object is placed over a second, the first appears closer than the second, which is partially blocked), linear perspective (parallel lines appear to converge in the distance), shadow and light (an object's shadow under lighting provides clues about its orientation relative to the observer and its three-dimensional shape), and texture gradient (as a surface gets farther from the observer, its texture gets finer and appears smoother). 2.5D interfaces use these monocular depth cues to create a feeling of depth, i.e. a "looks like 3D" graphical representation.

Stereopsis is among the most important binocular cues to depth perception. Stereopsis is based on the slightly different images projected onto the retina of each eye. 3D interfaces exploit this binocular cue to create a feeling of depth for users: two slightly different images are presented simultaneously, one to each eye. There are several techniques to produce 3D stereoscopic images; the one used in the investigations of this thesis relies on special eyeglasses known as shutter glasses.


2.1.2 Shutter Glasses

This device presents left- and right-eye views in apparent simultaneity, emulating the way we see the real world with both eyes. Stereo viewing is accomplished by showing the left image on the display while the right lens is closed, then showing the right image while the left lens is closed. Because this happens at least 120 times per second, the user perceives the left and right views as simultaneous. The result is a highly realistic depth cue.

Figure 2-1. Shutter Glasses (Image courtesy of Stereographics [Stereographics 2005])

Technically, these glasses have high-speed electronic shutters that open and close in sync with the images on the screen, thanks to an infrared emitter (Figure 2-2). When the left image is on the screen, the left shutter is open and the right shutter is closed, so the image is seen by the left eye only. When the right image is on the screen, the right shutter is open and the left shutter is closed. If this process happens fast enough, the human brain perceives a true stereoscopic image; if the shuttering speed is not fast enough, the user still sees a stereoscopic image but may also notice flickering.
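To make the rendering side of this concrete, the sketch below shows how frame-sequential stereo is typically produced with quad-buffered OpenGL: the scene is drawn once into the left back buffer and once into the right back buffer with a small horizontal camera offset, and the buffer swap is what the emitter synchronizes the shutters against. This is a generic illustration, not the code used in this thesis; the scene content, the eye-separation value and the use of GLUT are assumptions, and a production system would use asymmetric (off-axis) projection frusta rather than a plain camera shift.

```cpp
// Minimal sketch of frame-sequential (quad-buffered) stereo with OpenGL/GLUT.
// Assumptions: a stereo-capable pixel format is available; the scene and the
// eye-separation value are placeholders; a real system would use off-axis frusta.
#include <GL/glut.h>
#include <GL/glu.h>

static const float EYE_SEP = 0.065f;   // assumed interocular distance in scene units

static void drawScene()                // placeholder scene: one object in front of the viewer
{
    glPushMatrix();
    glTranslatef(0.0f, 0.0f, -2.0f);
    glutSolidTeapot(0.4);
    glPopMatrix();
}

static void display()
{
    // Left-eye image: camera shifted half the eye separation to the left
    // (expressed here as a world translation in the opposite direction).
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(+EYE_SEP / 2.0f, 0.0f, 0.0f);
    drawScene();

    // Right-eye image: symmetric shift in the other direction.
    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glTranslatef(-EYE_SEP / 2.0f, 0.0f, 0.0f);
    drawScene();

    glutSwapBuffers();                 // the IR emitter keeps the shutters in sync with this
}

static void reshape(int w, int h)
{
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, h ? (double)w / h : 1.0, 0.1, 100.0);
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("stereo sketch");
    glEnable(GL_DEPTH_TEST);
    glutDisplayFunc(display);
    glutReshapeFunc(reshape);
    glutMainLoop();
    return 0;
}
```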

Figure 2-2. Components of a stereoscopic 3D display system

2.1.3 Hardware Technologies for 3D Interfaces

Interfaces are often defined as a layer between the user and a system. The user communicates commands, requests, questions, intents, and goals to the system. The system in turn provides feedback, requests for input, information about the system state and so on. These communications are set up thanks to physical devices (hardware) that serve as the medium of communication between the parties. In order to interact with a system, the user needs both input and output devices. Input devices are the means by which the user gives information to the system. Output devices present or display information to the user’s perceptual system.


2.1.3.1 Input Devices

There are many types of input devices, and several attempts have been made to classify them more systematically [Bowman 2004]. LaViola [LaViola 1999] classified input devices into four categories: discrete event devices, continuous event devices, combination or hybrid devices, and speech input devices. Discrete event devices generate one event at a time based on a user action, i.e., an event is generated when the user presses a button. The keyboard is an example of a discrete input device. Pinch Gloves, developed by Fakespace [Fakespace 2005], are another example of a discrete input device: the user pinches two or more fingers together to signal an event.

By way of contrast, continuous event devices generate a continual stream of events. Two of the most common continuous-input devices are trackers and data gloves. A tracker is a device that continually outputs position and orientation records, even if the device is not moving. Data gloves transmit records on bend angles of the fingers.

Then, there are hybrid devices that combine both discrete and continuous events to form single, more flexible devices. Examples of hybrid devices are the mouse and the joystick.

A special type of input device is speech input, which complements other input devices. It offers a natural way to combine different modes of input (e.g. multimodal interaction) to form a more cohesive and natural interface. In general, when functioning properly, speech input can be a valuable tool in Virtual Environment applications, especially when both of the user’s hands are occupied.

The input devices used for this work are a tracking device (a continuous event device) and a Wand (a hybrid device according to the present classification). These devices were used to interact with the 3D stereo scenes used in the investigations of this thesis. The details of these two types of devices follow.

Tracking device

Tracking devices capture and record human motion. They determine in real time the x, y, z position and the orientation of some part of the user’s body with reference to a fixed point [Rolland 2001]. Usually, tracking devices track the user’s head to control his/her viewpoint; they also track the input devices used to interact with the 3D objects. Tracking devices affect the accuracy of an action and the responsiveness of the system, and are therefore very important to interaction with a 3D interface.
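As an illustration of the kind of data such a device delivers, the sketch below defines a generic 6-DOF sample (position plus orientation, time-stamped) and shows how an orientation quaternion can be turned into a pointing or viewing direction. The structures and values are illustrative and do not correspond to any particular tracker's SDK.

```cpp
// Generic sketch of the data a 6-DOF tracker delivers and of how an orientation is
// turned into a pointing/viewing direction. Illustrative only: no vendor SDK is implied.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };       // unit quaternion encoding the orientation

struct TrackerSample {                    // one record of the continuous stream
    Vec3   position;                      // x, y, z relative to the fixed reference point
    Quat   orientation;                   // rotation relative to the same reference
    double timestamp;                     // trackers keep reporting even when motionless
};

static Vec3 cross(Vec3 a, Vec3 b)
{
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}

// Rotate v by unit quaternion q:  v' = v + 2*w*(u x v) + 2*(u x (u x v)),  u = (q.x, q.y, q.z)
static Vec3 rotate(Quat q, Vec3 v)
{
    Vec3 u = { q.x, q.y, q.z };
    Vec3 t = cross(u, v);
    t.x *= 2; t.y *= 2; t.z *= 2;
    Vec3 ut = cross(u, t);
    return { v.x + q.w * t.x + ut.x, v.y + q.w * t.y + ut.y, v.z + q.w * t.z + ut.z };
}

int main()
{
    // A head sample: 1.6 m above the floor, half a metre from the reference, looking straight ahead.
    TrackerSample head = { { 0.0f, 1.6f, 0.5f }, { 1.0f, 0.0f, 0.0f, 0.0f }, 0.0 };
    Vec3 forward = rotate(head.orientation, { 0.0f, 0.0f, -1.0f });   // device "forward" axis
    std::printf("viewpoint at (%.2f %.2f %.2f), looking along (%.2f %.2f %.2f)\n",
                head.position.x, head.position.y, head.position.z,
                forward.x, forward.y, forward.z);
    return 0;
}
```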

Another commonly used form of tracking device is the data glove. Data gloves are input devices that provide detailed tracking information about the user’s hands, such as how the fingers are bending. Two basic varieties are the Pinch Glove and the bend-sensing glove. The Pinch Glove detects whether the user is touching two or more fingertips together. Bend-sensing data gloves are commonly used for hand gesture recognition: they are equipped with sensors that run through the fingers of the glove and can detect changes in the bend at each joint. By measuring the bends at the joints, a bend-sensing glove can recognize a variety of gestures such as a closed fist or pointing with one finger. The CyberGlove® (Figure 2-3), created by Immersion [Immersion 2005], is one of the most commonly used bend-sensing gloves.


Figure 2-3. CyberGlove (Photograph courtesy of The Hong Kong Polytechnic University, www.sd.polyu.edu.hk)

Wand

The wand consists of a single joystick device and three or four control buttons (Figure 2-4). The wand can be tracked by a tracking device thanks to sensors installed on it. Wands allow several actions such as pointing at and selecting an object, and navigating within a 3D scene.

Figure 2-4. An Intersense wand (Image courtesy of Intersense [Intersense 2005])

2.1.3.2 Output Devices

Output devices present information to one or more of the user’s senses through the human perceptual system. Usually, they stimulate the visual, auditory and/or haptic senses; output devices therefore comprise visual, auditory and haptic displays. Auditory displays implement sound as the main output, while haptic displays provide feedback through the stimulation of the haptic senses. Their description, however, goes beyond the scope of this thesis: the equipment used for this work relies on a visual display, which presents a visual representation of a scene. There are various types of visual display devices used in 3D interfaces, such as conventional monitors, surround-screen displays, workbenches, etc.

Conventional monitors are commonly used in many different kinds of 3D applications, including modelling, scientific visualization, and computer games. They are relatively inexpensive, and they can display 3D scenes that make use of monocular cues; they can also provide binocular depth cues when equipped with additional hardware such as shutter glasses. These configurations often include a head tracker and either a tracked glove or a wand, forming a system called fish-tank Virtual Reality (VR) [Ware 1993]. However, this type of display can be rather small; the user may therefore feel like s/he is looking through a window into a 3D world rather than being actually immersed in it.

Surround-screen displays are visual output devices that have three or more large projection-based display screens surrounding the human participant. The first surround-screen VR system, developed at the Electronic Visualization Laboratory at the University of Illinois at Chicago, consisted of four screens (three walls and a floor) and was called the CAVE [Cruz-Neira 1993]. The viewer, wearing stereo glasses and head tracking, can explore the virtual world by moving around inside the 3 m x 3 m x 3 m cube of display screens and can grab objects by using a wand or a data glove. The obvious advantage of surround-screen displays is that they offer a wide field of view and allow the user to see a larger portion of the virtual world. One of their biggest drawbacks is that they are expensive and often require a large amount of physical space. In addition, when more than one person inhabits a surround-screen device, they all view the same images, rendered from the tracked user’s perspective. As the tracked user moves, all non-tracked users effectively see the environment through the tracked user’s perspective, which can cause cue conflicts and lead to cyber-sickness [LaViola 2000; Stanney 1998].

Workbenches are a special type of projection-based display with only one display screen, originally developed by Kruger and Frohlich [Kruger 1995]. A variety of workbenches are available, including the BARON, the VersaBench™ and the Immersadesk™. Head and hand tracking is usually employed with this display. Because a workbench display is larger than a monitor and thus covers a large portion of the user’s view, it provides a greater level of immersion. An additional benefit is that many desks can be tilted so that the user can work at any angle s/he finds comfortable. Figure 2-5 shows the BARON display developed by BARCO [BARCO 2005], an example of a workbench display, which was used in the investigations of this thesis.

Figure 2-5. A workbench display of BARCO (Photograph courtesy of BARCO, www.barco.com)


2.2 Air Traffic Control

Air Traffic Control terminologies

Airspace. Airspace means the portion of the atmosphere controlled by a particular country or, more generally, any specific portion of the atmosphere. Aviators divide airspace into three basic types:

• Uncontrolled airspace is airspace in which air traffic control does not exert any authority. It exists close to the ground or in mountainous terrain where radar coverage is impossible.

• Controlled airspace exists in almost every place else. Air traffic control is capable of directing aircraft in this airspace.

• Military airspace includes volumes where flight is prohibited for national security reasons.

Sector. Sector is a block of controlled airspace of predetermined size assigned to a team of air traffic controllers. A sector has vertical as well as horizontal boundaries. Depending on various factors (traffic density, etc.), a team of two controllers may be responsible for one or more sectors at any given time.

Flight level (FL). A level of constant atmospheric pressure related to a reference datum of 29.92 inches of mercury. Each flight level is stated as three digits that represent hundreds of feet. For example, FL250 (flight level two-five-zero) represents a barometric altimeter indication of 25,000 feet.
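As a small worked example of this convention (the three digits stand for hundreds of feet), the snippet below is illustrative only:

```cpp
// Illustrative only: a flight level expressed as hundreds of feet referenced to 29.92 inHg.
#include <cstdio>

static int flightLevelToFeet(int flightLevel) { return flightLevel * 100; }

int main()
{
    std::printf("FL250 corresponds to %d ft\n", flightLevelToFeet(250));   // 25000 ft
    return 0;
}
```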

Call sign. Most airlines employ a distinctive and internationally recognised call-sign that is normally spoken during radio transmissions as a prefix to the flight number. The flight number is normally that published in their public timetable and appearing on the arrivals and departure screens in the airport terminals served by that particular flight. In cases of emergency, the airline name and flight number, rather than the individual aircraft's registration, are normally mentioned by the main news media. For example, flight Air France 172 will have the call-sign AFR172.

Flight Route. A flight route is a designated route followed by airplanes flying from one airport to another; it is composed of waypoints and route segments.

Flight Strip. Flight strip is a strip of paper that contains a computer printout of an aircraft's abbreviated flight plan that is used by air traffic controllers to monitor an aircraft's flight.

Taxiway. The paved airport surfaces which allow aircraft to travel between runways and other airport locations such as the hangars or terminals.

Runway. A defined rectangular area on a land airport prepared for the landing and takeoff run of aircraft along its length.


The air traffic control system gives guidance to aircraft, to prevent collisions and manage efficient traffic flow.

Air traffic control can be divided into three major subspecialties: tower control, approach control and enroute control (Figure 2-6). Tower control involves air traffic at the airport, approach control handles traffic within the immediate airport environment (within about 30 nautical miles), and enroute control handles traffic between major terminals.

Figure 2-6. The air traffic control system (Figure adapted from Howstuffworks, www.howstuffworks.com)

2.2.1 Air Traffic Controllers

Air traffic controllers have the responsibility to expedite and maintain a safe and orderly flow of air traffic and help to prevent collisions.

Corresponding to the three major subspecialties of the air traffic control system, there are three types of controller: tower controllers, approach controllers and en-route controllers.

Airport Tower Controllers work in facilities called Control Towers. A Control Tower is located at every airport that has regularly scheduled flights. Towers handle all takeoff, landing, and ground traffic. Relying on radar and/or visual observation, Tower controllers are responsible for the separation and efficient movement of aircraft operating on the taxiways and runways of the airport.

Approach controllers work in facilities called Approach Control. These centres are facilities containing radar operations from which approach controllers direct aircraft during the departure and approach phases of flight.

Enroute controllers work in facilities called Upper Airspace Control Centres. They work in teams of two members, depending on how heavy the traffic is; each team is responsible for a section of the airspace called a sector.


In general, Air Traffic Controllers apply separation rules to keep each aircraft apart from the others in their area of responsibility and move all aircraft efficiently through 'their' airspace and on to the next.

2.2.2 How can a flight be controlled by ATCOs?

A typical flight passes through seven phases: pre-flight, takeoff, departure, enroute, descent, approach and landing (Figure 2-7). Each phase is monitored by an air traffic control facility with its own group of controllers. Each of these controllers follows specific rules and procedures while directing flights through designated flight routes. They monitor the flight using special equipment and decision support tools (computers) that ensure a safe and efficient flight.

Figure 2-7. Flight profile

• Preflight – The pilot receives the most recent weather information and a flight plan has been filed. Prior to takeoff, the pilot performs the flight check routine; pushes back the aircraft from the terminal's gate, and taxis out to the designated takeoff runway.

• Takeoff – The pilot receives permission from Local Control (from the Control Tower) to takeoff. The aircraft powers up and begins its takeoff.

• Departure – Upon lift off, the pilot is instructed to change radio frequencies to receive new flight instructions from Departure Control in the Approach Control. The pilot is instructed to follow a pre-determined, preferred routing that will take the aircraft up and away from the departure airport onto its route. The pilot is then issued further altitude and routing clearance. The controller monitors the target (the aircraft) and its track (flight route) on the radar screen. As the aircraft reaches the edge of the Approach Control airspace, the Departure Controller performs an electronic transfer of the flight to the controller in the next airspace.

• En Route – The pilot receives instructions as to what altitude and heading to maintain, as well as to which radio frequency to tune. This portion of the flight can be as short as a few minutes or as long as many hours.

• Descent – As the aircraft nears its destination airport, the pilot is instructed to change radio frequencies and contact Descent Control in the Approach Control for instructions. The pilot is instructed to descend and change heading. After receiving these instructions, the aircraft descends and maneuvers to the destination airport.


• Approach – The pilot has received an approach clearance to the destination airport from the Approach Controller working in the Approach Control. The flight has been placed in line with other aircraft preparing to land at the same airport. The pilot flies a specified flight procedure in order to get in line for the designated landing runway. The pilot receives instructions from the Approach Controller to change radio frequency and contact Local Control (in the airport’s control tower) for landing clearance. The aircraft is electronically handed off from Approach Control to the Control Tower.

• Landing – The pilot receives clearance from the Local Controller in the airport's control tower to land on a designated runway. Upon touching down, the flight is then handed off to Ground Control. The Ground Controller directs the pilot across the taxiways to its destination gate at the terminal.

Briefly, the controllers working in the control tower handle the preflight, the takeoff and the landing phases; the approach controllers take care of the departure, the descent and the approach phases; the en route controllers working in the upper area control center are in charge of the en route phase.

The approach control and upper area control deal with air traffic, while the tower control deals mainly with ground traffic at the airport. This thesis only concerns air traffic, i.e. approach control and upper airspace control. Tower control, dealing with ground traffic, should be considered in a separate study because of the complexity of representing and interacting with the components of an airport.

The next sections describe in detail the daily activities and tasks of en-route and approach controllers.

2.2.3 ATCOs activities

The en-route and approach airspace is divided into sectors. In general, two controllers control each sector: the executive controller and the planning controller. Both perform surveillance and system updating tasks. In addition, the executive controller is responsible for direct communication with pilots (by radio), while the planning controller is responsible for communication with the planning controllers of other sectors.

2.2.3.1 Planning Controller

Below is an informal list of the main tasks for planning controllers:

• They receive information on flight strips about entering aircraft.

• They check possible conflicts with other aircraft in the air. If they detect any they inform the executive controller.

• When a flight is changing sector, if problems are detected, they phone the planning controller of the other sector and negotiate the optimal flight parameters for entering the new sector (changed parameters: flight level, route, etc.). When there is a change of sector, the sending controller can change parameters and update the system; the receiving controller takes the strips, checks the colour and gives them to the executive controller.

• They can perform surveillance tasks: check conflicts and communicate with the executive for modifications.


2.2.3.2 Executive Controller

Below is an informal list of the main tasks for executive controllers:

• They receive flight strips and deduce the flight evolution.

• They receive the first contact request from the pilot with its flight parameters.

• In the first contact, the following information is indicated: identification and flight level. The executive controller checks whether they are correct and gives clearance; otherwise s/he changes the information on the flight strips.

• They give to the aircraft the frequency for the next sector.

• They give the optimal flight level.

• They control the aircraft and detect any possible conflicts (if conflicts are detected then they have to create a separation whose parameters depend on the altitude)

To sum up, the executive controller performs two main activities: handling traffic inside the sector and handling the change of sector of a flight. The first activity is decomposed into watching the radar for surveillance purposes and handling the air traffic (keeping flights regular and avoiding conflicts). Once the executive controller identifies a problem which may require, for example, solving a conflict, s/he has to identify a solution (changing heading as a lateral solution, changing flight level as a vertical solution, changing speed as a longitudinal solution) and finally communicate it.

The second activity means either that a flight is leaving the sector or that a flight is entering into the sector. In case of leaving flights, the executive controller gives the frequency of the next sector, receives the answer from the pilot and then removes the strip of the leaving flight. When there are entering flights the executive controller receives the first contact from the pilot, replies and then has to manage the related strip.


2.2.4 Air Traffic Controller’s Interface

Many technologies are used in air traffic control systems. Primary and secondary radar are used to enhance the controller's "situational awareness" within her/his assigned airspace. Certain types of weather may also register on the radar screen.

Figure 2-8. An example of modern operational display system for air traffic controllers

2.3 3D User Interfaces for ATC – Related Works

Two main research directions on 3D interfaces for ATC can be found in the literature. One concentrates on evaluating the benefits of 3D (2.5D and stereoscopic 3D) compared to 2D, while the other concentrates on the technological aspects of visualization and interaction in a 3D environment. Combining these two directions is necessary for the design of a usable 3D interface for ATC; at present, however, this is not the case, as they have usually been carried out separately.

The first research direction concentrated mainly on comparing the performance among the plan view display (2D interface), the perspective view display (2.5D interface) and the stereoscopic 3D display (3D interface) across some specific tasks of ATCOs.

• Brown (University of London) [Brown 1994a; Brown 1994b] compared different displays for air traffic control. Three types of display were investigated: the current 2D display, a 2.5D display and a 3D display (using shutter glasses for stereoscopic viewing). Some of his results show advantages of 3D displays in some contexts.

• Tham and Wickens [Tham 1993] evaluated performance on several ATC tasks across plan view, perspective and stereoscopic displays. The results do not show advantages for the 3D display.


• Burnett and Barfield [Burnett 1991] suggest that, in terms of personal preference, controllers favoured the perspective display. Other research [Ellis 1987], instead, indicates favourable results for 3D interfaces.

• Research efforts in this area are still ongoing [Wickens 1989].

The second research direction is technology-driven. Several works have exploited the VE technology to build a 3D visualization environment for ATC.

• F. Persiani and A. Liverani (University of Bologna, Italia)[Persiani 2000] developed ATC-VR, a new interface for real 3D ATC data visualization and interaction. ATC-VR provides the operator with an intuitive tool for approach control. ATC-VR uses stereoscopic glasses and 3D pointing devices to visualize and interact with the 3-D graphical representations.

• Ronald Azuma et al. (Hughes Research Laboratories, California, USA) [Azuma 1996; Azuma 2000] introduced an interactive, real time demonstration of 3D visualization for ATC. This demonstration offers a 3D stereoscopic view for both controller and pilot. They paid much attention to the representation of flight path for visualization and resolution of conflicts.

• David Zeltzer and Steven Drucker (MIT Media Laboratory, USA) [Zeltzer 1997] developed a mission planner using a Virtual Environment system. Users can interact directly with 3D models such as aircraft, terrain, threats and targets by using voice and gesture recognition. Data Gloves are used as 3D input devices; users can use them to move waypoints. Once a waypoint is moved, the flight path is recalculated and displayed.

• Nancy Dorighi et al. [Dorighi 1994] (Advanced Displays and Spatial Perception Laboratory, NASA Ames Research Centre, USA) investigated “Advanced Display and Manipulative Interface for Air Traffic Management”. This advanced display uses a perspective format display in place of the current user interface to present the air traffic. Graphical enhancements to better utilize the full potential of the perspective display are being investigated.

• Marcus Lange et al. (University of Linköping, Sweden) [Lange 2003] developed an application for interactive 3D visualization of ATM on a workbench display. This application introduces basic functions for handling flights and flight routes and for visualizing weather phenomena. Multimodal interaction is being developed for this application.

A common point of these technology-centred applications is that they do not take into account the usability side of the application. In addition, interaction has not been given enough consideration in these studies.


Chapter 3

Interaction Tasks and Interaction Techniques in 3D User Interfaces


Chapter 3 - Interaction Tasks and Interaction Techniques in 3D User Interfaces

3.1 Introduction

This chapter introduces the basic interaction tasks that the user can perform in a 3D interface and the interaction techniques that can be used to accomplish these interaction tasks.

The understanding of interaction tasks and interaction techniques is necessary to understand the analysis performed in Chapter 4 about the interaction tasks and interaction techniques in a 3D visualization environment for ATCOs.

3.2 Interaction Tasks in 3D User Interfaces

Bowman [Bowman 1999a] proposed three basic interaction tasks in 3D user interfaces: object selection and object manipulation, to handle 3D objects inside the 3D scene, and navigation, to explore the 3D scene.

3.2.1 Object Selection

Object selection involves the specification of one or more objects from a set of objects by the user. Selection may be based on a number of attributes including name, appearance, location, etc. Selecting an object based on shape or spatial proximity may imply some forms of direct manipulation [Shneiderman 1997]; for example, reaching for the mouse placed next to a keyboard in daily life. On the other hand, selecting an object based on the name or other non-spatial attributes may suggest an indirect form such as query [Gabbard 1997].

Selection helps users to make one or many objects active for the object manipulation task. Correspondingly, two general cases of the object selection tasks are: selection of a single object and selection of multiple objects.

3.2.2 Object Manipulation

Object manipulation usually takes place after object selection. Users select the object (or objects) of interest with the intent of setting its position and/or orientation, querying its information, or changing attributes such as colour, size, etc. of the selected object(s). As for the object selection task, the two general cases of object manipulation are: manipulation of a single object and manipulation of multiple objects. The four basic manipulation tasks are: object placement, object rotation, object modification and informative query.

Object placement refers to the relocation of an object. This kind of task is common in daily life and also in 3D environments. Users select the object of interest and move it to a new position. When the user places the selected object to the new position, s/he performs the object placement task.

Similarly, object rotation concerns the observation of the object of interest under different viewpoints by rotating it. This task helps the users to understand the visual features of the object of interest.


Object modification concerns the change of object attributes such as colour, shape, label, etc. This task can be applied to a single object or to a group of objects, and it can help users organize the 3D scene in a manner meaningful to them by customizing object attributes [Gabbard 1997].

Informative query refers to the retrieval of information about an object or a group of objects. One of the simplest types of informative query is the one which returns general information about a particular object. A richer type is the one which allows users to query one or more databases through non-direct-manipulation means.

3.2.3 Navigation

Navigation tasks include exploration, search and manoeuvring [Bowman 2004].

In an exploration task, the user has no explicit goal for his/her movement. S/he browses the 3D scene to obtain information about the objects and their locations within the 3D scene and builds up knowledge of the 3D environment.

Search tasks refer to finding a target location within the 3D environment.

Manoeuvring tasks take place in a local area and involve small, precise movements. The most common use of manoeuvring is to place the viewpoint more precisely within a limited local area to perform a specific task.

3.3 3D Interaction Techniques

Interaction techniques provide the user with the means to perform the interaction tasks mentioned above. This section gives an overview of existing 3D interaction techniques, organized into two groups: Selection-Manipulation and Navigation.

3.3.1 Selection and Manipulation

In practice, selection and manipulation usually go together: selection is often performed to set up a manipulation. There are no pure selection or manipulation techniques, and several selection techniques are also used for manipulation. Therefore, interaction techniques for selection and manipulation can be classified in the same category.

Both perceptual information and abstract information are subject to selection and manipulation. Examples of perceptual information are the position, shape, colour, texture, etc. of a 3D object. Abstract information is information that is not normally directly perceptible in the physical world. For example, information about the visual appearance or surface texture of a table is directly perceptible, while information about its date and place of manufacture is not (this information is thus abstract) [Bowman 1998b; Bowman 2003].

Several selection and manipulation techniques have been created, and several classifications of interaction techniques have been proposed [Bowman 1999a; Poupyrev 1998]. However, these classifications concentrate only on interaction techniques dealing with perceptual information; interaction techniques dealing with abstract information should also be taken into consideration. The following sections introduce the typical existing selection and manipulation techniques, divided into two groups with regard to perceptual and abstract information:


• The first group deals with perceptual information. The interaction techniques in this group support the object selection, object placement and object rotation tasks. The typical interaction techniques in this group include the ray-casting based techniques, the cone-casting based techniques, the virtual hand techniques, the World-In-Miniature technique and the combining techniques. These techniques are presented in Sections 3.3.1.1 to 3.3.1.5.

• The second group deals mainly with abstract information. The interaction techniques in this group support the object selection task, the object modification task and the informative query. They are combinations of some interaction techniques of the first group with system control techniques. The details of the system control techniques, and of their combination with some of the interaction techniques of the first group, are presented in Section 3.3.1.6.

3.3.1.1 Ray-casting based techniques

The ray-casting based techniques allow the user to select 3D objects by pointing at them. Usually, a light ray (straight or curved) is emitted from the hand-held input device to point at 3D objects. When the light ray intersects an object, the user can select it by issuing a trigger event that confirms the selection. Examples of triggers are button presses and voice commands. The selected object can then be attached to the ray for manipulation. With the ray-casting based techniques, the object selection, placement and rotation tasks are performed as follows:

• Object Selection – When this light ray touches the object of interest, the user can trigger a command (e.g., a wand button press) to select it. The object then is attached to the ray for manipulation.

• Object Placement – The selected object is attached to the ray and the user can move the hand-held input device to move the object.

• Object Rotation – The ray-casting based techniques do not offer good performance for the object rotation task. With these techniques, the selected object can only be rotated around the ray; it is impossible to rotate it around other axes.

Several ray-casting based techniques have been reported in the literature. The difference between them lies mostly in the direction and shape of the ray. The commonly used ray-casting technique and the two-handed pointing technique are typical ray-casting based techniques.

Ray-casting technique

The ray-casting technique uses a straight light ray to interact with the 3D objects. This technique allows the user to select and manipulate objects beyond the area of normal reach. The user points at the object with a light ray, and the objects intersecting the ray can be selected, attached and manipulated. The direction of the ray (the pointing direction) is defined by the position and the orientation of the input device.
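The selection step of ray-casting reduces to intersecting a ray, whose origin and direction come from the tracked input device, with the selectable objects and keeping the nearest hit. The sketch below illustrates this with bounding spheres; the scene contents and the wand pose are illustrative, and this is not the implementation used later in this thesis.

```cpp
// Minimal sketch of ray-casting selection: a ray starts at the wand position and follows
// its pointing direction; the intersected object nearest to the device is selected.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Selectable { int id; Vec3 centre; float radius; };   // bounding sphere per object

// Returns the id of the nearest object hit by the ray, or -1 if the ray misses everything.
int pickByRay(Vec3 origin, Vec3 dir /* unit length */, const std::vector<Selectable>& scene)
{
    int   best  = -1;
    float bestT = 1e30f;
    for (const Selectable& s : scene) {
        Vec3  oc = sub(s.centre, origin);
        float t  = dot(oc, dir);                    // distance to closest approach
        if (t < 0.0f) continue;                     // object is behind the device
        float d2 = dot(oc, oc) - t * t;             // squared distance ray <-> centre
        if (d2 > s.radius * s.radius) continue;     // ray passes outside the sphere
        if (t < bestT) { bestT = t; best = s.id; }  // keep the nearest intersection
    }
    return best;
}

int main()
{
    std::vector<Selectable> scene = { { 1, { 0, 0, -5 }, 0.5f }, { 2, { 0, 0, -9 }, 0.5f } };
    int hit = pickByRay({ 0, 0, 0 }, { 0, 0, -1 }, scene);   // wand at origin, pointing -z
    std::printf("selected object: %d\n", hit);               // object 1 occludes object 2
    return 0;
}
```

In a complete system, the object returned here would be attached to the ray once the user issues the trigger event (for example a wand button press).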


Figure 3-1. Ray-casting technique: the red box can be selected when the light rays (in red) emitted from hand-held input devices intersect it (Pol, Ribarsky et al. 1999 [Pol 1999], GVU Center, Georgia Tech)

Two-handed pointing

Another way to specify the pointing direction is to use two hands for selection with help from two hand-held input devices like the Data Gloves or the Wand. The coordinated movements of two hands can control the direction and the shape of the ray.

For example, the pointing direction can be specified using a two-handed technique: one hand specifies the starting point of the light ray while the other specifies where the ray is pointing. Alternatively, the distance between the hands can be used to adjust the length of the light ray, and by slightly twisting the hands the user can curve the ray. This provides a simple mechanism for disambiguation (that is, for very fine and precise selection) among several objects arranged in the depth direction, and it simplifies pointing at fully or partially occluded objects [Olwal 2003].
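The following is a minimal sketch of how the pointing direction can be derived from two tracked hands, one anchoring the ray and the other steering it; the hand positions are illustrative, and bending the ray as in the Flexible Pointer would require an additional curve parameter not shown here.

```cpp
// Sketch of two-handed pointing: the ray starts at the rear hand and points towards the
// front hand; the hand positions below are illustrative.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

int main()
{
    Vec3 rear  = { 0.0f, 1.2f,  0.0f };      // hand that anchors the ray origin
    Vec3 front = { 0.1f, 1.2f, -0.4f };      // hand that steers the ray
    Vec3 d   = { front.x - rear.x, front.y - rear.y, front.z - rear.z };
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    Vec3 dir = { d.x / len, d.y / len, d.z / len };   // unit pointing direction
    // The inter-hand distance (len) could additionally modulate the ray length or curvature.
    std::printf("ray from rear hand along (%.2f %.2f %.2f)\n", dir.x, dir.y, dir.z);
    return 0;
}
```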

Figure 3-2. Flexible Pointer using the two-handed pointing technique: the ray is controlled by two hand-held input devices so that it becomes a curve (in yellow); this curve can help to select the target object (the black box in this case) while avoiding the obscuring object in the foreground (Olwal and Feiner 2003 [Olwal 2003], UIST’03)

A disadvantage of this approach is the fact that both hands must be tracked. However, it allows for richer and more effective pointing interaction.


3.3.1.2 Cone-casting based techniques

The cone-casting based techniques are derived from the ray-casting technique: a cone-shaped cursor is used to interact with 3D objects instead of a light ray. With the cone-casting technique, the object selection, placement and rotation tasks are performed as follows:

• Object Selection – If there is only one object in the conic selection volume, the user can trigger a command to select it. If two or more objects fall into the conic volume, the object closest to the centre line of the selection cone is selected; if the angle to the centre line is the same for both objects, the object closer to the device is selected (a short sketch of this rule follows the list).

• Object Placement & Object Rotation – This technique is different from the ray-casting only in the selection part. The object placement and the object rotation are performed in as similar manner to the case of ray-casting technique.

There are two forms of cone-casting technique: the Spotlight technique and the Aperture technique.

Spotlight

In the spotlight technique [Liang 1994], the pointing direction is defined in the same way as in the ray-casting technique, but the light ray is replaced with a conic selection volume whose apex is at the input device. Objects that fall within this cone can be selected. The shortcoming of this technique is that more than one object can fall into the spotlight, especially with increased distance to the object.

Aperture

The aperture technique [Forsberg 1996] is a modification of the spotlight technique that allows the user to interactively control the spread of the selection volume. The pointing direction is defined by the 3D position of the user’s viewpoint in the virtual space, estimated from the tracked head location, and the position of a tracked hand, represented as an aperture cursor within the 3D environment. The user can interactively control the spread angle of the selection volume by bringing the hand sensor closer or moving it further away. The aperture technique thus refines the spotlight technique by providing an interactive mechanism for object disambiguation within the conic volume.

Figure 3-3. The Aperture technique: the conic selection volume is described by the eye point (viewpoint) and the aperture position (Forsberg et al. [Forsberg 1996])


3.3.1.3 Virtual Hand techniques

Virtual Hand techniques allow the users to select and to manipulate virtual objects with their hands. Typically, a 3D cursor is used to visualize the current location of the user’s hand; the 3D cursor can be a 3D model of a human hand. The position and orientation of the input device are mapped onto the position and orientation of the virtual hand.

To select an object, the user simply intersects the 3D cursor with the target of the selection and then uses a trigger (e.g., button press, voice command, or hand gesture) to pick it up. The object is then attached to the virtual hand and can easily be translated and rotated within the 3D environment until the user releases it with another trigger.

• Object Selection – To select an object, the user simply intersects the virtual hand with the target, and issues a trigger (a hand gesture, a button press) to pick it up.

• Object Placement – After being selected, the target object is attached to the virtual hand. The mapping of the user’s hand motion to the virtual hand’s motion facilitates the placement of object because every user’s hand motion corresponds to a virtual hand motion. Within the reach, the user can move an object freely along X-axis, Y-axis, Z-axis or any other axis.

• Object Rotation – The user can rotate the hand-held input device (a data glove or a wand) to rotate the selected object. The user has full freedom to rotate the object around its centre or around any axis.

Two typical techniques in this category are ‘classical’ virtual hand and Go-Go technique.

‘Classical’ Virtual Hand

Figure 3-4. The virtual hand is controlled by a CyberGlove™ (image courtesy of the Virtual Reality Project at Ohio Supercomputer Center, www.osc.edu)

The classical virtual hand technique is a direct mapping of the user’s hand motion to a virtual hand’s motion in a 3D environment, typically linearly scaled to establish a direct correspondence between the device and the coordinate system of 3D environment.

The fundamental problem with such techniques is that only objects within the user’s reach can be selected and manipulated. In order to select objects located further away, the user must employ a travel technique (discussed in Section 3.3.2) to move to the object, which in many cases is inconvenient and increases the complexity of the 3D UI.

Go-Go

The Go-Go technique [Poupyrev 1996] interactively extends the virtual hand’s reach by applying a non-linear mapping function to the user’s real hand extension. All features of the “Classical” Virtual Hand also hold for the Go-Go technique. In addition, the Go-Go technique allows users both to bring faraway objects near and to move near objects further away. Therefore, unlike the “Classical” Virtual Hand, the Go-Go technique allows the user to select and manipulate objects that are both close to the user and at a distance.
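The non-linear mapping is commonly written as F(r) = r for r < D and F(r) = r + k(r - D)^2 beyond a threshold D [Poupyrev 1996]. The sketch below uses illustrative values of D and k, not those of the original paper or of this thesis.

```cpp
// Sketch of a Go-Go style non-linear arm extension: within a threshold D the virtual hand
// follows the real hand 1:1; beyond D the virtual reach grows quadratically.
// The values of D and k are illustrative only.
#include <cstdio>

static float goGoReach(float realReach /* metres from the torso */, float D = 0.45f, float k = 6.0f)
{
    if (realReach < D) return realReach;            // isomorphic (1:1) zone
    float excess = realReach - D;
    return realReach + k * excess * excess;         // amplified zone
}

int main()
{
    for (float r = 0.1f; r <= 0.71f; r += 0.2f)
        std::printf("real %.2f m -> virtual %.2f m\n", r, goGoReach(r));
    return 0;
}
```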

3.3.1.4 World-In-Miniature

The World-In-Miniature (WIM) technique [Stoakley 1995] provides the user with a miniature handheld model of the VE, which is an exact copy of the VE at a smaller scale. The WIM also gives the user another point of view from which s/he can observe the scene; the user can change that point of view by direct manipulation, turning the model in her or his hands.

• Object Selection – The user can select the object of interest by either directly pointing (using the ray-casting technique) to the object itself or by pointing to its representation in the WIM. Therefore, the users can select objects at any distance.

• Object Placement & Object Rotation – WIM allows the user to select and manipulate objects both, within and outside the user’s range. Small movements of the objects in the WIM are magnified so that moving a chair a millimetre might move it ten feet in the larger world, making precise placement or rotation difficult.

3.3.1.5 Combining techniques

HOMER

Figure 3-5. The HOMER technique


HOMER [Bowman 1997a] stands for Hand-centered Object Manipulation Extending Ray-casting. This technique is a combination of the ray-casting technique and the classical Virtual Hand technique. The user selects an object using a ray-casting technique, and instead of the object being attached to the ray, the user’s virtual hand instantly moves towards the object and attaches to it. The technique then switches to the manipulation mode, allowing the user to place and rotate the selected object (Figure 3-5).

Voodoo Dolls

Voodoo Dolls [Pierce 1999b] is a two-handed interaction technique that uses a pair of pinch gloves (Figure 3-6) to allow the user to seamlessly switch between different frames of reference for a manipulation task and to select objects of varying sizes and at different distances.

Figure 3-6. Voodoo Dolls interaction technique [Pierce 1999b]

Voodoo Dolls allows manipulation of virtual objects indirectly, using temporary, miniature, handheld copies of objects called dolls. Similar to the WIM, the user manipulates these dolls instead of the virtual objects (Figure 3-7).

Figure 3-7. Creating a doll in the Voodoo Dolls technique [Pierce 1999b]

• Object Selection – The user can select an object by creating a doll of this object. As a result, objects of any size, in any state of occlusion and at any distance can be selected and manipulated.

• Object Placement & Object Rotation – With the Voodoo Dolls technique, the user dynamically creates dolls: transient, hand held copies of the objects whose effects on the objects they represent, are determined by the hand holding them.


3.3.1.6 System control technique

The issuing of commands is an essential way to access any computer system’s functionality. System control techniques are the techniques that support the issuing of commands.

In 2D interfaces, system control is supported by specific interaction styles such as pull-down menus, text-based command lines or tool palettes [Preece 2002]. Many of these techniques can be applied to 3D UIs.

System control techniques for 3D UIs include graphical menus (visual representations of commands) and voice commands (commands accessed via voice).

Graphical menus

Graphical menus for 3D UIs are the 3D equivalent of the 2D menus that have proven to be a successful system control technique in desktop UIs [Bowman 2004]. Adapted 2D menus are the commonly used graphical menus in 3D UIs. These menus basically function in the same way as they do on the desktop. Examples of adapted 2D menus are pull-down menus, pop-up menus, etc.

The commands are pre-defined through menu items, and selecting a menu item issues the corresponding command. The ray-casting technique is usually used for the selection of menu items: the user points to a command on the menu and triggers it (e.g., by pushing a button on the input device). Figure 3-8 shows an example of using an adapted 2D menu for the object selection task.

Figure 3-8. The ray-casting technique is used for selecting the menu item
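Menu item selection with ray-casting, as described above, reduces to intersecting the pointing ray with the bounding volume of each item and picking the closest hit. A minimal sketch under that assumption; the box representation and function names are illustrative.

import numpy as np

def ray_hits_box(origin, direction, box_min, box_max):
    # Slab test: return the entry distance if the ray hits the axis-aligned
    # box of a menu item, otherwise None.
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)
    t_near, t_far = -np.inf, np.inf
    for axis in range(3):
        if abs(direction[axis]) < 1e-9:
            if origin[axis] < box_min[axis] or origin[axis] > box_max[axis]:
                return None
            continue
        t1 = (box_min[axis] - origin[axis]) / direction[axis]
        t2 = (box_max[axis] - origin[axis]) / direction[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    if t_near <= t_far and t_far >= 0.0:
        return max(t_near, 0.0)
    return None

def pick_menu_item(origin, direction, items):
    # items: list of (label, box_min, box_max); returns the closest item hit.
    hits = []
    for label, lo, hi in items:
        t = ray_hits_box(origin, direction, lo, hi)
        if t is not None:
            hits.append((t, label))
    return min(hits)[1] if hits else None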

An effective use of the graphical menu is the query interface. The query interface allows the user to determine what types of actions are available for an object [Esposito 1996]. Representing the query interface through a graphical menu system is useful to show all available queries to the user (Figure 3-9).


Figure 3-9. Using the ray-casting technique and the graphical menu for querying information about objects

Voice command

Voice command is a type of hands-off interaction using natural language. It is often combined with other interaction styles. The issuing of voice commands can be performed via simple speech recognition or by means of spoken dialogue techniques. Speech recognition techniques are typically used for issuing single commands to the system, while a spoken dialogue technique is focused on promoting discourse between the user and the system [Bowman 2004].

3.3.1.7 Summary

To summarize, several techniques have been proposed for the selection and manipulation tasks in 3D environments. The Virtual Hand techniques, the ray-casting based techniques, the cone-casting based techniques, the World-In-Miniature and combinations of these techniques allow selection and manipulation of 3D objects based on their perceptual information. System control techniques such as voice commands and the combination of graphical menus with the ray-casting technique support the selection and manipulation of 3D objects based on their abstract information.

Figure 3-10 illustrates the summary of the typical techniques for selection and manipulation.


Figure 3-10. Typical techniques for selection and manipulation

3.3.2 Navigation

Navigation in 3D environments is made of two components: the motor component and the decision-making component. In the literature, the motor component is called by different names such as motion [Darken 1996a], locomotion [Arns 2002; Chance 1998], travel [Bowman 1998a] or viewpoint motion control [Bowman 1997b]. The decision-making component is usually known as wayfinding. In this thesis, we use the term travel for the motor component and wayfinding for the decision-making component.

The interaction techniques for navigation include travel techniques and wayfinding aids. Travel techniques support exploration and manoeuvring tasks, while wayfinding aids are helpful for search tasks.

Sections 3.3.2.1 and 3.3.2.2 give an overview of the existing travel techniques and wayfinding aids.

3.3.2.1 Travel

Travel is defined as the control of the user’s viewpoint motion in the 3D environment. Travel techniques allow a user to move within the 3D environment in order to obtain different views of the scene [Bowman 1998a].

There are several metaphors for travel in 3D environment. In general, they are classified into two categories based on user movement in 3D environment: the physical movement and the virtual movement [Bowman 2004]. In this section, we introduce some typical travel techniques in these two categories.

Physical Movement


Physical movement refers to the fact that the user has to change her/his physical position in order to change her/his viewpoint. Travel techniques in this category allow users to walk forwards, backwards and sideways, and to rotate themselves in order to obtain different viewpoints of the VE. They imitate natural methods of locomotion in the physical world. This kind of travel technique is appropriate for applications in which realism is required, such as simulations, video games and entertainment systems. Examples of techniques in this category are Walking [Slater 1995], Walking in Place [Usoh 1999] and devices simulating walking, such as treadmills [Darken 1997; Iwata 1999].

Virtual Movement

The virtual movement refers to the use of input devices to change the user’s viewpoint. Users can simply click on a button or manipulate a joystick of a wand in order to move to another place in the Virtual Environment. There are different metaphors for this travel technique.

Scene In Hand

In the Scene-In-Hand or World-In-Hand navigation, the entire Virtual Environment is attached to the user’s hand [Ware 1990]. When the user moves or rotates her/his hand, the entire VE moves along with it, resulting in a change of the user’s viewpoint (see Figure 3-11). The movement can be scaled so that a small movement of the user’s hand creates a large change in the position of the VE, allowing the user to cover large distances quickly. Similarly, scaling can be used to provide the user with very fine movement control.

Figure 3-11. An illustration of Scene-In-Hand travel technique: the scene is attached to the 6 DOF input device and the user can control the scene by moving or rotating the device (image courtesy of Data Visualization Research Lab [DVRL 2005])
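A minimal sketch of the Scene-In-Hand update, assuming a 6 DOF tracker whose incremental motion is applied, possibly scaled, to the transform of the whole scene; the names and the gain value are illustrative.

import numpy as np

def scene_in_hand_update(scene_translation, scene_rotation,
                         hand_delta_pos, hand_delta_rot, gain=5.0):
    # scene_translation: current 3D translation of the scene
    # scene_rotation: current 3x3 rotation matrix of the scene
    # hand_delta_pos: change in tracked hand position since the last frame
    # hand_delta_rot: incremental 3x3 rotation of the tracked hand
    # gain > 1 amplifies hand motion (covering large distances quickly);
    # gain < 1 attenuates it (fine movement control).
    new_translation = np.asarray(scene_translation, dtype=float) + gain * np.asarray(hand_delta_pos, dtype=float)
    new_rotation = np.asarray(hand_delta_rot, dtype=float) @ np.asarray(scene_rotation, dtype=float)
    return new_translation, new_rotation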

Camera-in-Hand

Camera-In-Hand or Eye-In-Hand is a travel technique using motion trackers. A tracker is held in the hand, and the absolute position and orientation of that tracker in a defined workspace specifies the position and orientation of the camera from which the 3D scene is drawn. The tracker is imagined to be a virtual camera looking at this world. The user travels by moving the tracked hand in the workspace.


Figure 3-12. An illustration of the Camera-In-Hand travel technique (image courtesy of the Data Visualization Research Lab [DVRL 2005])

The technique can be confusing because the user has an exocentric view of the workspace, while the 3D scene is drawn from an egocentric point of view. It therefore requires considerable effort to concentrate on the travel instead of on other activities [Ware 1990]. Scene-In-Hand has been shown to be easier to use than Camera-In-Hand [Ware 1989].

Flying Vehicle

In this technique, the input device (e.g., the Wand) is considered as a control device for a virtual vehicle. The Virtual Environment is perceived from this virtual vehicle: the displayed images correspond to what the user would see if s/he were placed inside the vehicle [Ware 1990]. Figure 3-13 illustrates the Flying Vehicle technique in which the Wand is used as the input device. The user can fly her/his viewpoint through the Virtual Environment by manipulating the Wand. Joystick input and changes in the position and orientation of the Wand through hand motion result in changes in the directional and rotational velocity of the viewpoint.

Figure 3-13. An illustration of the Flying Vehicle travel technique (image courtesy of the Data Visualization Research Lab [DVRL 2005])
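The Flying Vehicle control loop described above can be sketched as integrating, frame by frame, a velocity derived from the joystick deflection and the Wand’s pointing direction; the speed limit and names below are illustrative.

import numpy as np

def fly_step(viewpoint_pos, wand_forward, joystick_y, dt, max_speed=50.0):
    # joystick_y in [-1, 1]: signed speed fraction (push forward to fly forward,
    # pull back to reverse); wand_forward: unit forward vector of the Wand;
    # dt: frame time in seconds.
    velocity = max_speed * joystick_y * np.asarray(wand_forward, dtype=float)
    return np.asarray(viewpoint_pos, dtype=float) + velocity * dt

# Example: flying forward at half speed for one 20 ms frame.
pos = fly_step([0.0, 0.0, 0.0], [0.0, 0.0, -1.0], joystick_y=0.5, dt=0.02)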

Teleportation

In real life, we all wish to have the ability to point at a desired location on a map and be instantly “teleported” to this location. Teleportation is based on this idea. In the Virtual Environment context, the users can have a map of Virtual Environment and can point to a location on this map to get to this location.


Figure 3-14. An illustration of Teleportation: the user can point to a location on the map and go directly to this location in the 3D scene (image courtesy of Bowman et al. [Bowman 2000])

However, when using teleportation, users may have difficulties in building a mental representation of the environment and in understanding the relationship between the initial position and the new position after the teleport. In fact, some studies show that teleportation is confusing and disorienting for the user [Bowman 1998a].
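A minimal sketch of the map-based teleportation itself, assuming the map is an axis-aligned, scaled-down copy of the ground plane; all names and the eye-height value are illustrative.

import numpy as np

def map_to_world(map_point, map_origin, map_size, world_min, world_max):
    # Convert a 2D point picked on the map into a 2D ground-plane position.
    # map_point, map_origin: 2D map coordinates; map_size: map edge length
    # (a square map covering the whole ground plane is assumed);
    # world_min, world_max: 2D extent of the world's ground plane.
    u = (np.asarray(map_point, dtype=float) - np.asarray(map_origin, dtype=float)) / map_size
    world_min = np.asarray(world_min, dtype=float)
    world_max = np.asarray(world_max, dtype=float)
    return world_min + u * (world_max - world_min)

def teleport(picked_map_point, map_origin, map_size, world_min, world_max, eye_height=1.7):
    # Instantly place the viewpoint at the corresponding world location
    # (no smooth transition, as in basic teleportation).
    gx, gz = map_to_world(picked_map_point, map_origin, map_size, world_min, world_max)
    return np.array([gx, eye_height, gz])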

World in Miniature

A World-In-Miniature (WIM) [Stoakley 1995] is a hand-held miniature copy of the entire VE. As in teleportation, the user specifies in the WIM where s/he wants to move to, and is then taken to that location via a smooth animation through the environment.

Figure 3-15. World-In-Miniature (image courtesy of Stoakley et al. [Stoakley 1995])

The smooth animation allows the user to maintain his/her frame of reference and to avoid disorientation in the Virtual Environment. The user can also rotate the WIM in his/her hands and view it from any angle, which can sometimes provide him/her with the desired information without having to actually travel through the VE [Stoakley 1995]. However, this technique is less useful if the distance to be travelled is quite large, or if the path has obstacles that must be avoided while travelling [Pausch 1995].
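The smooth transition to the location picked in the WIM can be sketched as a simple eased interpolation of the viewpoint over a fixed duration; the easing function, duration and names are illustrative.

import numpy as np

def wim_travel_positions(start, target, duration=2.0, fps=60):
    # Yield the intermediate viewpoint positions of a smooth, eased animation
    # from the current position to the position specified in the WIM.
    start = np.asarray(start, dtype=float)
    target = np.asarray(target, dtype=float)
    frames = max(1, int(duration * fps))
    for i in range(1, frames + 1):
        t = i / frames
        ease = t * t * (3.0 - 2.0 * t)          # smoothstep: slow in, slow out
        yield (1.0 - ease) * start + ease * target

# Example: animating from the current viewpoint to a location picked in the WIM.
path = list(wim_travel_positions([0.0, 1.7, 0.0], [120.0, 1.7, -80.0]))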


3.3.2.2 Way-finding

Way-finding is the cognitive process of defining a path through an environment, using and acquiring spatial knowledge, aided by both natural and artificial cues, the latter also called wayfinding aids. “Cognitive” here means that the user makes decisions such as “where am I?” or “which direction should I go?” by mentally processing inputs (information obtained from the environment) and producing outputs (a trajectory to follow). Therefore, wayfinding does not involve only movement but also the tactical and strategic parts that guide the movement [Darken 2001].

Maps, artificial landmarks and trails are the main wayfinding aids [Darken 1996b]. The following sections introduce more details about these wayfinding aids.

Maps

Figure 3-16. North-up map versus forward-up map, with a you-are-here marker and the direction of travel (image courtesy of the MOVES Institute [MOVES 2005], Naval Postgraduate School)

A map provides an exocentric representation of an environment [Bowman 2004]. By using maps, users can simultaneously have an overview of the scene and a detailed view of a location in the scene. The main problems of map design are:

• the map size (dependent on the size and the resolution of the 3D environment);

• the map placement (the map should be easy to access and should occlude the scene as little as possible);

• the map orientation (for example, a forward-up map is preferable for the egocentric view, such as the whole view of the environment, while a north-up map is better for the exocentric view, such as the view within a sector [Darken 1999]);

• the map content (the map should be legible and should show the organizational structure of the 3D-ATC environment).

A special feature of map usage is the “You-Are-Here” (YAH) marker. A YAH marker can be added to a map to help the user gain spatial awareness by interactively and dynamically showing her/his viewpoint position and/or orientation on the map.
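Placing and orienting a YAH marker amounts to the inverse of the map-to-world mapping sketched earlier: project the viewpoint onto the map and rotate the marker with the heading. A minimal sketch; the coordinate conventions and names are illustrative.

import numpy as np

def yah_marker(viewpoint_pos, viewpoint_forward, world_min, world_max, map_origin, map_size):
    # Returns the 2D map position of the marker and the heading in radians.
    # For a forward-up map, the map itself would instead be rotated by -heading
    # so that the marker always points "up".
    world_min = np.asarray(world_min, dtype=float)
    world_max = np.asarray(world_max, dtype=float)
    ground = np.array([viewpoint_pos[0], viewpoint_pos[2]])        # drop the height
    u = (ground - world_min) / (world_max - world_min)              # normalised [0,1]^2
    marker_pos = np.asarray(map_origin, dtype=float) + u * map_size
    # Heading convention assumed here: 0 rad when looking along -Z.
    heading = np.arctan2(viewpoint_forward[0], -viewpoint_forward[2])
    return marker_pos, heading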

Artificial landmarks

Artificial landmarks are easily distinguishable objects that can be used to maintain spatial orientation and serve for distance or direction estimation. Artificial landmarks may be added to any environment to support users’ wayfinding tasks. For example, the user can place an artificial landmark (e.g., a flag) to mark a region and can easily return to this region at any moment thanks to this easily distinguishable landmark. The main problems of landmark design are the visual representation of landmarks (e.g., the landmark should be easily distinguishable through a contrasting colour, different lighting, a contrasting shape or a different size) and the location of landmarks (e.g., the artificial landmark should be placed so that it is easy to access).

Figure 3-17. The markers in different colours which help the user to remember some important locations are artificial landmarks in this context (Image courtesy of Darken and Peterson [Darken 2001], Naval Postgraduate School)

Trails

Trails help the user to “retrace her/his steps” in an environment, to see which parts of the world have been visited or which operations s/he has performed recently, and thus to get out of a disorientation situation. A trail can be made up of a simple line or of markers that include directional information, just like footprints in the real world.

Figure 3-18. In this example, the user moved backwards and is looking towards the start location. The footprints show the direction the user was looking at each point in time (figure courtesy of Darken and Peterson [Darken 2001], Naval Postgraduate School)


3.3.2.3 Summary

To summarize, both travel techniques and wayfinding aids support navigation in a 3D environment. Travel techniques can involve either physical or virtual movements. The typical techniques for virtual movement are Scene-In-Hand, Camera-In-Hand, Flying Vehicle, Teleportation and World-In-Miniature. Way-finding aids include maps, artificial landmarks and trails. Figure 3-19 illustrates the summary of the typical techniques for travel and wayfinding.

Figure 3-19. Typical techniques for navigation

3.4 Conclusion

This chapter has introduced the background of the interaction tasks and interaction techniques in 3D User Interfaces. Object selection, object manipulation and navigation are the three main interaction tasks that the users can perform in a 3D interface.

Object selection sets up the object manipulation task. Object manipulation helps the user to change the perceptual information of 3D objects (e.g., visual features such as colour, size, etc.) or to query the abstract information of the 3D objects. The object manipulation task is composed of different sub-tasks, including object modification, object rotation, object placement and informative query. The ray-casting techniques, the cone-casting techniques, the virtual hand techniques, the World-In-Miniature, the combining techniques and the system control techniques are among the most commonly used selection and manipulation techniques for perceptual information. The combination of system control techniques and the ray-casting technique is commonly used to deal with abstract information.

Navigation is composed of exploration, search and manoeuvring tasks. Travel techniques such as Scene-In-Hand, Flying Vehicle and Camera-In-Hand support the exploration and manoeuvring tasks, while wayfinding aids such as maps and artificial landmarks support search tasks.


Chapter 4 will analyze the role of the interaction tasks in the context of 3D-ATC and will investigate the possibility to use the existing interaction techniques for these interaction tasks in a 3D-ATC environment.


Chapter 4

An Analysis of Interaction Tasks and Interaction Techniques in a 3D Environment for Air Traffic Control


Chapter 4 - An Analysis of Interaction Tasks and Interaction Techniques in a 3D Environment for Air Traffic Control

4.1 Introduction

4.1.1 Motivation

As stated in Chapter 2 (Section 2.3), several works [Azuma 1996; Lange 2003; Persiani 2000] aiming at building a 3D environment for ATC have been carried out previously. However, these works seem rather technology-driven, in that the 3D-ATC environment is built “intuitively”, mostly derived from the current 2D interfaces for air traffic control, and they concentrate mostly on visualization features. These applications propose some interaction tasks and interaction techniques for the 3D-ATC environment without justifying their choice among the existing interaction techniques. In other words, there is a lack of understanding about the interaction tasks and the interaction techniques that can be used for a 3D-ATC environment.

The state of the art of 3D interfaces presents different forms of interaction tasks in 3D UIs and several 3D interaction techniques for these tasks. However, which interaction tasks can be performed in a 3D-ATC environment, and which interaction techniques can be used for them? The analysis performed in this chapter contributes to answering these questions and provides a vision of how interaction tasks and interaction techniques will be used in a future 3D-ATC environment.

4.1.2 The scope of the analysis

This analysis concerns interaction tasks that affect 3D visualization of traffic within sectors in the approach and en-route control. The analysis only concentrates on two indispensable parts of an ATC system: the dynamic part (the air traffic) including flights and the static part (the airspace) including sectors and routes.

Both perceptual information and abstract information about objects are considered when investigating interaction tasks and interaction techniques. In the ATC context, perceptual information concerns the visual representation of the airspace, sectors, aircraft, etc. (e.g., geometry, lighting, colour, texture); abstract information concerns information about sectors, flights, etc. that does not lend itself easily to a visual representation, such as the capacity or the load of a sector.

4.2 Analysis approach

Based on the interaction features, we define the 3D-ATC environment as a combination of four components: Object, View, Interaction Task and Interaction Technique (Figure 4-1).


• Object refers to the visual representation of an ATC object and is a source for querying abstract information.

• View defines the point of observation of an object or a group of objects.

• Interaction Task defines the actions that the user can perform in a 3D-ATC environment. The interaction tasks include Navigation, Object Selection and Object Manipulation.

• Interaction Technique defines the way an interaction task is performed. We consider two groups of interaction techniques: Selection-Manipulation and Navigation. The interaction techniques in the Selection-Manipulation group include the interaction techniques for the object selection task and the object manipulation task. The interaction techniques in the Navigation group include the travel techniques and the wayfinding aids.

Figure 4-1. Interaction features as a combination of Object, View, Interaction Task and Interaction Technique

All four components are analysed. The analysis begins with the Object, followed by the View. Because of the evident link between interaction tasks and interaction techniques, the Interaction Task and the Interaction Technique are analysed together and in two different contexts: the interaction with the Object and the interaction with the View. Object Selection and Object Manipulation are the interaction tasks performed on the Object, while Navigation is the interaction task performed on the View.

Section 4.3 introduces the analysis on Object; Section 4.4 describes the analysis on the View. Section 4.5 and Section 4.6 analyse the interaction tasks and interaction techniques on the Object and the View respectively.

4.3 Object

The mobility of an object is a very important feature for the analysis of interaction techniques. Considering the mobility of ATC objects, we categorize them into two groups: the “static group”, which contains the airspace, the sectors and the flight routes, and the “dynamic group”, which contains the flights.


We consider the airspace and the air traffic as the main components of a basic ATC environment. The airspace is the main component of the “static group”, while the flight is the main component of the “dynamic group”. The interactions with the visual representations of the airspace (including the sectors) and of the flights are the main interactions that could take place in a 3D-ATC environment. A detailed analysis of these two components is therefore indispensable for the analysis of interaction in the next sections.

4.3.1 Airspace

The airspace is composed of routes and predefined airspace volumes such as sectors. A route is created from predefined waypoints and route segments; each route segment connects two waypoints. The visual representation of the airspace offers an overview of sector-to-sector traffic, while the sector representation offers a detailed view of the traffic within a sector.

Figure 4-2. Airspace

Figure 4-3 shows an example of visual representation of airspace and sector in the ATM application of Linköping University.

Figure 4-3. Airspace and sector (image from ATM application developed by Linköping University)


Regarding the type of information, the visual representations of sector, waypoint and route segment are perceptual information while the information of sector such as Capacity (estimate of the number of flights per unit time that can be accommodated by the sector), Demand (estimate of the number of flights per unit time that plan to pass through the sector), Load (actual utilization of sector by aircraft in a specific time period), etc. is abstract information.

4.3.2 Air Traffic

Air traffic is made of flights entering and exiting the sectors.

Every flight is visually represented with an aircraft. The visual representation of a flight can help to visualize the current position of the flight in the sector. Figure 4-4 shows an example of visual representation of flight in the ATM application of Linköping University. The visual representation of waypoint and route segment belonging to airspace is also shown in this figure.

Figure 4-4. Flight and Flight route representation (image from ATM application developed by Linköping University)

Regarding the type of information, the visual representation of a flight is perceptual information, while the information related to the flight, such as the Trajectory (the planned four-dimensional path of the aircraft) or the Aircraft Type, is abstract information.

To sum up this section about the Object, the visual representations of flights, sectors, flight routes and waypoints are the main ATC objects subjected to interaction tasks in this analysis. A common point of these ATC objects is that they all have an identifier: every flight has a callsign, and sectors and waypoints have their own identifiers. This can facilitate the use of some specific interaction techniques.


4.4 View

We can consider two types of view based on the level of detail of the 3D scene: the global view and the detailed view. The level of “global” or “detail” can be changed through the navigation task. The view also affects the size of the objects, which in some cases can affect the interaction with them: when changing from a detailed view to a global view during navigation, the size of the objects in the scene is reduced.

Global view. The global view shows the entire 3D scene. In the context of the 3D-ATC environment, the view of the airspace in which the user can see several sectors, or the view of flight routes that pass through different sectors, are examples of the global view. Figure 4-5 shows an example of a global view of flight routes from different sectors, extracted from the ATM application of Linköping University.

Figure 4-5. Example of a global view of flight routes (image from ATM application developed by Linköping University)

Detailed View. The detailed view shows the details of a zone of interest. In the ATC context, examples of detailed views are the view of a particular region of a sector or the view on a specific flight. Figure 4-6 shows a detailed view on the traffic within a region in a sector, also extracted from the ATM application of Linköping University.


Figure 4-6. Example of a detailed view within a sector (image from ATM application developed by Linköping University)

4.5 Interaction tasks with the Object: Selection-Manipulation

The object selection task and the object manipulation tasks, including object modification, object placement and informative query, are analyzed consecutively in Sections 4.5.1 to 4.5.4. The object rotation task is not analyzed because it is mostly useful in design applications (e.g., automobile or aircraft design), where designers rotate a 3D model to inspect the details of a product; this does not seem to be the case in the context of a 3D-ATC environment.

4.5.1 Object Selection

The object selection task is considered under two views: the global view and the detailed view. In the global view, users can see the entire traffic in the sector through the visual representations of sectors and of other objects such as flight routes and flights. In the detailed view, controllers can observe and then select flight representations and flight routes (waypoints, route segments). The visual representations of flights, sectors, waypoints and route segments can all be subjected to the object selection task (as summarized in Figure 4-7).


Figure 4-7. ATC objects that are subjected to the object selection task

Examples of the object selection task in a 3D-ATC environment are as follows: select a sector to see its information, select a flight representation to see the information of this flight, select two or more flight representations to see if there is any potential conflict between (among) them, select the flights which have a given flight level, select a waypoint for manipulation, etc.

Selection of single object and selection of multiple objects will be considered in choosing the appropriate selection technique.

We consider five groups of selection techniques for the object selection task: the virtual hand techniques, the ray-casting based techniques, the cone-casting based techniques, the World-In-Miniature and the system control technique (Figure 4-8).

Figure 4-8. The interaction techniques considered in the analysis of the object selection task

4.5.1.1 Ray-casting based techniques, cone-casting based techniques and virtual hand techniques

Ray-casting based techniques, cone-casting based techniques and virtual hand techniques are among the most popular techniques for the object selection task. The ray-casting based techniques and the cone-casting based techniques provide notably better performance in selection tasks than the virtual hand techniques [Bowman 2004]. However, the cone-based techniques, Spotlight and Aperture, have difficulties disambiguating the selection when a single object has to be selected from a set of closely spaced objects [Forsberg 1996] (Figure 4-9).

Figure 4-9. The disambiguation problem when selecting a single object from a set of closely spaced objects

Two-handed pointing requires both hands for interaction and it is quite complicated because the user has to control the direction of the ray every time s/he performs an object selection task.

Thus, the ray-casting technique seems an appropriate candidate for the object selection task. However, ray-casting does not directly support the selection of multiple objects, because only the object touched by the ray can be selected; the user is not able to select several objects at the same time. The user can still select multiple objects by touching them with the ray one after another; in this case, a multi-selection mechanism can be added to support this type of selection, e.g., keeping a special button of the input device (e.g., the Wand) pressed while selecting more than one object.

4.5.1.2 World-In-Miniature

With the World-In-Miniature, the user has to use the ray-casting technique to select the items representing the ATC objects in the miniature. These items are usually small; when the objects are already small in the global view, their counterparts in the WIM are even smaller. This is the biggest drawback of using the WIM for selection in a 3D-ATC environment.

4.5.1.3 Voice command

Normally, each ATC object such as a sector, a flight or a waypoint has an identifier. This facilitates voice selection: regardless of the size and distance of objects, they can be selected by speaking their identifiers.

Here we only concentrate on the speech recognition technique for issuing the commands to the system. The predefined voice commands can help the user to perform several tasks from simple to complicated tasks.

In the context of the object selection task, voice commands can be used to select a single object or multiple objects. For example, we can define the command “SELECT Identifier” to select a single object through its identifier, or “SELECT GROUP Condition” to select multiple objects that satisfy a given condition.
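A minimal sketch of how such predefined commands could be dispatched once the speech recognizer has produced a text string; the grammar (“SELECT Identifier”, “SELECT GROUP Condition”) follows the examples above, while the scene object and its helper methods are hypothetical stand-ins for the application’s selection logic.

def handle_voice_command(utterance, scene):
    # Dispatch a recognized utterance such as "SELECT AFR1234"
    # or "SELECT GROUP FL320" to the selection machinery.
    tokens = utterance.strip().upper().split()
    if len(tokens) >= 3 and tokens[0] == "SELECT" and tokens[1] == "GROUP":
        condition = " ".join(tokens[2:])
        return scene.select_where(condition)      # multiple objects (hypothetical helper)
    if len(tokens) == 2 and tokens[0] == "SELECT":
        return scene.select_by_id(tokens[1])      # single object by identifier (hypothetical helper)
    return None                                   # unrecognized command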


4.5.1.4 Graphical menu and Ray-casting technique

The graphical query interface, a combination of the graphical menu and the ray-casting technique, can provide pre-built queries for commonly used questions such as “Which flights have the flight level FL320?” or “Where is the waypoint WPT1?”. The result of such a query can be an object or a group of objects, which is highlighted and ready for the manipulation task.

The pre-built queries can be graphically presented by items of a query interface (e.g. the menu system) and the user selects the menu item to accomplish the query. The ray-casting technique is used to select items on the menu.

Figure 4-10 presents an example of using the graphical query interface in the context of a 3D-ATC environment. The menu item “Flight Level” represents the query of the form “Which flights have the flight level X?”. The user points to this menu item and a pop-up menu containing different flight levels is shown. The user then points to the flight level of interest (e.g., FL320) to select all flights at this flight level.

Figure 4-10. An example of the graphical query interface
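Behind the “Flight Level” menu item, the pre-built query reduces to a filter over the flights currently in the scene. A minimal sketch, assuming each flight record carries a callsign and a flight level; the field names are illustrative.

def flights_at_level(flights, flight_level):
    # flights: iterable of dicts such as {"callsign": "AFR1234", "fl": 320}
    # flight_level: e.g. 320 for FL320
    selected = [f for f in flights if f["fl"] == flight_level]
    for f in selected:
        f["highlighted"] = True       # mark for the subsequent manipulation task
    return selected

# Example: the user points to the 'FL320' item in the pop-up menu.
traffic = [{"callsign": "AFR1234", "fl": 320},
           {"callsign": "DLH56",   "fl": 330},
           {"callsign": "BAW905",  "fl": 320}]
print([f["callsign"] for f in flights_at_level(traffic, 320)])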

This technique can help the user to select multiple objects simultaneously; the ray-casting technique is used only to select the items on the query interface. As mentioned above, the main drawback of the ray-casting technique in the object selection task is its lack of support for selecting multiple objects simultaneously. The graphical menu thus offers a good element to combine with the ray-casting technique.

To sum up, the interaction techniques for the object selection task should satisfy three conditions: selection of a single object, selection of multiple objects and selection of small objects. The combination of the ray-casting technique and the graphical menu, as well as the voice command, satisfy these conditions and could be candidates for the object selection task in a 3D-ATC environment.


4.5.2 Object Placement

Object placement is a special handling that applies to a particular ATC object: the waypoint of a flight route. The modification of a flight route is required for the resolution of conflicts between flights. In general, a flight route is built from a set of waypoints, and modifying a flight route means changing this set of waypoints: adding or deleting one (or several) waypoints, or substituting one waypoint for another. On the visualization side, the modification of a flight route is represented through a change in the shape of the flight route representation. From the interaction point of view, the problem is therefore the placement of waypoints.

Placing an object from a point A to a point B can be decomposed into two phases (Figure 4-11):

1. Moving phase: the user moves the object from the location A to an area in close proximity to the location B.

2. Placement phase: the user makes small and fine movements to place the object precisely at the location B.

Figure 4-11. The two phases of the object placement task
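The two phases can be sketched as a coarse, ray-driven move followed by a fine, hand-driven adjustment; the phase switch (e.g., triggered by a button press on the input device) and the names below are illustrative.

import numpy as np

def place_object(obj_pos, phase, ray_hit_point, hand_delta):
    # phase: "moving" while the user drags the object with the ray (coarse),
    #        "placement" once s/he switches to fine, hand-driven adjustment.
    # ray_hit_point: point where the selection ray currently meets the scene.
    # hand_delta: small displacement of the tracked hand since the last frame.
    if phase == "moving":
        return np.asarray(ray_hit_point, dtype=float)       # follow the ray (coarse move)
    return np.asarray(obj_pos, dtype=float) + np.asarray(hand_delta, dtype=float)  # fine placement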

We consider three groups of interaction techniques for the object placement task: the ray-casting technique, the virtual hand technique and the voice command technique.

4.5.2.1 Ray-casting technique and Virtual Hand technique

The virtual hand provides the most natural way to place the selected object while the ray-casting technique is an easy way to move the selected object from the initial location to the area in close proximity to the target location. In other words, the ray-casting technique is better in the moving phase while the virtual hand is better in the placement phase.

In the moving phase, the user has to move the input device a longer distance with the virtual hand technique than with the ray-casting technique. Figures 4-12 and 4-13 explain this difference; the scenario in the two figures is the same: the user has to move the selected object a distance D from the position A to the position B.

Using the ray-casting technique, the user only has to turn the hand (and thus the hand-held input device) by an angle α. A very small displacement d occurs because the position of the hand changes slightly while turning; in this case, d approaches 0.


Figure 4-12. Using the ray-casting technique for the moving phase

Using the virtual hand technique, the user moves the hand-held input device (e.g., a data glove) to move the hand-shaped cursor. In order to move the selected object a distance D from the point A to the point B, the user has to move the hand-held input device a distance d; d is usually smaller than D in order to reduce the hand movement, but it is much larger than in the case of the ray-casting technique.

Figure 4-13. Using the virtual hand technique for the moving phase
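A small worked example, with illustrative distances, makes this difference concrete: if the object lies in a plane at a distance L from the hand, a ray rotation of α moves the selected object by roughly D ≈ L · tan(α), i.e., α = arctan(D / L). Moving an object by D = 0.5 m in a plane L = 2 m away therefore requires turning the hand by only α = arctan(0.5 / 2) ≈ 14°, whereas with the virtual hand technique the hand displacement d is of the same order as D, reduced only by the chosen scale factor.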

In the placement phase, it is more difficult to perform the small and fine movements needed to place the object precisely at the target location with the ray-casting technique than with the virtual hand technique. Figures 4-14 and 4-15 explain this difference; the scenario in the two figures is the same: the user has to place the selected object precisely from the point B to the point C.


Figure 4-14. Using the ray-casting technique for the placement phase

With the ray-casting technique, the required turning angle α is very small, and it is difficult to control the hand precisely enough to obtain such a small value of α.

Figure 4-15. Using the virtual hand technique for the placement phase

By using the virtual hand technique, it is easy to move the hand-held input device a small distance d (even a very small distance d) to place the selected object from the point B to C.

To summarize, the ray-casting technique seems better in the moving phase but leads to some difficulties in the placement phase. On the contrary, the virtual hand technique requires the user to move her/his hand for a longer moving distance but is better in the placement phase. As a result, these two techniques could be considered as good candidates for the object placement task.

4.5.2.2 Voice command

Voice could be the quickest way to move and place an object from one location to another, provided that the user has preliminary information (e.g., the identifier) about the target location. In such a case, the user issues a voice command with the information of the new location to move the object there. However, in many cases the user does not know the target location in advance and has to find it; voice commands do not seem suitable for these cases.

To sum up, both the ray-casting technique and the virtual hand technique provide a direct and interactive way for the placement of objects. They offer a quite natural way to move objects and could be appropriate candidates for the object placement task.

4.5.3 Object Modification

The object modification task is performed after the object selection task. All selectable objects - flight representation, sector, waypoint and route segment - are subjected to the object modification. The object modification helps to change attributes of an object in order to draw attention to this object or to facilitate the observation of a situation. The attributes of ATC objects’ representation can be the color, the size, the shape, the transparency etc.

Examples of the object modification task in a 3D-ATC environment could be the following: change the colour of a sector or of a group of sectors, change the colour of a waypoint or of a route segment, change the colour of a group of waypoints and route segments to highlight the flight route of a flight, etc.

In general, there are several options for each attribute. Before modifying an attribute, all options of this attribute should be known by the user. The user can be aware of the attributes and their options explicitly thanks to the graphical representation (e.g. a graphical menu system) or implicitly through predefined voice commands.

We consider two groups of interaction techniques for the object modification task: the combination of the ray-casting technique and the graphical menu, and the voice command technique.

4.5.3.1 Ray-casting technique and Graphical menu

The graphical menu presents all attributes and options of an object (or a group of objects) and the ray-casting technique can be used to select an item (an attribute or an option) on the menu.

Figure 4-16 illustrates the combination of the graphical menu and the ray-casting technique for the object modification task. The selected object is highlighted (in yellow). A graphical menu pops up to show all attributes of the selected object (colour, size, transparency). When the user points to an attribute, another pop-up menu (or palette) containing all options of this attribute is shown. The user points to an option to perform the modification.


Figure 4-16. An example of the combination of the graphical menu and the ray-casting technique for the object modification task

4.5.3.2 Voice command

Voice commands can be used for changing the attributes of the selected objects. However, each attribute usually has several options (e.g., different colours, different sizes), and it is not advisable to define a voice command for every option, because memorizing such a set of commands could be a problem.

To sum up, the interaction techniques for the object modification task should take into account the fact that an object has several attributes and options, and that it is better if the user can visually inspect these attributes and options before performing a modification. The combination of the ray-casting technique and the graphical menu satisfies these conditions and could be suitable for the object modification task in a 3D-ATC environment.

4.5.4 Informative query

Similar to the object modification task, all selected objects - flight representation, sector, waypoint and route segment - are subjected to the informative query task (in other words, the request for information). However, while the object modification task is applied to the perceptual information of objects, the informative query task on ATC objects in 3D-ATC environment deals with the abstract information of ATC objects.

The abstract information may concern one or several objects. Regarding the informative query on a single object, the information of an aircraft, such as its category or its minimum and maximum speed, or the information of a sector, such as its current load (the number of flights in the sector) or its capacity, are examples of abstract information of ATC objects. Regarding the informative query on multiple objects, information about the relationship (e.g., a potential conflict) between two or more flights is abstract information. All of these can be sources of informative queries.


Similar to the attributes and options of a modifiable object in the object modification task, the queries should be known by the user. The user can be aware of the provided queries explicitly thanks to the graphical representation (e.g. a graphical menu system) or implicitly thanks to predefined voice commands.

Similar to the analysis performed with the object modification task, the combination of ray-casting technique and graphical menu is the appropriate interaction technique for the informative query task.

Figure 4-17 illustrates the combination of the graphical menu and the ray-casting technique for the informative query task. The selected object is highlighted (in yellow). A graphical menu is popped up to show all queries of the selected object (e.g. information of the aircraft, trajectory of the flight etc.). When the user points to a query and triggers a command (e.g., a wand button press), the result of the query is displayed.

Figure 4-17. An example of the combination of the graphical menu and the ray-casting technique for the informative query task

4.5.5 Summary

From the analysis performed above, we can recapitulate three main suggestions:

1. The combination of the ray-casting technique and the graphical menu, as well as the voice command, seem suitable for the object selection task.

2. The combination of the ray-casting technique and the graphical menu seems appropriate for the object modification and informative query tasks.

3. Both the ray-casting technique and the virtual hand technique seem suitable for the object placement task.

Therefore, four interaction techniques can be used for the selection and manipulation tasks: the ray-casting technique, the virtual hand technique, the combination of the ray-casting technique and the graphical menu, and the voice command. Figure 4-18 shows the icons of these techniques. Figure 4-19 shows the icons of the four main selection and manipulation tasks: the object selection task, the object placement task, the object modification task and the informative query task. These icons will be used later in the thesis to represent the combinations of these techniques for the selection and manipulation tasks in a 3D-ATC environment.


Figure 4-18. Proposed icons of the selection and manipulation techniques

Figure 4-19. Proposed icons of the selection and manipulation tasks

Derived from the analysis, four combinations of selection and manipulation techniques can be taken into consideration for the selection and manipulation task

1. Combination A (Figure 4-20): using the ray-casting technique for the object selection and object placement tasks, combined with the graphical menu for the object modification and informative query tasks. This combination provides the user with a seamless interaction, i.e., there is no change in the way to interact with the objects (in other words, in the interaction technique).

Figure 4-20. Combination A


2. Combination B (Figure 4-21): using the ray-casting technique for the object selection task, combined with the graphical menu for the object modification and informative query tasks, and using the virtual hand technique for object placement. This combination can be interpreted through the HOMER technique, a combination of ray-casting for selection and the virtual hand for manipulation.

Figure 4-21. Combination B

3. Combination C (Figure 4-22): Using voice command for the object selection task, using ray-casting technique for the object placement task, using the combination of ray-casting technique and graphical menu for object modification and informative query task.

Figure 4-22. Combination C

4. Combination D (Figure 4-23): using the voice command for the object selection task, the virtual hand technique for the object placement task, and the combination of the ray-casting technique and the graphical menu for the object modification and informative query tasks.


Figure 4-23. Combination D

4.6 Interaction tasks with the View: Navigation

We consider navigation as the interaction task performed on the View. Navigation helps the user to acquire situation awareness of the environment.

Exploration and search are considered for navigation in a 3D-ATC environment. Similar to the object rotation task, the manoeuvring task is not analysed because it concerns very detailed views of one or several objects; it is mostly useful for the design of automobiles or aircraft, which does not seem to be the case in the context of a 3D-ATC environment.

Besides exploring to build an overall view of the 3D environment, the user may want to find and reach a target, e.g., a waypoint or a flight representation, as readily as possible.

4.6.1 Exploration task

The exploration task in a 3D-ATC environment helps the user to change from the current view to another view, e.g., from the global view to the detailed view and vice versa. The exploration is performed thanks to the travel techniques.

The physical movement metaphor requires a large space for physical moves: to explore the 3D environment, the user has to move in physical space, which can make the travel time-consuming and tiring. This kind of travel is good for entertainment, where physical feedback is more important. We do not investigate physical feedback aspects for a 3D-ATC environment in this study; therefore, we will not go into the details of physical movement.

The virtual movement metaphors - Scene-In-Hand, Camera-In-Hand, Flying Vehicle, Teleportation and World-In-Miniature - will all be considered for the exploration task, consecutively, in Sections 4.6.1.1 to 4.6.1.5.


Figure 4-24. The interaction techniques considered in the analysis of the exploration task

The support for the user to change from the global view to the detailed view (and vice versa) and the degree of disorientation of the user when using the travel techniques for the exploration task are the main issues which have to be considered when choosing the appropriate travel technique.

4.6.1.1 Scene-In-Hand

With the Scene-In-Hand technique, the user holds the scene in her/his hand and can easily change her/his viewpoint of the scene by moving or rotating the hand-held input device. The Scene-In-Hand technique changes the position and orientation of the scene with every hand movement; the scene is only translated and rotated. The user therefore always has the global view of the scene but cannot change from the global view to a detailed view.

In addition, the scene moves all the time when using this travel technique. This can easily cause disorientation during travel, because slight changes in the position or orientation of the hand-held input device lead to immediate changes in the position and orientation of the scene. Considerable effort is also required to select and manipulate objects in a moving scene.

Therefore, this travel technique is not suitable for navigation in a 3D-ATC environment.

4.6.1.2 Camera-In-Hand

The Camera-In-Hand technique draws the scene from the viewpoint placed on the hand-held input device. The user can only observe the scene from a given viewpoint.

The user has to build a mental picture of the global view of the scene. As a result, slight changes in the position and orientation of the hand-held input device produce a new viewpoint, and the user has to make considerable effort to understand the new view drawn from the new location of the device. With this technique, the user has to concentrate on navigation instead of performing selection and manipulation tasks.

Similar to the Scene-In-Hand technique, the scene also moves all the time when using the Camera-In-Hand technique. As explained above, this leads to disorientation during travel and requires considerable effort to select and manipulate objects in the moving scene.

Therefore, this travel technique is not suitable for navigation in a 3D-ATC environment.

4.6.1.3 Flying Vehicle

The Flying Vehicle technique offers a natural way of travelling in a 3D environment: the user steers herself/himself through the 3D environment in the same way that s/he would control the wheel of a spaceship navigating in 3D space.


The user can easily change from the global view to the detailed view and vice versa. The movement is on demand, i.e. it occurs only when the user presses the control button. Therefore, this travel technique does not affect the selection and manipulation tasks.
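As an illustration of this on-demand movement, the following is a minimal sketch of a Flying Vehicle travel update, assuming the viewpoint advances along the wand's pointing direction only while a control button is held; the speed, time step and coordinates are illustrative values, not taken from this study.

```python
import numpy as np

def fly_step(camera_pos, forward_dir, button_pressed, speed=50.0, dt=0.05):
    """One update of a Flying Vehicle travel step.

    Movement is on-demand: the viewpoint advances along the wand's pointing
    direction only while the control button is pressed, so the scene stays
    still (and selection is unaffected) the rest of the time.
    """
    if not button_pressed:
        return camera_pos
    direction = forward_dir / np.linalg.norm(forward_dir)
    return camera_pos + direction * speed * dt

# Example: hold the button for three frames while pointing along +Y.
pos = np.array([0.0, 0.0, 10.0])
for _ in range(3):
    pos = fly_step(pos, np.array([0.0, 1.0, 0.0]), button_pressed=True)
print(pos)  # the viewpoint has advanced 7.5 units along +Y
```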

However, the user may get lost while performing the exploration task with this technique because of the lack of a view of the context, i.e. his/her current position in the 3D environment.

This technique could be used for the exploration task in a 3D-ATC environment if some cues are added to help the user remain aware of his/her position in the 3D environment while navigating.

4.6.1.4 Teleportation

Teleportation helps the user get to a position quickly and is easy to use. The user always has the global view of the scene thanks to a map, and s/he can go to any location in the scene by pointing at its corresponding position on the map. However, it may also cause disorientation because of the abrupt translation [Bowman 1998a]: the user is placed at the new location right after pointing.

In addition, because of the small scale of the map, a location pointed at on the map usually differs from the intended location in the scene. In many cases, the user cannot reach the desired position accurately and needs some free movement to get there. However, this technique does not provide this kind of free moving control.

Therefore, this technique could be used for the exploration task in a 3D-ATC environment if a smooth translation is added to the movement from one location to another and if a free moving control is provided.
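A minimal sketch of such a smooth translation is given below, assuming a simple linear interpolation from the current position to the position picked on the map over a fixed duration; the duration and time step are illustrative assumptions.

```python
import numpy as np

def smooth_teleport(start, target, duration=1.0, dt=0.05):
    """Yield intermediate viewpoint positions for a smooth teleportation.

    Instead of jumping instantly to the pointed location, the viewpoint is
    interpolated towards it so the user can keep track of the travelled path.
    """
    steps = max(1, int(duration / dt))
    for i in range(1, steps + 1):
        t = i / steps                      # interpolation factor in (0, 1]
        yield (1.0 - t) * np.asarray(start, float) + t * np.asarray(target, float)

path = list(smooth_teleport([0.0, 0.0, 0.0], [100.0, 40.0, 10.0]))
print(path[0])    # first small step away from the start
print(path[-1])   # final step lands exactly on the target
```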

4.6.1.5 World-In-Miniature

The World-In-Miniature technique works in a similar way to the Teleportation technique. However, the map of the Teleportation technique is replaced by a miniature model of the 3D scene, and a smooth translation to the target is added to help overcome the disorientation problem.

However, similar to Teleportation, the World-In-Miniature does not provide free and direct movement within 3D space as the Flying Vehicle does.

Therefore, this technique could be used for the exploration task in a 3D-ATC environment if a free moving control is added.

4.6.2 Search task

The search task in a 3D-ATC environment helps the user to find an ATC object (e.g. a waypoint, a flight representation). The search task can be performed thanks to the wayfinding aids.


Figure 4-25. The interaction techniques for the analysis on the search task (Map, Artificial Landmark, Trail)

The wayfinding aids - map, artificial landmarks and trails - will all be considered for the search task in Sections 4.6.2.1 and 4.6.2.2.

The wayfinding aids support the user in finding the desired target and help to reduce disorientation.

4.6.2.1 Map

In general, the map shows the global view of the scene. By using the map, the user is always aware of her/his position in the 3D environment through the You-Are-Here marker on the map. Therefore, the map is ideal for reducing the user's disorientation in the 3D environment.

The map shows all the objects of the 3D environment at a small scale. This is helpful for the search task because the user can easily find a target on the map. The map can be either 2D or 3D.

Therefore, the map is appropriate for the search task and can be used for the navigation in a 3D-ATC environment. In fact, the use of maps has shown many advantages in the Teleportation and the World-In-Miniature techniques. The 2D map is used in the Teleportation technique and the 3D map (the 3D miniature model) is used in the World-In-Miniature technique. The maps seem to reduce the disorientation in the exploration task and also support the search task with the Teleportation and the World-In-Miniature.

The only drawback of these two techniques is the lack of support for free and direct movement within 3D space, as offered by the Flying Vehicle. A combination of the Flying Vehicle technique and the map can overcome this drawback and offer an effective way of navigating in the 3D-ATC environment.

4.6.2.2 Artificial landmarks and Trails

Artificial landmarks can be useful for the search task, e.g. markers added to mark visited areas. Trails can help the user, for example, to return to a correct view when lost in the 3D environment. These wayfinding aids can be combined with the map in order to enhance the search task and reduce the user's disorientation.

4.6.3 Summary

The analysis of the techniques for navigation in a 3D-ATC environment reveals three candidate interaction techniques: the combination of the Flying Vehicle technique and wayfinding aids (map, trail, artificial landmarks), Teleportation and World-In-Miniature.

The Teleportation and World-In-Miniature make use of the map as wayfinding aids.

The combination of the Flying Vehicle technique and wayfinding aids is better than Teleportation because it offers a way to handle the scene freely and directly.


To sum up, two good candidates for the navigation are the combination of Flying Vehicle and wayfinding aids (map, trail, artificial landmark) and the World-In-Miniature technique.

4.7 Conclusion

This chapter introduces an analysis of the interaction tasks that can take place in a 3D-ATC environment and of the 3D interaction techniques that can be used for these tasks. The traffic within sectors is chosen as the context of the analysis.

Four components - Object, View, Interaction Task and Interaction Technique - were selected for the analysis. Selection and manipulation are considered as the interaction tasks on the Object, while navigation is the interaction task on the View.

This analysis also tries to identify the interaction techniques appropriate for each interaction task in a 3D-ATC environment.

Based on the analysis of the selection and manipulation tasks, four basic selection and manipulation techniques usable in the context of a 3D-ATC environment were identified: the ray-casting technique, the virtual hand technique, the combination of a graphical menu and the ray-casting technique, and voice command. From these techniques, four combinations were proposed for selection and manipulation in a 3D-ATC environment.

Based on the analysis of the navigation task, two techniques were proposed for navigation in a 3D-ATC environment: the combination of the Flying Vehicle technique and wayfinding aids (map, trail, artificial landmarks) and the World-In-Miniature technique.

The next step is to evaluate the usability of these techniques to figure out the possible interaction issues and to finalize the most appropriate interaction techniques that can be used for a 3D-ATC environment.

The ray-casting technique was chosen as the first interaction technique for the usability evaluation. One of the usability problems revealed through this evaluation is the problem of interaction with occluded objects. Chapter 5 introduces this problem in detail, together with our solution to overcome it.


Chapter 5

Selection-By-Volume Approach & New Interaction Techniques


Chapter 5 - Selection-By-Volume Approach & New Interaction Techniques

5.1 Introduction

The analysis in Chapter 4 proposes different combinations for selection, manipulation and navigation. As also mentioned in Chapter 4, a usability evaluation of the proposed interaction techniques and combinations should be performed as the next step of this study.

The ray-casting technique was chosen as the first interaction technique for the usability evaluation. Training sessions organized with former operational controllers helped to discover the problem of interaction with occluded objects when using the ray-casting technique (in short, the problem of occlusion).

This chapter presents two main subjects: the problem of occlusion and the solutions to help the user to overcome this usability problem.

First, this chapter presents the training sessions for air traffic controllers and the problem of occlusion in Section 5.2.

Second, this chapter presents the Selection-By-Volume approach, an approach for interaction with three-dimensional (3D) objects through a combination of a geometric shape and a two-dimensional (2D) menu system. The Selection-By-Volume approach was proposed to help overcome the difficulties caused by the problem of occlusion. The main topics relating to the Selection-By-Volume approach are presented from Section 5.3 to Section 5.8. Section 5.3 presents the related work concerning the use of a geometric shape for the object selection task. Section 5.4 describes the main features of the Selection-By-Volume approach. Sections 5.5 and 5.6 introduce two new interaction techniques, the Transparent Sphere and the Transparent Cylinder, derived from this approach. Section 5.7 presents a usability evaluation of the Selection-By-Volume approach using the cognitive walkthrough method and Section 5.8 discusses the main issues of the Selection-By-Volume approach.

5.2 The usability evaluation on the ray-casting technique

5.2.1 Training sessions

Some training sessions with former operational controllers were organized in order to get them acquainted with the Wand and the ray-casting technique. These sessions were not quantitatively measured, but we could still collect feedback and controllers' opinions and discuss potential interaction problems.

For this training, the available equipment was used: a BARCO BARON 900 projection system (136 cm x 102 cm, 970x720 resolution at 110 Hz) displaying stereoscopic 3D scenes.


Figure 5-1. The green rectangle is occluded

Controllers stood in front of the BARCO monitor and were required to freely move simple virtual objects within a scene. The ray-casting consists of a simple straight line emanating from the tracked wand. The user can move the wand to point the ray towards a certain object; the selection of the object is then determined by the collision detection between the ray and the object.

If a user wants to select a target object (for example, the green rectangle depicted in Figure 5-1) that is occluded by other objects, the ray (coloured red in Figure 5-1) will intercept all the objects lying along the same trajectory, and several collisions will be detected. However, only one object at a time can be selected, and an object other than the desired one may end up being selected.
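A minimal sketch of this behaviour is given below, assuming spherical objects and a standard ray-sphere intersection test; the object names, positions and radii are illustrative, but the outcome shows why only the nearest object along the ray can be selected.

```python
import numpy as np

def ray_sphere_hits(origin, direction, spheres):
    """Return (distance, name) for every sphere the ray intersects, nearest first.

    Every object crossed by the ray reports a collision, but a selection
    command can only act on one of them (typically the nearest hit).
    """
    d = direction / np.linalg.norm(direction)
    hits = []
    for name, centre, radius in spheres:
        oc = origin - centre
        b = 2.0 * np.dot(d, oc)
        c = np.dot(oc, oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc >= 0.0:
            t = (-b - np.sqrt(disc)) / 2.0   # nearest intersection along the ray
            if t > 0.0:
                hits.append((float(t), name))
    return sorted(hits)

# Two objects on the same line of sight: the occluder is hit first, so the
# occluded target cannot be reached by a single ray-cast.
spheres = [("occluder", np.array([0.0, 5.0, 0.0]), 1.0),
           ("target",   np.array([0.0, 9.0, 0.0]), 1.0)]
print(ray_sphere_hits(np.zeros(3), np.array([0.0, 1.0, 0.0]), spheres))
# -> [(4.0, 'occluder'), (8.0, 'target')]
```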

This problem emerged many times during the training with the controllers. As a matter of fact, on several occasions controllers wanted to select a specific target but were not able to perform the selection because of the technical limitations described above.

5.2.2 The problem of occlusion

The problem of occlusion originates from the location of the observer's viewpoint relative to the 3D scene. In fact, given a certain viewpoint, some objects may appear either partially or totally occluded to the observer (as in Figure 5-2).


Figure 5-2. The full occlusion (the left figure): the blue box is fully occluded by the yellow one. The partial occlusion (the right figure): the blue box can be partially seen by the user.


Normally, before being able to interact with an occluded object, users have to move themselves or even change the topology of the scene in order to gain visibility of the occluded object. These disadvantages are illustrated in Figure 5-3 and Figure 5-4.

Figure 5-3. An example of selecting objects using the wand as input device and the ray-casting as selection technique. Users try to select the blue triangle, but it is occluded by the yellow rectangle (from the current user's viewpoint). As a result, the users have to move the wand until there is no occlusion between the blue triangle and other objects anymore.

Figure 5-4. An example of selecting objects using the wand as the input device and ray-casting as the selection technique. Users change the scale of the scene (e.g. by zooming) to eliminate the occlusion state between the target object (the blue triangle) and other objects (from the current user's viewpoint).

Figure 5-3 shows an example of selecting objects using the wand as input device and Ray-casting [Bolt 1980] as selection technique. Users try to select the target object (the blue triangle), which is occluded by the yellow rectangle; they have to move the wand to eliminate the occlusion state between the blue triangle and the other objects. Figure 5-4 shows another way to select occluded objects: users change the scale of the scene (e.g. by zooming) to eliminate the occlusion state between the blue triangle and the other objects.

The manoeuvres needed to eliminate the occlusion state (as in Figure 5-3 and Figure 5-4) require several moves and controls, and cause fatigue for the users.

Although several interaction techniques have been proposed, only a few take this problem into consideration, and they do not offer a complete solution to the problem of occlusion. VooDoo Dolls [Pierce 1999a; Pierce 1999b] and Flexible Pointer [Olwal 2003] can deal with occluded objects but are quite complicated because they require two-handed interaction. The ray-casting technique [Bolt 1980] requires many manoeuvres to select a partially occluded object and cannot select the target object at all in the fully occluded case. Likewise, cone casting-based techniques like Spotlight [Liang 1994] and Aperture [Forsberg 1996] necessitate many manoeuvres to reach an occluded target object.


Virtual Hand and other techniques based on the Virtual Hand metaphor [Poupyrev 1996] were not designed to deal with occluded objects and therefore cannot address the problem of occlusion.

Visual solutions or automatic "disocclusion" algorithms [Wloka 1995] can be implemented to overcome the problem of occlusion. For example, a label can be added as a visual support to help to select the occluded objects (as in Figure 5-5).

Figure 5-5. An example of a visual solution for solving the problem of occlusion. The object P2 is occluded in the current viewpoint. A label can be added to P2 in order to indicate the existence of P2 and to facilitate its selection.

However, the dynamic nature of air traffic can still cause occlusion between objects (Figure 5-6), and the "disocclusion" solutions can produce an unnatural perception of the traffic [Tavanti 2004b]. In fact, a moving object that is visible from a given viewpoint at one moment (Figure 5-6(a)) can become occluded as it moves (Figure 5-6(b)).

Figure 5-6. The object P2 changes from the "non-occlusion" state to the "occlusion" state while moving

As a matter of fact, a new method to help the user interact with occluded objects is necessary. In an attempt to facilitate the interaction with occluded objects, we proposed a novel approach for object selection. The basic idea of this approach is to combine a selection volume (in the form of a geometric shape) with a 2D menu system for interaction with 3D objects. We called this approach the Selection-By-Volume approach. Based on this approach, we proposed two interaction techniques: the Transparent Sphere and the Transparent Cylinder.


5.3 Previous work on the use of a selection volume for the object selection task

The idea of using a geometric shape for interaction with objects was exploited in the past through cone-casting based techniques such as the Spotlight technique [Liang 1994] and the Aperture technique [Forsberg 1996], as well as through the "Silk Cursor" technique [Zhai 1994].

The Spotlight technique uses a conic selection volume. All the objects within the conic selection volume may be selected. If there is only one object in the selection volume, the user can trigger a command to select it. If two or more objects fall into the selection volume, the object closest to the centre line of the selection cone is selected; if two objects form the same angle with the centre line, the one closer to the device is selected. The Aperture selection technique is a modification of the Spotlight technique that allows the user to interactively control the spread of the selection volume. By changing the selection volume, users can change which objects are selectable within the volume. With these two techniques, it is difficult to select a single object in a set of closely-spaced objects [Forsberg 1996]. Therefore, the Spotlight and Aperture techniques do not help to overcome the difficulties in selecting occluded objects.
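The disambiguation rule described above can be sketched as follows; this is a simplified illustration in which objects are reduced to centre points, and the apex position, cone angle and coordinates are assumed values rather than parameters taken from the cited techniques.

```python
import numpy as np

def spotlight_pick(apex, axis, half_angle_deg, objects):
    """Pick one object with a Spotlight-style conic selection volume.

    Among the objects inside the cone, choose the one whose centre is closest
    in angle to the cone's centre line; ties are broken by distance to the device.
    """
    axis = axis / np.linalg.norm(axis)
    best = None
    for name, centre in objects:
        v = centre - apex
        dist = np.linalg.norm(v)
        if dist == 0.0:
            continue
        angle = np.degrees(np.arccos(np.clip(np.dot(v / dist, axis), -1.0, 1.0)))
        if angle <= half_angle_deg:
            key = (angle, dist)              # angle first, distance as tie-break
            if best is None or key < best[0]:
                best = (key, name)
    return best[1] if best else None

objects = [("A", np.array([1.0, 10.0, 0.0])),   # ~5.7 degrees off the centre line
           ("B", np.array([3.0, 10.0, 0.0]))]   # ~16.7 degrees off the centre line
print(spotlight_pick(np.zeros(3), np.array([0.0, 1.0, 0.0]), 30.0, objects))  # -> A
```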

Zhai's silk cursor [Zhai 1994] selects objects that fall within a transparent cubic volume. The transparent cubic volume provides visual cues that indicate whether target objects are behind, within or in front of the volume. However, it is still difficult to select a single object in a set of closely-spaced or occluded objects, because users have to decide which part of which object is behind, within or in front of the transparent cubic volume. This complicates the selection task.

The use of a 2D menu system in combination with the selection volume in the Selection-By-Volume approach offers a disambiguation method for selecting a single object in a set of closely-spaced or occluded objects. The following section introduces the main features of the Selection-By-Volume approach.

5.4 Selection-By-Volume Approach

In the real world, we select objects by simply touching or holding them. The "select by touching" idea is inspired by this interaction with the real world and is exploited in Virtual Pointer and Virtual Hand-based interaction techniques, in which selection is performed by touching the target object with a light ray or a 3D cursor.

Would it be possible to select the object of interest without touching it, i.e. to "select without touching"? This idea inspired us to propose the Selection-By-Volume approach. As stated above, its basic principle is to combine a selection volume (in the form of a geometric shape) with a 2D menu system for interaction with 3D objects.

By using the Selection-By-Volume approach, the selection task can be performed in two steps (a minimal sketch of these two steps follows the list below):

• Pre-selection: highlight objects (that include the object of interest) located within a predefined volume (known as the selection volume).


• Item selection: pick up the target object by pointing at the corresponding label (or the menu item) in a menu which displays all the labels of objects falling within the selection volume.
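The following is a minimal sketch of these two steps, assuming objects are reduced to labelled centre points and the selection volume is supplied as a containment test; the labels, positions and radius are illustrative values.

```python
import math

def pre_select(objects, volume_contains, volume_centre):
    """Step 1 (pre-selection): highlight every object inside the selection volume."""
    return [name for name, position in objects if volume_contains(position, volume_centre)]

def item_select(highlighted, chosen_label):
    """Step 2 (item selection): pick the target by its label on the 2D menu."""
    return chosen_label if chosen_label in highlighted else None

objects = [("P01", (0.0, 0.0, 0.0)), ("P02", (1.0, 1.0, 0.0)), ("P03", (9.0, 0.0, 0.0))]
in_sphere = lambda p, c, r=2.0: math.dist(p, c) <= r   # a spherical selection volume

menu = pre_select(objects, in_sphere, (0.0, 0.0, 0.0))
print(menu)                          # -> ['P01', 'P02']  (labels shown on the menu)
print(item_select(menu, "P02"))      # -> 'P02' (picked via its label, even if hidden in 3D)
```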

The next sections discuss two main components of this approach: the selection volume and the 2D menu system.

5.4.1 Selection volume

The most important issue in this approach is to define the selection volume. The selection volume is characterized by its shape, its size and its material.

5.4.1.1 Shape

Basic shapes such as a cone, cylinder, sphere or cube, as well as other special, more complex shapes, could be used to represent the volume. Interaction techniques created from this approach differ from each other in the way the selection volume is defined.

5.4.1.2 Size

The size of the selection volume determines the number of objects that fall within the volume, which in turn determines the number of labels on the menu. In the same 3D scene, a large selection volume could contain more objects than a small one (as in Figure 5-7).

Figure 5-7. With the same scene, a large selection volume could contain more objects than a small one

With a large selection volume, many objects can fall within the volume, so selecting the object of interest takes more time; on the other hand, it is easier to capture the target object. Conversely, a small selection volume may require several manoeuvres to capture the target object, but fewer objects fall within the volume and it takes less time to select the object of interest. Therefore, a reasonable choice of the selection volume size helps to obtain an optimal number of objects within the volume and thus improves the performance of the selection task.


Figure 5-8. With the same selection volume size, more objects could be located within the selection volume in a high-density scene than in a low-density scene.

Nevertheless, the number of objects located within the selection volume also depends on the density of objects within the 3D scene. With the same selection volume size, more objects may fall within the volume in a 3D scene with a high density of objects than in one with a low density of objects (as in Figure 5-8). Reducing the size of the selection volume may support a quicker selection of the target object, because the list of items that has to be scanned in the selection menu will also be shorter.

Intuitively, the size of the selection volume seems to be a function of the density of the objects displayed in the 3D scene.
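One hypothetical way of expressing this functional relation (not taken from the thesis) is to pick the radius of a spherical selection volume so that, for a roughly uniform distribution, a target number of objects falls inside it on average:

```python
import math

def adaptive_radius(num_objects, scene_volume, target_count=5):
    """Selection-sphere radius derived from the object density of the scene.

    density = objects per unit of scene volume; the radius is chosen so that
    (4/3) * pi * r^3 * density == target_count for a uniform distribution.
    """
    density = num_objects / scene_volume
    return ((3.0 * target_count) / (4.0 * math.pi * density)) ** (1.0 / 3.0)

# A denser scene yields a smaller selection volume, a sparser scene a larger one.
print(round(adaptive_radius(num_objects=200, scene_volume=1000.0), 2))  # -> 1.81
print(round(adaptive_radius(num_objects=20,  scene_volume=1000.0), 2))  # -> 3.91
```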

5.4.1.3 Material

The material of selection volume is characterized by the colour and the transparency.

Colour. The colour should be used to make the selection volume easy to see. For example, the use of colour for the border of the selection volume could help the user to distinguish the parts inside and outside the selection volume.

Transparency. A suitable degree of transparency for the selection volume should be considered in order to avoid situations in which the selection volume occludes or reduces the visibility of the objects located within it.

5.4.2 2D Menu System

The menu is one of the key components in the WIMP (Window, Icons, Mouse, Pointer) paradigm of the 2D user interface. In computer science, a menu is a list of options (the menu items) from which the user can make a selection and then perform a desired action. The use of a menu in this case turns the selection of 3D objects into the selection of menu items arranged on a plane, which is easier and helps to avoid ambiguity.

Three important features of menu design are the placement of the menu, the representation of the menu and the way items are selected on the menu [Jacoby 1992].


5.4.2.1 Menu placement

Figure 5-9. Three methods of menu placement: view-referenced menus, world-referenced menus and object-referenced menus

The placement of the menu influences the user's ability to access the menu and the amount of occlusion of the environment (because the menu can occlude objects in the scene) [Bowman 2004]. In general, there are three types of menu placement: the world-referenced menu (the menu is placed at a fixed location in the 3D scene), the view-referenced menu (the menu is always placed at a fixed location relative to the viewpoint) and the object-referenced menu (the menu is attached to an object in the 3D scene) [Kim 2000] (Figure 5-9).
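A simplified sketch of how the three placement strategies could position the menu is shown below; the anchor point, the offset and the reduction of the menu to a single 3D point (ignoring its orientation towards the viewer) are illustrative assumptions.

```python
import numpy as np

def menu_position(placement, camera_pos, camera_forward,
                  anchor_object_pos=None, world_anchor=(0.0, 0.0, 2.0), view_offset=1.5):
    """Return the point where the 2D menu is placed for each placement strategy."""
    if placement == "world":                         # fixed location in the 3D scene
        return np.asarray(world_anchor, float)
    if placement == "view":                          # fixed offset in front of the viewpoint
        f = np.asarray(camera_forward, float)
        return np.asarray(camera_pos, float) + f / np.linalg.norm(f) * view_offset
    if placement == "object":                        # attached to an object in the scene
        return np.asarray(anchor_object_pos, float)
    raise ValueError("unknown placement")

cam, fwd = np.zeros(3), np.array([0.0, 1.0, 0.0])
print(menu_position("view", cam, fwd))                                        # moves with the viewpoint
print(menu_position("object", cam, fwd, anchor_object_pos=[4.0, 4.0, 0.0]))   # follows the object
```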

5.4.2.2 Menu Representation

Orientation and style are among the key features of menu presentation.

Orientation. The extra depth dimension results in several possibilities for positioning the menu in the 3D scene. In fact, menus can be displayed on any XY, XZ or YZ plane, or on any other plane within 3D space. In any case, the menu should be positioned and oriented in such a manner that it is visible and easy to manipulate.

Style. The menu styles commonly used in 2D user interfaces, such as the pop-up menu (a menu that appears temporarily when the user performs a particular action and usually disappears once the selection is made) and the pull-down menu (a special type of pop-up menu that appears directly beneath the selected command), can be used for the menu design in this approach.

Another feature of the menu representation is its colour. In general, the menu colour should make the menu visible and should be designed with regard to the colours of the scene. Colour design is a huge topic in human-computer interaction; a detailed analysis of it is beyond the scope of this thesis.


5.4.2.3 Item selection

Menu placement and menu representation help to visualize the menu before selecting an item on it. In order to pick up the target object, the user has to select its corresponding item on the menu; item selection thus plays an important role in the selection performance of this approach. The problem of occlusion disappears when selecting menu items. The traditional Ray-casting technique is usually used for the item selection.
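A minimal sketch of such ray-based item selection is given below, assuming the menu is a vertical stack of equally sized rows on a plane; the menu geometry, units and label names are illustrative assumptions.

```python
import numpy as np

def pick_menu_item(ray_origin, ray_dir, menu_top, menu_normal, menu_down, item_height, labels):
    """Pick a label on a planar menu by intersecting the ray with the menu plane.

    The hit point's coordinate along `menu_down`, measured from the top of the
    menu, selects the row; there is no occlusion to resolve on the plane.
    """
    d = ray_dir / np.linalg.norm(ray_dir)
    denom = np.dot(menu_normal, d)
    if abs(denom) < 1e-9:
        return None                                   # ray parallel to the menu plane
    t = np.dot(menu_normal, menu_top - ray_origin) / denom
    if t <= 0:
        return None                                   # menu is behind the wand
    hit = ray_origin + t * d
    row = int(np.dot(hit - menu_top, menu_down) // item_height)
    return labels[row] if 0 <= row < len(labels) else None

labels = ["P01", "P02", "P09"]
# Menu plane facing the user at y = 5, top edge at z = 3, rows one unit high.
print(pick_menu_item(np.zeros(3), np.array([0.0, 1.0, 0.3]),
                     np.array([0.0, 5.0, 3.0]), np.array([0.0, -1.0, 0.0]),
                     np.array([0.0, 0.0, -1.0]), 1.0, labels))   # -> P02
```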

5.4.3 Interaction techniques derived from Selection-By-Volume approach

An interaction technique can be created from this approach by choosing the features of the volume (shape, colour, size, etc.) and the features of the menu (style, placement, and the technique for selecting menu items). Detailed choices for each feature are discussed above, and Figure 5-10 summarizes the main features of the menu and the volume. A simple use of the Selection-By-Volume approach is to apply it to existing interaction techniques. As mentioned above, several existing selection techniques have been created from the "selection by touching" principle. The Selection-By-Volume approach departs from this principle and helps to create new interaction techniques. By applying this new idea to the Virtual Hand and Ray-casting techniques, we proposed two new interaction techniques, the Transparent Sphere and the Transparent Cylinder. The Transparent Sphere uses a sphere as the selection volume, while the Transparent Cylinder uses a cylinder.

Figure 5-10. A listing of choices for every feature of an interaction technique created from the Selection-By-Volume approach (volume: shape, size, transparency; menu: placement, style, orientation, colour; item selection).

These two techniques were also the subjects of the evaluation of the Selection-By-Volume approach. The following sections discuss the design of the Transparent Sphere and Transparent Cylinder techniques in detail.


5.5 Transparent Sphere

5.5.1 Description

The Transparent Sphere technique works in a similar way to the Virtual Hand technique. Instead of using a hand-shaped cursor to touch the target object for the object selection task, a sphere volume is used. All objects lying within the sphere volume are highlighted, and the list of all the highlighted objects appears in a menu. In order to select one object, it is sufficient to point at the corresponding label in the menu and then to trigger a command on the input device (e.g., push a Wand button). The menu displays the names of all the objects falling within the sphere volume, including the occluded (hidden) ones.

5.5.2 Design

Figure 5-11. The Transparent Sphere technique: the objects located within the transparent sphere are highlighted by a different colour (yellow)

The sphere volume, defined by its radius R, is transparent so that users can see the objects inside it. However, some cues on the border of the sphere are enabled to make the sphere visible (as in Figure 5-11). Regarding the menu, an upright pop-up menu attached to the sphere is used for displaying the menu items. This is a kind of object-referenced menu which appears after the pre-selection step and disappears once the item selection step is done. The Ray-casting technique is used for the item selection.

The user handles an input device (a tracked wand or a data glove) to control the motion of the sphere in the 3D scene and to trigger the commands.


5.5.3 Selection

Figure 5-12. The object selection task using the Transparent Sphere

Pre-selection. Firstly the user has to explore the 3D environment in order to determine which object s/he wants to select. As the user moves the input device (and consequently the sphere), all the objects located within the sphere volume become highlighted (Figure 5-12(a)).

Item selection. When there is only one highlighted object within the sphere volume, the user can trigger a command to select it directly. If many objects are highlighted within the sphere volume, a menu containing the labels of all highlighted objects is shown. A light ray is automatically displayed to help the user point at a label on the menu (Figure 5-12(b)). As the ray points at a label, that label changes colour. The user can then trigger a command (e.g., click a button) to select the corresponding object. The selected object is moved to the centre of the sphere and is ready for any later manipulation.

Selection of occluded objects

The occluded objects or the hidden objects can all be shown on a 2D menu in which all labels are shown on a plane (Figure 5-13). The menu thus helps to overcome the occlusion problem.


Figure 5-13. In the scene, the ball P09 is occluded by the ball P01. P09 can be selected by pointing to its label on the menu

5.5.4 Manipulation

Moving. The user moves the selected object by moving the input device. A linear function defines the correspondence between the motion of the input device and the motion of the sphere volume. The motion speed is defined in advance based on the size of the 3D scene so that the sphere can reach every location within the scene.
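A minimal sketch of this linear mapping is given below; the gain heuristic (scene extent divided by the comfortable physical range of the wand) and all numeric values are illustrative assumptions rather than the values used in the prototype.

```python
import numpy as np

def move_gain(scene_extent, device_range):
    """Gain of the linear device-to-scene mapping: sweeping the wand across its
    comfortable physical range should cover the whole scene."""
    return scene_extent / device_range

def apply_motion(object_pos, device_delta, gain):
    """Linear mapping: object displacement = gain * device displacement."""
    return np.asarray(object_pos, float) + gain * np.asarray(device_delta, float)

gain = move_gain(scene_extent=200.0, device_range=0.8)            # e.g. 200 m scene, 0.8 m reach
print(apply_motion([10.0, 0.0, 5.0], [0.02, 0.0, -0.01], gain))   # -> [15.  0.  2.5]
```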

Placement. While moving, the user can place the attached object at any location at any moment by triggering a command.

Rotation. The selected object is attached to the input device. As a result, a rotation of input device results in a corresponding rotation of the selected object.

5.5.5 Undo

At every step of the selection and manipulation tasks, the user can undo any operation by triggering a special command (e.g., pushing a dedicated Wand button).

5.6 Transparent Cylinder

5.6.1 Description

The Transparent Cylinder technique was inspired by the Ray-casting technique. Instead of using a light ray to touch the target object for the object selection task, a cylinder volume is used. Similar to the Transparent Sphere, all the objects lying within the cylinder volume are highlighted and the list of all the highlighted objects appears in a menu. In order to select one object, it is sufficient to point at the corresponding label in the menu and then to trigger a command on the input device (e.g., push a Wand button). The menu displays the names of all the objects falling within the cylinder volume, including the occluded (hidden) ones.


5.6.2 Design

Figure 5-14. The Transparent Cylinder technique: the objects located within the cylinder are highlighted by a different colour (yellow)

The cylinder volume (defined by the cylinder length L and the cylinder radius R) is transparent so that users can see the objects inside it. Also, some cues on the border of the cylinder are enabled to make it visible (as in Figure 5-14), and an upright pop-up menu is used for showing the menu items. This is a kind of object-referenced menu which appears after the pre-selection step and disappears once the item selection step is done. The Ray-casting technique is used for item selection. The menu is made available at point A.

The user handles an input device (a tracked wand or a data glove) to control the motion of the cylinder in the 3D scene and to trigger the commands.

5.6.3 Selection

Pre-selection. The user uses the cylinder volume to explore the 3D scene in order to determine which object s/he wants to select. All the objects located within the cylinder volume become highlighted. The user moves the input device to control the motion of the cylinder. The cylinder volume, which stretches out in depth, helps to scan every object in depth without moving the input device much. Therefore, there are usually many objects within the cylinder.
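A minimal sketch of this pre-selection test is given below, assuming the cylinder starts at the wand position, extends along the pointing direction for length L with radius R, and that objects are reduced to labelled centre points; all coordinates are illustrative values.

```python
import numpy as np

def cylinder_pre_select(objects, base, axis_dir, length, radius):
    """Highlight the objects whose centres lie inside the cylinder (length L, radius R)."""
    axis = np.asarray(axis_dir, float)
    axis = axis / np.linalg.norm(axis)
    highlighted = []
    for label, pos in objects:
        v = np.asarray(pos, float) - np.asarray(base, float)
        along = np.dot(v, axis)                       # position along the cylinder axis
        radial = np.linalg.norm(v - along * axis)     # distance from the axis
        if 0.0 <= along <= length and radial <= radius:
            highlighted.append(label)
    return highlighted

# The cylinder stretches out in depth (here along +y), so it captures objects
# at several depths without the input device having to move much.
objects = [("P2", (0.3, 2.0, 0.0)), ("P7", (-0.4, 8.0, 0.3)), ("P5", (3.0, 5.0, 0.0))]
print(cylinder_pre_select(objects, base=(0, 0, 0), axis_dir=(0, 1, 0), length=10.0, radius=1.0))
# -> ['P2', 'P7']  — these labels populate the pop-up menu
```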

Item selection. When there is only one highlighted object within the cylinder volume, the user can trigger a command to select it directly. If many objects are highlighted within the cylinder volume, a menu containing the labels of all highlighted objects is shown. A light ray is automatically displayed to help the user point at the labels on the menu. As the ray points at a label, that label changes colour. The user can then trigger a command (e.g., push a Wand button) to select the corresponding object. The selected object is moved to the axis of the cylinder and is ready for later manipulation tasks.


Figure 5-15. The object selection task using the Transparent Cylinder

Selection of occluded objects

The occluded or hidden objects can all be shown on a 2D menu in which all labels are displayed on a plane (Figure 5-16). The menu thus helps to overcome the problem of occlusion.

Figure 5-16. In the scene, the ball P01 is occluded by the ball P05. P01 can be selected by pointing to its label on the menu

5.6.4 Manipulation

Moving. The user moves the selected object by moving the input device. A linear function is used to define the correspondence between the motion of the input device and the motion of the cylinder volume.

Placement. While moving, the user can place the moved object at any location at any moment by triggering a command.

Rotation. The selected object is attached to the input device. As a result, a rotation of input device results in a corresponding rotation of the selected object.


5.6.5 Undo

In every step of the selection and manipulation tasks, the user can undo any operation by triggering a command (e.g., push on a special Wand button).

5.7 A Usability Evaluation of the Selection-By-Volume Approach Using the Cognitive Walkthrough Method

In order to uncover user-related implications of the interaction techniques derived from the Selection-By-Volume approach (prior to its full implementation), a low-fidelity prototype was created. This prototype consists of paper-based system specifications. It was tested with the Cognitive Walkthrough [Polson 1992]. This task-oriented methodology allows the analyst to survey the system's functionalities. Once a specific task is chosen (in our case, the selection of objects), the sequence of actions required is analyzed and the difficulties that a user may encounter are identified. The following sections describe the analysis performed and its results.

5.7.1 The Cognitive Walkthrough

The Cognitive Walkthrough uses "a detailed procedure to simulate user's problem solving at each step through the dialogue, checking if the simulated user's goals and memory content can be assumed to lead to the next correct action" [Nielsen 1990]. In other words, the evaluators try to follow the reasoning that a hypothetical user would follow in order to accomplish a task. Once the task is chosen, the analysts examine each step required to perform it, taking into account three elements [Rieman 1995]:

1. the accessibility of the correct control;
2. the quality of the match between the control's label and the goal;
3. the feedback provided after the action.

These elements are related to the users through pertinent questions. For every action taken by the user, the system responses are described.

This type of evaluation does not include end-users in the evaluation process (their role is "played" by the evaluators) and it does not provide quantitative results. Yet, the method has some advantages: for instance, it is fast and economical to carry out, and it allows usability problems to be identified prior to the implementation of a system, with only the system specifications as a basis. It supports the design process, since the risk of implementing deficient and unusable systems is reduced.

5.7.2 Object Selection

The general context of the task to be analyzed in this study is the selection of 3D objects with an interaction technique derived from the Selection-By-Volume approach. The Transparent Cylinder technique was selected as the subject for evaluation.

5.7.3 Scenario and Analysis

Using the Wand, the user has to explore the 3D scene in which several objects are represented. Once the desired object is identified, the user has to "freeze" the highlight within the cylinder by pressing a button on the Wand.


This action opens a menu in which the objects' names are listed. With the ray, the controller can point at one item of the list and click a Wand button so that the selection is finally achieved.

To summarize, the tasks that the user has to perform in order to select an object are exploration, freezing, and selection of an item in the menu. Exploration and freezing belong to the pre-selection step; selecting an item in the menu is the item selection step. All these tasks are analyzed in the following sections.

5.7.3.1 Exploration

Question 1: Is the correct action evident to the user?

Answer: At first, yes: the only way to interact with the 3D scene is the Wand, so the choice of manipulating this device is constrained.

Question 2: Is the association between the correct action and the effect to be achieved clear?

Answer: Firstly the user has to explore the 3D scene, in order to determine which object s/he wants to select.

As the user moves the Wand (and consequently the ray), all the objects located within the cylinder volume become highlighted. Even if the user has no experience with this 3D interaction technique, the "highlight" system response is widely used in 2D interfaces: in 2D environments, when the mouse pointer passes over an object, which becomes highlighted, it usually means "ready to be clicked". However, in those cases the highlight is unique and single; there is a one-to-one correspondence between the pointer and the target object.

• Flaw n.1: The Selection-By-Volume metaphor used with the Transparent Cylinder should be learned, as the multiple highlights are not common.

• Flaw n.2: The radius R of the cylinder is arbitrarily chosen.

Question 3: If the correct action is performed, does the system provide an appropriate feedback?

Answer: The exploration task provides a noticeable feedback thanks to the highlight.

5.7.3.2 Freezing the highlighted area of objects

Question 1: Is the correct action evident to the user?

Answer: By experience, the user may speculate that a "click" is necessary. Usually, in 2D systems, when an object is highlighted it can be selected (opened, dragged, etc.) by clicking the mouse button. As a matter of fact, according to the system specifications, the user has to freeze the highlighted area by pressing a button located on the Wand.

• Flaw n.3: The system does not provide sufficient indications in this sense. The users should be informed that the highlighted area has to be “frozen” before the selection task.

Question 2: Is the association between the correct action and the effect to be achieved clear?

Answer: There is no evidence of what correct action has to be done to perform the freezing.

• Flaw n.4: The system does not provide sufficient indications in this sense. No instructions are provided regarding the “freezing” action.


Question 3: If the correct action is performed, does the system provide an appropriate feedback?

Answer: If the correct button is pressed, then the highlight is set within a small area and a menu appears. This menu contains all the names of the highlighted objects displayed in the scene (as in Figure 5-14). The user can associate the content of the list with the objects. Once the highlight is set and the menu appears, the user may realize that the highlighted area does not contain the target object; the "undo" option is useful in this case.

5.7.3.3 Select an object in the menu

Question 1: Is the correct action evident to the user?

Answer: The choice is constrained. Once the set of highlighted objects is frozen, the ray can be used to point at the objects' labels in the menu. As the ray points at a label, that label changes its colour.

Question 2: Is the association between the correct action and the effect to be achieved clear?

Answer: By experience, the users can understand that a button has to be pressed in order to select the desired item, but no explicit information is provided to them. Another problem that the users may encounter is that an object which is totally occluded by the others is visible in the menu but not within the 3D scene (as in Figure 5-17, where the object P7 is available in the menu but not in the scene, because it is occluded). The menu embedded in the Transparent Cylinder was indeed designed to solve the problem of ambiguous selection. Yet, the information displayed in the menu might appear inconsistent with the visible content of the scene.

Figure 5-17. The hidden object P7 is displayed in the menu

• Flaw n.5: The system does not provide instructions on which action is required to perform the selection. Moreover it is not clear which button located in the Wand has to be pressed.

• Flaw n.6: An appropriate way to visualize the occluded object ought to be found, so that consistency and coherency are maintained between the items in the menu and the objects displayed in the scene.


Question 3: If the correct action is performed, does the system provide an appropriate feedback?

Answer: When the button is pressed, the menu disappears; the object is selected and can be further manipulated by the user.

5.7.4 Problems and Solutions

Through the Transparent Cylinder technique, the analysis revealed a number of problems with the interaction techniques derived from the Selection-By-Volume approach. The flaws and possible solutions are discussed below.

5.7.4.1 Pre-selection: Exploration

• Flaw n.1: The Selection-By-Volume metaphor used with the Transparent Cylinder should be learned.

• Flaw n.2: The radius R of the cylinder is arbitrarily chosen.

Solutions. The Transparent Cylinder is quite new and uses a different interaction approach compared to standard 2D interaction techniques. It is necessary that users receive some training to become acquainted with this metaphor.

Regarding the radius R, some issues were discussed above concerning the choice of the selection volume size. It is assumed (in the paper prototype) that the radius R is arbitrarily chosen. As previously stated, the size of the selection volume seems to be a function of the density of the objects displayed in the 3D scene; this functional relation can be determined empirically. Another option is the implementation of an adaptable or adjustable R, so that users can set the size of the highlighted area according to their preferences. With an "adaptable R", the volume size can be set based on the density of a scene and then fixed during the interaction with the 3D scene. With an "adjustable R", the user can change the size of the volume during the interaction with the 3D scene.

5.7.4.2 Freezing the highlighted area of objects

• Flaw n.3: The users should be informed that the highlighted area has to be frozen before the selection.

• Flaw n.4: No instructions are provided regarding the “freezing” action.

Solutions. Users have to receive information regarding the necessity of the "freezing" action. Ideas to consider are the implementation of a textual instruction within the interface providing guidance in this respect, or simply supporting user training.

5.7.4.3 Item selection

• Flaw n.5: The system does not provide instructions on which action is required to perform the selection. It is unclear which button located in the Wand has to be pressed.

• Flaw n.6: An appropriate way to visualize the occluded object ought to be found, so that consistency and coherency are maintained between the items listed in the menu and the objects displayed in the scene.


Solutions. Users have to receive adequate instructions regarding the selection. Even if the "click" of a button seems an intuitive action, it should be made evident through the interface. Furthermore, which button should be pressed to select an item? Intuitively, the same button used for the "freezing the highlighted area" task can be used. Once a unique relation is set between a device (the button) and its function (selection), the relation should remain consistent during the whole activity.

A few more considerations are necessary at this stage of the analysis. First, the order in which the objects' names appear in the menu: in other words, how should the objects displayed in the scene be mapped to their names displayed in the menu? The simplest solution is to display the names in alphabetical order. Alternatively, the mapping could reflect the geographical distribution within the scene (e.g., the object which is farthest away from the user could be listed first in the menu).

Second, flaw number 6 identifies a severe inconsistency: an occluded object is detected by the system and its label is visualized in the menu, while the object itself cannot be seen. The interface should visually signal the presence of occluded objects, for example by putting a small flag on top of them or simply by considering the implementation of semi-transparent objects (which allow the objects behind them to be seen).

5.8 Discussion

5.8.1 Advantages

Solution for the problem of occlusion. The main advantage of the interaction techniques created from the Selection-By-Volume approach is that they provide a solution for the problem of occlusion, as confirmed by the experimental results with the Transparent Sphere and the Transparent Cylinder presented later in Chapter 6.

Moving objects. Thanks to the selection volume, the time that a target object stays inside the volume can be long enough to perform the selection command. Therefore, with the selection volume, users can catch moving objects more easily than with the ray-casting technique or the virtual hand.

5.8.2 Disadvantages

Two-step selection. The selection techniques created from this approach are indirect techniques: the selection of an object is performed in two steps and through the menu. Direct selection techniques such as Ray-casting and the Virtual Hand can be more efficient in selecting non-occluded objects.

5.8.3 Problems to Consider in the Design of Interaction Techniques Using the Selection-By-Volume Approach

The size of the selection volume. As mentioned in the analysis using the Cognitive Walkthrough method, the size of the selection volume should be considered with regard to the density of the objects displayed in the 3D scene. Also, an adaptable or adjustable size could be a good way to deal with the choice of the selection volume size.


The high number of objects in the selection volume. If there are numerous objects in the environment, the menu could become too long for the user to quickly identify the target object, and the menu itself could occlude the objects of the scene. This issue should be taken into account when choosing the size of the selection volume and designing the 2D menu system.

The order of items on the menu. As also stated in the usability evaluation using the Cognitive Walkthrough method, alphabetical order or the geographical distribution can be considered as solutions for ordering the items on the menu.

5.9 Conclusion

This chapter presents the problem of interaction with occluded objects when using the ray-casting technique. This problem was discovered thanks to training sessions with controllers on the use of the ray-casting technique for interaction with 3D objects.

The Selection-By-Volume approach was proposed to solve this problem. This approach makes use of the combination of a geometric shape and a 2D menu system for interaction with 3D objects. Selection-By-Volume can be considered as a new approach for designing interaction techniques in 3D environments. The interaction techniques created from this approach differ in the choice of the selection volume, the 2D menu system and the way of interacting with the menu. Besides a solution for the problem of occlusion, this approach also introduces the principles for creating a new class of interaction techniques. The Transparent Sphere and the Transparent Cylinder, two interaction techniques created from this approach, are shown to be able to solve the problem of interaction with occluded objects and are the subjects of an empirical evaluation presented in Chapter 6.


Chapter 6

EVALUATION


Chapter 6 - Evaluation

6.1 Objectives

The objective of the evaluation was to compare users' performance with three different interaction techniques: the Transparent Cylinder technique, the Transparent Sphere technique and the ray-casting technique.

The subjects who took part in the experiment were engaged in selection and placement tasks.

Since the main hypothesis is that our proposal (the Selection-By-Volume approach) should support better performance (in terms of reaction time) when subjects have to select partially occluded objects, the subjects performed their tasks with both occluded scenes (in which the target object was occluded by other objects) and non-occluded scenes (in which all the objects were clearly visible). The Transparent Cylinder and the Transparent Sphere were both involved in the evaluation in order to see the effect of the geometric shape in this approach.

6.2 Experimental Design The experiment was a within-subjects design, with one independent variable (the interaction technique) with three levels: the Transparent Sphere, the Transparent Cylinder and the Ray-casting. In order to avoid carry-over effects, the order of the three conditions was counterbalanced across subjects.
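
The text does not specify the exact counterbalancing scheme. Purely as a hypothetical illustration in C++, the sketch below assumes a full-permutation scheme: the 3! = 6 possible orders of the three conditions are cycled over the 24 subjects, so that each order is used four times.

    // Hypothetical sketch of a counterbalancing scheme (the actual scheme used
    // in the experiment is not described in the text).
    #include <algorithm>
    #include <array>
    #include <string>
    #include <vector>

    std::vector<std::array<std::string, 3>> conditionOrders(int subjects)
    {
        std::array<std::string, 3> base = {"Cylinder", "Ray", "Sphere"};
        std::sort(base.begin(), base.end());

        // Enumerate the six possible orders of the three conditions.
        std::vector<std::array<std::string, 3>> perms;
        do { perms.push_back(base); } while (std::next_permutation(base.begin(), base.end()));

        // Assign the orders cyclically: with 24 subjects each order occurs four times.
        std::vector<std::array<std::string, 3>> orders;
        for (int s = 0; s < subjects; ++s)
            orders.push_back(perms[s % perms.size()]);
        return orders;
    }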

The reaction time of every task was recorded.

6.3 Materials The materials of the experiment were composed as follows.

In all scenes, solid spheres were used as the objects to be selected and then placed at a specific point in space. In all scenarios, the sphere P01 (in green in Figure 4) was the object targeted for selection, while the sphere P02 (in pink in Figure 4) was the target of the placement task. The positions of all the spheres (including the green P01 and the pink P02) changed for every scene.

However, as it was mentioned before, half of the scenes displayed non-occluded target objects, while the other half displayed partially occluded target objects (see Appendix D).

Therefore, for the occluded scenes, the position of the target sphere (P01) was not chosen completely at random, because it had to be partially occluded; small adjustments to the spatial positions of the targets were therefore made.

Every subject performed the tasks with 66 scenes (22 scenes * 3 interaction techniques). However, only 48 objective measurements were taken, because every subject was engaged in 18 training trials (6 training trials * 3 interaction techniques). Half of the training trials comprised occluded targets and the other half non-occluded targets. The order of presentation of all the scenes was randomised (see Appendix C).

The three interaction techniques used in the experiment - the Ray-casting, the Transparent Sphere and the Transparent Cylinder - were designed as follows.

Ray-casting

This technique uses a red ray to interact with the 3D objects. The user controls the ray with a wand used as the input device. An object is highlighted in yellow when it is touched by the red ray (as in Figure 6-1(a)). The user clicks a wand button to select the highlighted object. The selected object then turns red and is attached to the ray (as in Figure 6-1(b)). The user can move the wand to move the selected object.

An example of using the Ray-casting for selection is shown in Figure 6-1.

Figure 6-1. The Ray-casting technique: a) the object intersected by the red ray is highlighted in yellow; b) after being selected, the object of interest (P1) turns red and is attached to the ray for manipulation.
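
For illustration, the following minimal C++ sketch (not the Open Inventor/CAVELib implementation of the test-bed) shows the core of a ray-casting picker for spherical objects: the nearest sphere hit by the wand ray is the candidate for highlighting and selection, which also makes clear why a fully occluded object can never be picked this way.

    // Minimal ray-casting sketch: pick the nearest sphere intersected by the wand ray.
    #include <cmath>
    #include <limits>
    #include <vector>

    struct Vec3 { float x, y, z; };
    static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    struct Sphere { Vec3 centre; float radius; };

    // Distance along the ray to the nearest intersection, or a negative value if
    // the ray misses the sphere. 'dir' is assumed to be normalised.
    float intersectRaySphere(Vec3 origin, Vec3 dir, const Sphere& s)
    {
        Vec3  oc   = sub(origin, s.centre);
        float b    = dot(oc, dir);
        float c    = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - c;
        if (disc < 0.0f) return -1.0f;   // no intersection
        return -b - std::sqrt(disc);     // nearest hit
    }

    // Index of the closest object touched by the ray, or -1 if none is touched.
    // An occluded object can never be returned: the nearer occluder always wins.
    int pickWithRay(Vec3 origin, Vec3 dir, const std::vector<Sphere>& objects)
    {
        int   best  = -1;
        float bestT = std::numeric_limits<float>::max();
        for (std::size_t i = 0; i < objects.size(); ++i) {
            float t = intersectRaySphere(origin, dir, objects[i]);
            if (t > 0.0f && t < bestT) { bestT = t; best = static_cast<int>(i); }
        }
        return best;  // highlight this object in yellow; select it on button press
    }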

Transparent Sphere

This technique uses a sphere to interact with the 3D objects. The user controls a sphere volume with a wand used as the input device. The sphere volume is transparent so that users can see the objects inside it; however, some visual cues of the sphere's border are rendered to keep the sphere visible (as in Figure 6-2(a)).

Figure 6-2. The Transparent Sphere technique: a) the objects located within the transparent sphere are highlighted in yellow; b) a ray is used to select a label (P3) on the pop-up menu; c) after being selected, the object P3 turns red, is moved to the centre of the sphere and is ready for manipulation.

The objects falling within the sphere volume are highlighted in yellow. To select the object of interest, the user first activates a pop-up menu by pressing and holding a wand button. The menu contains the labels of the objects lying within the sphere. A red ray is used to point to the items on the pop-up menu; the pointed item and the corresponding object turn red at the same time (as in Figure 6-2(b)). When pointing to the item of interest, the user releases the wand button to select the corresponding object. The menu and the red ray disappear once the selection step is done. The object of interest is moved to the centre of the sphere, ready for the manipulation task (as in Figure 6-2(c)). The user can move the wand to move the selected object.
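
A minimal C++ sketch of the containment step described above is given below, assuming objects are represented by their centre points; it is an illustration, not the test-bed code. Every object inside the sphere is highlighted and its label is collected for the pop-up menu, so the menu rather than the line of sight disambiguates occluded objects.

    // Minimal sketch: containment test of the Transparent Sphere and collection
    // of the labels to be shown on the pop-up menu.
    #include <string>
    #include <vector>

    struct Vec3 { float x, y, z; };

    struct Object3D {
        std::string label;        // e.g. "P3"
        Vec3        position;
        bool        highlighted = false;
    };

    static float squaredDistance(Vec3 a, Vec3 b)
    {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return dx * dx + dy * dy + dz * dz;
    }

    // Highlights the objects inside the sphere and returns the indices of the
    // objects whose labels appear as menu items.
    std::vector<std::size_t> collectInSphere(std::vector<Object3D>& objects,
                                             Vec3 sphereCentre, float sphereRadius)
    {
        std::vector<std::size_t> menuItems;
        for (std::size_t i = 0; i < objects.size(); ++i) {
            bool inside = squaredDistance(objects[i].position, sphereCentre)
                          <= sphereRadius * sphereRadius;
            objects[i].highlighted = inside;       // yellow highlight
            if (inside) menuItems.push_back(i);    // one menu label per object
        }
        return menuItems;
    }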

Transparent Cylinder

This technique uses a cylinder to interact with the 3D objects. The user controls a cylinder volume with a wand used as the input device. As with the Transparent Sphere, the cylinder volume is transparent and some visual cues of its border are rendered to keep the cylinder visible (as in Figure 6-3(a)).

Likewise, the objects falling within the cylinder volume are highlighted in yellow. To select the object of interest, the user first activates, by pressing and holding a wand button, a pop-up menu that contains the labels of the objects lying within the cylinder. A red ray is used to point to the items on the pop-up menu; the pointed item and the corresponding object turn red at the same time (as in Figure 6-3(b)). When pointing to the item of interest, the user releases the wand button to select the corresponding object. The menu and the red ray disappear once the selection step is done. The object of interest is moved to the axis of the cylinder, ready for the manipulation task (as in Figure 6-3(c)). The user can move the wand to move the selected object.

Figure 6-3. The Transparent Cylinder technique: a) the objects located within the transparent cylinder are highlighted; b) a ray is used to select a label (P1) on the pop-up menu; c) after being selected, the object P1 is ready for the manipulation task.
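
Under the same assumptions as the sphere sketch, the only part that changes for the Transparent Cylinder is the containment test: a point is inside the volume if it lies within the cylinder radius of the wand-controlled axis and between the two end caps. A minimal sketch:

    // Minimal sketch of the Transparent Cylinder containment test (illustration only).
    #include <cmath>

    struct Vec3 { float x, y, z; };
    static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // A and B are the end points of the cylinder axis controlled by the wand.
    bool insideCylinder(Vec3 point, Vec3 A, Vec3 B, float radius)
    {
        Vec3  axis = sub(B, A);
        float len2 = dot(axis, axis);
        if (len2 <= 0.0f) return false;              // degenerate cylinder
        float t = dot(sub(point, A), axis) / len2;   // projection onto the axis
        if (t < 0.0f || t > 1.0f) return false;      // beyond the end caps
        Vec3 closest = { A.x + t * axis.x, A.y + t * axis.y, A.z + t * axis.z };
        Vec3 d = sub(point, closest);
        return dot(d, d) <= radius * radius;         // within the lateral surface
    }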

Figure 6-4 shows snapshots of the experiment.

Figure 6-4. Snapshots of the test

6.4 Procedure and tasks A sheet with written instructions was provided to the participants. They were allowed to ask for further explanations about the task, but only before the beginning of the test. As mentioned before, every subject performed three sessions of trials: one with the Ray-casting, one with the Transparent Cylinder and one with the Transparent Sphere.

Each session started with six non-measured trials. After that, the subject had to perform sixteen measured trials: eight with occluded scenes and eight with non-occluded scenes, whose order of presentation was counterbalanced.

Every subject had to perform a selection task followed by a placement task. The tasks were as follows.

The selection task required the participants to select the target sphere (sphere P01) placed within the 3D scene. Then, they had to move the selected sphere from its initial position to the final position, indicated by the pink sphere (the sphere P02).

An automatic counter measured the time performances of every subject. For the selection task, the time was measured from the beginning of the test until the subject successfully selected the sphere (sphere P01). Then, another counter started: this counter measured the time used by every subject to move the selected sphere to the target position (defined by the pink sphere called P02).

The task was finished when the sphere was placed to the target position.
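
How the counters could be implemented is sketched below in C++ purely for illustration (the actual test-bed implementation is not described): one timer runs from the launch of the scene to the successful selection of P01, and a second one from that moment to the successful placement at P02; their sum is the performed time analysed in Section 6.7.

    // Illustrative sketch of the two measurement counters.
    #include <chrono>

    class TrialTimer {
        using clock = std::chrono::steady_clock;
        clock::time_point sceneStart, selectionDone;
    public:
        void startScene()    { sceneStart    = clock::now(); }  // subject launches the scene
        void markSelection() { selectionDone = clock::now(); }  // P01 successfully selected

        // Durations in seconds, as reported in Appendix E.
        double selectionTime() const {
            return std::chrono::duration<double>(selectionDone - sceneStart).count();
        }
        double placementTime() const {   // call when P01 has been placed at P02
            return std::chrono::duration<double>(clock::now() - selectionDone).count();
        }
        double performedTime() const { return selectionTime() + placementTime(); }
    };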

Every scene was launched under the control of the subjects.

After the test, a questionnaire was given to the participants. Questions used a 6-point Likert scale (ranging from strongly disagree to strongly agree), and blank space was left after each question for free comments and supplementary explanations.

When the subject had filled in the questionnaire, an informal discussion about the experiment was held.

6.5 Subjects Twenty-four subjects (seven females and seventeen males) working at the EUROCONTROL Experimental Centre took part in the investigation. Their average age was 38.5 years, ranging from 22 to 61.

6.6 Equipment In order to prevent fatigue and strain, we arranged a small stage with a chair in front of the BARCO monitor, at a distance of about 50 cm. The subjects were asked to sit while performing the task.

A test-bed was implemented using C++, Open Inventor version 3.11, CAVELib version 3.0.1 and an SGI Onyx2 workstation (512 Mbytes RAM). The scenes were displayed on a BARCO BARON projection system (the monitor was 136 cm x 102 cm). Subjects wore Stereographics Crystal Eyes glasses (120 Hz refresh rate) equipped with the Intersense IS-900 VWT tracking system. A six-degree-of-freedom (6DOF) Intersense tracked wand with a button and a joystick was used as the interaction device.

6.7 Experimental Results Because the Transparent Cylinder and Transparent Sphere techniques are intended to help users overcome the difficulties of dealing with occluded scenes, we concentrate mainly on the experimental results for the occluded scenes. However, the results for the mixed scenes and the non-occluded scenes are also analysed to give a complete view of the performance of the Transparent Sphere and the Transparent Cylinder in the different cases.

The Friedman test is used to check whether there is any difference among the three levels (Transparent Cylinder, Transparent Sphere and Ray-casting). If no difference is found, the analysis stops there. Otherwise, pairwise comparisons using the non-parametric Wilcoxon test are applied to the three pairs – (Transparent Cylinder, Ray-casting), (Transparent Sphere, Ray-casting) and (Transparent Cylinder, Transparent Sphere) – in order to see which technique performs better in each pair.

First, the performed time (the sum of the selection time and the placement time) is analysed, followed by the analyses of the selection time and the placement time. Statistica is used to analyse the experimental results.

The following sections analyse, in order, the occluded scenes, the mixed scenes and the non-occluded scenes. The mean values are reported with the Friedman test and the median values with the Wilcoxon test.
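
The statistical analysis itself was done with Statistica; purely as an illustration of the procedure, the sketch below computes the Friedman statistic in C++ from the per-subject times of the three conditions (for example, one column triple of Appendix E). With k = 3 conditions the statistic has 2 degrees of freedom, so p < .05 corresponds to a value greater than 5.991.

    // Illustrative Friedman test: rank the three conditions within each subject,
    // then compute chi-square_F = 12 / (n*k*(k+1)) * sum(Rj^2) - 3*n*(k+1).
    #include <array>
    #include <vector>

    // times[s] = {Cylinder, Ray, Sphere} times of subject s.
    double friedmanStatistic(const std::vector<std::array<double, 3>>& times)
    {
        const int    k = 3;
        const double n = static_cast<double>(times.size());
        std::array<double, 3> rankSum = {0.0, 0.0, 0.0};

        for (const auto& t : times) {
            for (int i = 0; i < k; ++i) {
                double rank = 1.0;                       // smallest time gets rank 1
                for (int j = 0; j < k; ++j) {
                    if (j == i) continue;
                    if (t[j] < t[i])        rank += 1.0;
                    else if (t[j] == t[i])  rank += 0.5; // average ranks for ties
                }
                rankSum[i] += rank;
            }
        }
        double sumR2 = 0.0;
        for (double r : rankSum) sumR2 += r * r;
        return 12.0 / (n * k * (k + 1)) * sumR2 - 3.0 * n * (k + 1);
    }

If the statistic is significant, the pairwise Wilcoxon comparisons described above are then run on the same data.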

6.7.1 Occluded scenes

6.7.1.1 Performed time

The means of the performed time are represented in Figure 6-5. The mean value is 12.013 seconds (standard deviation (SD): 2.779 seconds) for the ray-casting, 10.317 seconds (SD: 2.355 seconds) for the Transparent Cylinder and 9.820 seconds (SD: 2.103 seconds) for the Transparent Sphere. The standard deviations are also shown in Figure 6-5 (in parentheses).

Figure 6-5. Results of the performed time with the occluded scenes (medians and means, with standard deviations in parentheses, for the Cylinder, Ray and Sphere conditions)

The difference among the three conditions is significant (Friedman test, p<.05). Therefore, pairwise comparisons are applied to the three pairs: (Transparent Cylinder, Ray-casting), (Transparent Sphere, Ray-casting) and (Transparent Cylinder, Transparent Sphere). The median value is 11.987 seconds for the ray-casting, 9.877 seconds for the Transparent Cylinder and 9.575 seconds for the Transparent Sphere.

• (Transparent Cylinder, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Transparent Cylinder technique offers a better performed time than the ray-casting technique.

• (Transparent Sphere, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Transparent Sphere technique offers a better performed time than the ray-casting technique.

• (Transparent Cylinder, Transparent Sphere): the difference between the two conditions is not significant.

6.7.1.2 Selection time

The means of the selection time are represented in Figure 6-6. The mean value is 7.721 seconds (SD: 1.864 seconds) for the ray-casting, 6.007 seconds (SD: 1.303 seconds) for the Transparent Cylinder and 6.285 seconds (SD: 1.552 seconds) for the Transparent Sphere. The standard deviations are also shown in Figure 6-6 (in parentheses).

The difference among the three conditions is significant (Friedman test, p<.05). Therefore, pairwise comparisons are applied to the three pairs – (Transparent Cylinder, Ray-casting), (Transparent Sphere, Ray-casting) and (Transparent Cylinder, Transparent Sphere). The median value is 7.874 seconds for the ray-casting, 6.016 seconds for the Transparent Cylinder and 5.838 seconds for the Transparent Sphere.

Figure 6-6. Results of the selection time with the occluded scenes (medians and means, with standard deviations in parentheses, for the Cylinder, Ray and Sphere conditions)

• (Transparent Cylinder, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Transparent Cylinder offers a better selection time than the ray-casting.

• (Transparent Sphere, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Transparent Sphere offers a better selection time than the ray-casting.

• (Transparent Cylinder, Transparent Sphere): the difference between the two conditions is not significant.

6.7.1.3 Placement time

The means of the placement time are represented in Figure 6-7. The mean value is 4.310 seconds (SD: 1.450 seconds) for the Ray-casting, 4.310 seconds (SD: 1.610 seconds) for the Transparent Cylinder and 3.535 seconds (SD: 1.094 seconds) for the Transparent Sphere. The standard deviations are also shown in Figure 6-7 (in parentheses).

The difference among the three conditions is significant (Friedman test, p<.05). Therefore, pairwise comparisons are applied to the three pairs – (Transparent Cylinder, Ray-casting), (Transparent Sphere, Ray-casting) and (Transparent Cylinder, Transparent Sphere). The median value is 4.132 seconds for the Ray-casting, 3.859 seconds for the Transparent Cylinder and 3.345 seconds for the Transparent Sphere.

Figure 6-7. Results of the placement time with the occluded scenes (medians and means, with standard deviations in parentheses, for the Cylinder, Ray and Sphere conditions)

• (Transparent Cylinder, Ray-casting): the difference between the two conditions is not significant.

• (Transparent Sphere, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Transparent Sphere offers a better placement time than the ray-casting.

• (Transparent Cylinder, Transparent Sphere): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Transparent Sphere offers a better placement time than the Transparent Cylinder.

6.7.2 Mixed scenes

6.7.2.1 Performed time

The means of the performed time are represented in Figure 6-8. The mean value is 10.116 seconds (SD: 2.133 seconds) for the ray-casting, 9.921 seconds (SD: 1.951 seconds) for the Transparent Cylinder and 9.857 seconds (SD: 2.289 seconds) for the Transparent Sphere. The standard deviations are also shown in Figure 6-8 (in parentheses).

The difference among the three conditions is not significant (Friedman test, p>.05). Therefore, there is no difference among the three techniques in terms of performed time.

Figure 6-8. Results of the performed time with the mixed scenes (means with standard deviations in parentheses for the Cylinder, Ray and Sphere conditions)

6.7.2.2 Selection time

The means of the selection time are represented in Figure 6-9. The mean value is 6.009 seconds (SD: 1.264 seconds) for the ray-casting, 5.817 seconds (SD: 1.089 seconds) for the Transparent Cylinder and 6.072 seconds (SD: 1.427 seconds) for the Transparent Sphere. The standard deviations are also shown in Figure 6-9 (in parentheses).

Figure 6-9. Results of the selection time with the mixed scenes (means with standard deviations in parentheses for the Cylinder, Ray and Sphere conditions)

The difference among the three conditions is not significant (Friedman test, p>.05). Therefore, there is no difference among the three techniques in terms of selection time.

6.7.2.3 Placement time

The means of the placement time are represented in Figure 6-10. The mean value is 4.107 seconds (SD: 1.208 seconds) for the ray-casting, 4.104 seconds (SD: 1.279 seconds) for the Transparent Cylinder and 3.785 seconds (SD: 1.158 seconds) for the Transparent Sphere. The standard deviations are also shown in Figure 6-10 (in parentheses).

Figure 6-10. Results of the placement time with the mixed scenes (medians and means, with standard deviations in parentheses, for the Cylinder, Ray and Sphere conditions)

The difference among the three conditions is significant (Friedman test, p<.05). Therefore, pairwise comparisons are applied to the three pairs: (Transparent Cylinder, Ray-casting), (Transparent Sphere, Ray-casting) and (Transparent Cylinder, Transparent Sphere). The median value is 3.956 seconds for the ray-casting technique, 3.641 seconds for the Transparent Cylinder and 3.658 seconds for the Transparent Sphere.

• (Transparent Cylinder, Ray-casting): the difference between the two conditions is not significant.

• (Transparent Sphere, Ray-casting): the difference between the two conditions is not significant.

• (Transparent Cylinder, Transparent Sphere): the difference between the two conditions is not significant.

Therefore, there is no significant difference among the three techniques in terms of placement time in the case of mixed scenes.

6.7.3 Non-occluded scenes

6.7.3.1 Performed time

The means of the performed time are represented in Figure 6-11. The mean value is 8.220 seconds (SD: 1.896 seconds) for the ray-casting, 9.663 seconds (SD: 1.926 seconds) for the Transparent Cylinder and 9.894 seconds (SD: 2.744 seconds) for the Transparent Sphere.

The difference among the three conditions is significant (Friedman test, p<.05). Therefore, pairwise comparisons are applied to the three pairs: (Transparent Cylinder, Ray-casting), (Transparent Sphere, Ray-casting) and (Transparent Cylinder, Transparent Sphere). The median value is 7.742 seconds for the ray-casting, 9.492 seconds for the Transparent Cylinder and 8.975 seconds for the Transparent Sphere.

• (Transparent Cylinder, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Ray-casting offers a better performed time than the Transparent Cylinder.

• (Transparent Sphere, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Ray-casting offers a better performed time than the Transparent Sphere.

• (Transparent Cylinder, Transparent Sphere): the difference between the two conditions is not significant.

Figure 6-11. Results of the performed time with the non-occluded scenes (medians and means, with standard deviations in parentheses, for the Cylinder, Ray and Sphere conditions)

6.7.3.2 Selection time

The means of the selection time are represented in Figure 6-12. The mean value is 4.312 seconds (SD: 1.016 seconds) for the ray-casting, 5.645 seconds (SD: 1.117 seconds) for the Transparent Cylinder and 5.860 seconds (SD: 1.691 seconds) for the Transparent Sphere.

The difference among the three conditions is significant (Friedman test, p<.05). Therefore, pairwise comparisons are applied to the three pairs: (Transparent Cylinder, Ray-casting), (Transparent Sphere, Ray-casting) and (Transparent Cylinder, Transparent Sphere). The median value is 4.078 seconds for the ray-casting, 5.494 seconds for the Transparent Cylinder and 5.481 seconds for the Transparent Sphere.

• (Transparent Cylinder, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Ray-casting offers a better selection time than the Transparent Cylinder.

• (Transparent Sphere, Ray-casting): the difference between the two conditions is significant (Wilcoxon test, p<.05). Therefore, the Ray-casting offers a better selection time than the Transparent Sphere.

• (Transparent Cylinder, Transparent Sphere): the difference between the two conditions is not significant.

Figure 6-12. Results of the selection time with the non-occluded scenes (medians and means, with standard deviations in parentheses, for the Cylinder, Ray and Sphere conditions)

6.7.3.3 Placement time

The means of the placement time are represented in Figure 6-13. The mean value is 3.909 seconds (SD: 1.071 seconds) for the ray-casting, 4.018 seconds (SD: 1.232 seconds) for the Transparent Cylinder and 4.034 seconds (SD: 1.405 seconds) for the Transparent Sphere. The standard deviations are also shown in Figure 6-13 (in parentheses).

The difference among the three conditions is not significant (Friedman test, p>.05). Therefore, there is no difference among the three techniques in terms of placement time.

Figure 6-13. Results of the placement time with the non-occluded scenes (means with standard deviations in parentheses for the Cylinder, Ray and Sphere conditions)

6.7.4 Questionnaire

The data derived from the questionnaires provide the following results (represented in Figure 6-14). As previously stated, answers were given on a 6-point Likert scale.

Floating 2D Menu – For the question regarding the ease of use of the floating 2D menu, the data show a mean value of 5.04. This indicates a favourable opinion on the use of the floating menu.

Performance – For the question regarding the self-assessment of performance in terms of selection speed, the data show a mean value of 3.67 for the ray-casting technique, 3.67 for the Transparent Cylinder and 3.96 for the Transparent Sphere, with no substantial difference between the three conditions (Friedman test, p>.05). For the question regarding the self-assessment of performance in terms of overall test-performing speed, the data show a mean value of 3.38 for the ray-casting technique, 3.63 for the Transparent Cylinder and 3.58 for the Transparent Sphere, with no substantial difference between the three conditions (Friedman test, p>.05).

Usability – For the questions regarding the self-assessment of usability (ease of use), the data show a mean value of 4.04 for the ray-casting technique, 4.33 for the Transparent Sphere and 4.08 for the Transparent Cylinder, with no substantial difference between the three conditions (Friedman test, p>.05).

Enjoyment – For the questions regarding the self-assessment of enjoyment, the data show a mean value of 4.67 for the ray-casting technique, 4.96 for the Transparent Sphere and 4.67 for the Transparent Cylinder, with no substantial difference between the three conditions (Friedman test, p>.05).

In all questions, the scores received by the Transparent Cylinder and the Transparent Sphere techniques are equal to or higher than the scores received by the ray-casting technique. Although these differences are not statistically significant, they indicate a favourable opinion of the Transparent Cylinder and the Transparent Sphere techniques.

Figure 6-14. Statistical results of the questionnaire (mean scores for "help perform quicker", "easy to use", "help select quicker", "enjoyable" and "menu is easy to use", per condition: Cylinder, Ray, Sphere)

In the comments received from the questionnaires and in the short discussions held with each test subject, two main problems were revealed: depth perception and, in the case of the Transparent Cylinder technique, the position of the menu.

Position of the menu – In the Transparent Cylinder technique, the menu is placed at a point on the axis of the cylinder that is close to the user (Figure 6-15). Because the cylinder is long, test subjects sometimes had to shift their focus from the object (sphere) to the menu, which can cause extra mental workload. This stresses the importance of placing the menu in relation to the user's field of regard.

Figure 6-15. The problem of focus in the Transparent Cylinder technique: the user has to change the focus point between the menu, placed near the input device (hand), and the objects farther along the cylinder axis.

Depth perception – During the test, users had some difficulty judging object positions and the distances between objects in depth. Sometimes they could not decide whether an object was behind or in front of another one.

More detailed feedback from the test subjects can be found in Appendix G.

6.8 Discussion

6.8.1 Occluded scenes

In the case of occluded scenes, the Transparent Cylinder and the Transparent Sphere techniques demonstrate a better performance than the ray-casting technique in terms of performed time (i.e. the sum of the selection time and the placement time).

This better performance is based mainly on the selection time. In fact, both Transparent Sphere and Transparent Cylinder report a better selection time than the ray-casting technique.

In terms of placement time, the Transparent Sphere shows a better time than both the ray-casting technique and the Transparent Cylinder technique, while there is no difference between the Transparent Cylinder and the ray-casting. This shows that the sphere supports users better than the cylinder in the placement task (in the case of occluded scenes), and it also stresses the importance of choosing the shape of the selection volume for the interaction techniques derived from the Selection-By-Volume approach.

6.8.2 Mixed scenes

In the case of mixed scenes, there is no difference among the three techniques in terms of performed time, selection time or placement time. Assuming that the occluded and non-occluded scenes cover all possible scenarios, this result shows that the Transparent Cylinder and the Transparent Sphere can also be used in place of the ray-casting technique for the selection and placement tasks.

6.8.3 Non-occluded scenes

In the case of non-occluded scenes, the ray-casting technique is expected to be better than the Transparent Cylinder technique, because selection with the Transparent Cylinder is performed in two steps while the ray-casting offers one-step selection. The experimental results confirm that the ray-casting technique provides a better performance than the Transparent Cylinder and the Transparent Sphere techniques in terms of performed time and selection time.

6.8.4 Menu

The use of the floating 2D menu received a high score (5.04/6) and positive feedback from the test subjects. The floating 2D menu should be considered as a complement to existing 3D interaction techniques: it can be used to show different options for a manipulation task, or to help an existing technique solve the problem of occlusion.

6.9 Conclusion The experimental results show that the Transparent Sphere and the Transparent Cylinder, two interaction techniques derived from the Selection-By-Volume approach, can effectively solve the problem of occlusion.

In addition, they offer the same performance as the ray-casting technique in the case of mixed scenes. Together with the positive reception of these two techniques by the test subjects, it is reasonable to conclude that the Transparent Cylinder and the Transparent Sphere can be used for interaction with 3D scenes, especially scenes with a high level of occlusion.

The experimental results also confirm and strengthen the Selection-By-Volume approach. The shape of the selection volume and the placement of the menu, two of the main issues of this approach, were shown to have a considerable effect on the performance of the interaction techniques derived from it.

Two other issues were also revealed by the experiments: depth perception and the use of a menu as a complement to existing interaction techniques. Depth cues should be added to help the user judge the relative position of an object in 3D scenes. A menu can be added to existing techniques to help them deal with the problem of occlusion or with complex manipulations such as the object modification and informative query tasks.

Chapter 7

CONCLUSIONS & FUTURE WORK

Chapter 7 - Conclusions & Future Work

7.1 Conclusions As part of a multidisciplinary framework for the empirical analysis of the applicability of Virtual Environment-based 3D user interfaces to air traffic control (the 3D-ATC environment), this thesis contributes to the investigation of interaction in the 3D-ATC environment through an analytic and empirical approach.

Above all, this thesis concentrates on identifying the key 3D interaction tasks and the existing 3D interaction techniques that can be used in the context of a 3D-ATC environment. Some technology-driven studies of the 3D-ATC environment expose a lack of consideration of interaction features, even though such features complement the visualization features in providing effective information to the user. An analysis of the interaction features of a 3D-ATC environment is thus useful and required.

To perform the analysis, the present thesis considers the interaction features of a 3D-ATC environment through the relationships among four components: Object, View, Interaction Task and Interaction Technique. The interaction tasks include the navigation task, the object selection task and the object manipulation task; the object manipulation task comprises the object placement task, the object modification task and the informative query. Navigation is considered as the interaction task performed on the View, while object selection and manipulation are considered as interaction tasks performed on the Object. Object refers to the visual representation of ATC information, such as a sector or a flight. The analysis attempted to identify the appropriate interaction techniques for two contexts: interaction tasks with the Object (object selection and manipulation) and interaction tasks with the View (navigation).

The performed analysis recognized that the ray-casting technique could be the key technique for the object selection and manipulation tasks. Four different combinations for object selection and manipulation, built mainly on the ray-casting technique, were proposed, as follows.

1. Using the ray-casting technique for the object selection task and the object placement task, combined with the graphical menu for the object modification and informative query tasks.

2. Using the ray-casting technique for object selection, combined with the graphical menu for the object modification and informative query tasks, and using the virtual hand technique for the object placement task.

3. Using voice command for the object selection task, the ray-casting technique for the object placement task, and the combination of the ray-casting technique and the graphical menu for the object modification and informative query tasks.

4. Using voice command for the object selection task, the virtual hand technique for the object placement task, and the combination of the ray-casting technique and the graphical menu for the object modification and informative query tasks.

The performed analysis also identified two interaction techniques for the navigation task in a 3D-ATC environment: the combination of the Flying Vehicle technique with wayfinding aids, and the World-In-Miniature technique.

The analysis can be considered as the first contribution of the thesis. This analysis contributes to the research of 3D user interfaces for ATC.

The ray-casting technique was chosen as the first technique for the usability evaluation. Training sessions were organized for controllers to use the ray-casting technique, and the problem of interaction with occluded objects (in short, the problem of occlusion) was discovered thanks to these training sessions. The problem of occlusion reduces the user's performance in the object selection and manipulation tasks; however, no previous work had taken this problem into consideration.

The available interaction techniques do not solve this problem well, so a new interaction technique was required. This is the reason why the Selection-By-Volume approach was proposed. Selection-By-Volume can be considered a new approach for designing interaction techniques. The interaction techniques created from this approach, the Transparent Sphere and the Transparent Cylinder, were empirically shown to solve the problem of interaction with occluded objects and can be used for interaction with 3D objects in general.

The Selection-By-Volume approach and the two derived interaction techniques, the Transparent Sphere and the Transparent Cylinder, are the second contribution of the thesis. They contribute to solving the problem of interaction with occluded objects. Moreover, the Selection-By-Volume approach is a contribution to the research on 3D user interfaces in general.

7.2 Future work The work performed in this thesis contributes to the research on the 3D-ATC environment and on 3D User Interfaces. Several lines of work can be continued from it; they can be organized along two main axes, as follows.

7.2.1 The analysis on Object, View, Interaction Task and Interaction Technique

The analysis performed in Chapter 4 identified the interaction techniques that can be used for the interaction tasks in a 3D-ATC environment. Different combinations were proposed for the selection, manipulation and navigation tasks.

In fact, this analysis proposed four combinations for the object selection and manipulation task.

1. Combination of the ray-casting technique and the graphical menu: the ray-casting technique for the object selection and object placement tasks, combined with the graphical menu for the object modification and informative query tasks.

2. Combination of the HOMER technique and the graphical menu: the ray-casting technique for object selection, combined with the graphical menu for the object modification and informative query tasks, and the virtual hand technique for object placement.

3. Combination of voice command, the ray-casting technique and the graphical menu: voice command for the object selection task, the ray-casting technique for the object placement task, combined with the graphical menu for the object modification and informative query tasks.

4. Combination of voice command, the virtual hand technique, the ray-casting technique and the graphical menu: voice command for the object selection task, the virtual hand technique for the object placement task, and a combination of the ray-casting technique and the graphical menu for the object modification and informative query tasks.

Following the evaluation of the ray-casting technique, evaluations of three further techniques - the virtual hand, the combination of ray-casting and graphical menu, and the voice command - should be organized. Then the usability of the four combinations should be evaluated. These evaluations can help to identify potential interaction problems and, finally, the most suitable combinations for the object selection and object manipulation tasks.

This analysis also identified that both the combination of the Flying Vehicle technique with wayfinding aids (the map, artificial landmarks and trails) and the World-In-Miniature technique can be suitable for navigation in a 3D-ATC environment.

The usage of the three main wayfinding cues – the map, the artificial landmarks and the trails – in combination with the Flying Vehicle technique should be evaluated first. The criterion for evaluation is the amount of disorientation during navigation in the 3D-ATC environment. Wayfinding cues are known to be effective in reducing disorientation in 3D space, and the future work should exploit their use for navigation in the 3D-ATC environment. Two evaluations should be organized for this purpose.

1. Effect of wayfinding aids in navigation. This experiment will compare the amount of disorientation between navigation without any wayfinding cue and navigation with a simple wayfinding cue (a map can be used in this case). The experimental result will confirm the usefulness of wayfinding cues. Although wayfinding cues are known to be indispensable in navigation, as stated in several studies [Darken 1996b], such an experiment in the context of a 3D-ATC environment is still helpful.

2. Different wayfinding cues in navigation. This experiment will compare the effect of different wayfinding cues in navigation, and will help to identify a set of useful wayfinding tools that can be included in a 3D-ATC environment.

These evaluations can help to identify a suitable way to combine the wayfinding cues with the Flying Vehicle technique. Then, an evaluation comparing this combination with the World-In-Miniature technique should be carried out to find the most appropriate technique for navigation in a 3D-ATC environment.

7.2.2 The Selection-By-Volume approach

The 2D menu system and the selection volume are the two main components of the interaction techniques derived from the Selection-By-Volume approach. The choice of selection volume and menu representation has a considerable effect on the performance of the derived techniques, as can be seen in the performance of the Transparent Sphere and the Transparent Cylinder in Chapter 6. Further work can be carried out on the choice of the selection volume, the menu representation and the menu placement.

1. Selection volume. The future work will concentrate on two of the main features of the selection volume: its size and its shape.

• Shape – The sphere and the cylinder were intuitively chosen to demonstrate the interaction techniques derived from the Selection-By-Volume approach. Although the Transparent Sphere and the Transparent Cylinder are effective in dealing with occluded objects, we do not yet know which shape of selection volume is the most suitable. Therefore, more studies should be conducted to find the best way to choose the selection volume.

• Size – In the experiment with the Transparent Sphere and the Transparent Cylinder, the sizes of the sphere and the cylinder were calibrated based on the geometry of the scene. As analysed in Chapter 6, the size has a considerable effect on the performance of the interaction techniques derived from this approach. What are the criteria and the strategy for determining the size of the volume? For example, should we use an adaptable size, automatically adjusted to every new scene, or an adjustable size controlled by the user instead? More studies should therefore be conducted to find an efficient way to control the size of the selection volume.

2. Menu Representation. The future work will concentrate mainly on the choice of the menu. The pop-up menu, intuitively chosen in the design of the Transparent Sphere and Transparent Cylinder techniques, received many positive comments from the users. However, a comparison with other types of menu representation should be organized to identify the characteristics of a good menu representation.

3. Menu Placement. The position of 2D menu can occlude the objects located in the selection volume (as stated in the feedbacks of test subjects in Chapter 6). More studies should be performed to find out the most appropriate way for menu placement.

Another line of work will be the evaluation of the Selection-By-Volume approach on moving objects. The selection volume should, in theory, make it easier to select moving objects; however, empirical evidence should be provided to validate this.

As a result, the future work will strengthen the Selection-By-Volume approach and help to further explore its potential. This contributes to the development of 3D User Interfaces.

APPENDICES

Appendix A – Acronyms

2D Two-dimensional

3D Three-dimensional

3D UI 3D User Interface

ATC Air Traffic Control

ATCOs Air Traffic Controllers

ATM Air Traffic Management

DOF Degree of Freedom

ERC EUROCONTROL Research Centre

HCI Human Computer Interaction

HOMER Hand-Centered Object Manipulation Extending Ray-casting

NVIS Norrköpings Visualisation & Interaction Studio

VE Virtual Environment

WIM World In Miniature

WIMP Window, Icon, Mouse, Pointer

Appendix B – Questionnaire

NAME:                AGE:
Please mark the number on the scale.

1. I performed quicker with the ray (than with the cylinder and the sphere)
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

2. The test with the transparent sphere was enjoyable
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

3. The ray is easy to use
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

4. I performed quicker with the transparent sphere (than with the cylinder and the ray)
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

5. The test with the ray was enjoyable
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

6. The transparent sphere is easy to use
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

7. I performed quicker with the transparent cylinder (than with the cylinder and the sphere)
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

8. The floating menu is easy to see
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

9. I selected objects quicker with the transparent cylinder (than with the ray and the sphere)
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

10. I selected objects quicker with the ray (than with the cylinder and the sphere)
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

11. The test with the transparent cylinder was enjoyable
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

12. I selected objects quicker with the transparent sphere (than with the cylinder and the ray)
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

13. The transparent cylinder is easy to use
   1 Strongly Disagree   2 Disagree   3 Moderately Disagree   4 Moderately Agree   5 Agree   6 Strongly Agree
   Other comments:

Special comments:

Appendix C – Orders of the scenes

ID ORDER (RAY)

1   14 9 12 15 11 13 18 7 10 20 21 06 17 08 16 19
2   20 19 10 06 21 13 15 18 07 08 17 16 11 14 09 12
3   08 09 18 11 15 07 17 16 10 13 21 20 06 12 19 14
4   08 20 11 16 19 18 15 13 09 17 14 06 21 12 07 10
5   11 08 13 09 14 15 10 16 21 06 17 18 19 12 20 07
6   08 06 12 11 19 10 13 07 16 15 09 18 14 17 20 21
7   17 06 08 19 07 18 16 15 13 20 09 10 21 11 14 12
8   07 12 14 11 16 20 19 18 15 13 06 07 09 08 10 21
9   09 13 17 10 18 15 14 20 11 16 19 07 21 06 08 12
10  13 07 11 19 15 10 17 12 20 16 09 21 08 06 18 14
11  15 10 13 18 11 12 07 19 14 20 16 21 17 08 06 09
12  08 07 12 09 18 06 20 16 10 17 14 15 11 13 19 21
13  08 18 10 13 12 19 20 09 11 17 14 21 15 06 16 07
14  18 11 16 20 08 06 09 12 10 13 14 07 19 15 17 21
15  16 18 13 20 14 21 07 10 19 11 08 12 06 15 09 17
16  21 10 13 11 18 14 12 19 09 07 16 15 08 17 20 06
17  16 11 19 12 06 13 14 20 18 08 15 09 10 07 21 17
18  17 11 18 20 13 06 15 12 16 10 19 09 08 21 07 14
19  08 15 07 13 18 16 10 12 21 09 20 17 19 11 14 06
20  11 12 09 17 15 06 13 14 19 20 21 07 10 18 16 08
21  07 13 16 18 12 08 14 17 09 06 20 10 21 11 19 15
22  10 14 17 11 16 08 15 13 12 09 21 06 20 07 19 18
23  14 06 09 15 11 16 07 08 19 10 20 13 21 18 17 12
24  06 14 15 13 07 11 16 12 17 19 18 21 20 10 08 09

Table 1. Order of scenes in case of the ray-casting technique

ID ORDER (CYLINDER)

1   20 18 09 12 11 15 14 10 08 06 07 16 19 17 13 21
2   21 12 11 16 10 14 19 08 06 09 18 07 17 13 20 15
3   20 07 10 15 12 13 21 16 17 08 14 18 11 19 06 09
4   10 11 17 18 07 16 20 06 21 13 19 09 14 12 15 08
5   14 08 09 07 11 06 18 13 15 17 20 10 21 12 19 16
6   15 20 16 17 10 06 08 19 11 13 21 07 12 09 18 14
7   06 18 13 20 10 16 15 17 19 14 21 11 08 12 09 07
8   13 14 08 19 21 06 18 07 12 11 20 09 17 10 15 16
9   08 09 10 12 13 15 19 06 17 20 18 14 11 16 21 07
10  07 09 18 17 10 13 08 21 06 14 16 11 19 15 12 20
11  18 12 10 11 16 09 13 15 08 14 07 17 06 21 20 19
12  10 12 06 09 07 15 11 14 19 13 21 18 20 08 16 17
13  20 21 09 07 15 06 17 18 16 08 11 13 12 14 10 19
14  15 13 07 10 11 20 09 14 06 19 12 18 21 08 17 16
15  14 13 10 17 20 08 11 12 15 21 19 09 06 07 18 16
16  12 08 17 07 13 11 19 14 15 10 16 06 21 09 18 20
17  07 09 14 12 18 06 21 15 17 13 20 08 11 19 10 16
18  15 13 21 16 12 19 10 11 20 18 07 06 08 14 17 09
19  16 11 08 07 06 12 15 19 14 21 17 09 18 20 10 13
20  07 19 16 11 14 13 21 17 18 20 06 10 15 08 12 09
21  06 11 16 08 20 19 17 12 09 18 15 14 10 13 07 21
22  09 06 12 20 17 21 15 16 10 07 13 19 18 14 08 11
23  07 13 14 06 16 18 15 17 12 19 10 11 20 21 09 08
24  10 12 18 14 16 19 17 13 21 07 15 20 11 06 08 09

Table 2. Order of scenes in case of the Transparent Cylinder technique

ID ORDER (SPHERE)

1   13 15 09 14 17 12 10 16 06 07 19 11 21 08 20 18
2   18 17 10 11 16 12 13 09 21 07 06 20 19 08 14 15
3   09 10 19 11 08 16 20 14 13 12 06 18 21 17 15 07
4   19 12 08 15 07 06 20 11 17 10 14 13 18 09 21 16
5   20 14 19 06 17 21 12 15 09 08 13 10 07 18 16 11
6   13 14 16 20 21 17 11 12 07 19 06 08 10 09 15 18
7   19 13 11 08 16 12 10 17 15 21 09 06 14 18 07 20
8   07 19 12 06 13 18 16 11 15 14 21 08 20 09 17 10
9   14 07 09 10 15 11 18 17 16 08 19 20 13 06 12 21
10  10 17 08 13 11 15 19 16 14 18 09 06 20 21 07 12
11  21 08 14 09 17 18 12 16 07 11 06 13 15 19 10 20
12  15 06 08 13 09 21 10 17 16 14 11 19 20 18 12 07
13  13 10 20 15 06 11 16 14 08 12 19 18 07 21 09 17
14  09 15 08 17 19 11 14 10 13 20 18 07 06 12 16 21
15  18 08 13 17 12 15 21 20 09 06 16 07 19 10 11 14
16  10 07 21 15 09 18 12 14 17 11 16 13 19 08 20 06
17  19 20 09 11 10 17 12 13 16 21 07 15 08 18 06 14
18  21 12 18 09 11 14 06 16 20 08 17 10 07 13 19 15
19  13 08 20 14 21 15 19 16 06 07 10 11 12 09 17 18
20  13 10 08 17 14 06 15 20 12 18 07 11 21 09 16 19
21  09 13 10 17 06 15 07 14 12 11 21 16 20 19 08 18
22  11 08 12 18 17 14 13 07 15 06 16 20 10 21 09 19
23  06 12 08 11 20 15 14 16 13 10 18 07 21 19 17 09
24  10 09 18 07 06 19 08 20 12 21 13 16 17 15 11 14

Table 3. Order of scenes in case of the Transparent Sphere technique

Appendix D – Scenarios

Figure 0-1. Scene 00 (trial)

Figure 0-2. Scene 01 (trial)

Figure 0-3. Scene 02 (trial)

Figure 0-4. Scene 03 (trial)

Figure 0-5. Scene 04 (trial)

Figure 0-6. Scene 05 (trial)

Figure 0-7. Scene 06 (test)

Figure 0-8. Scene 07 (test)

Figure 0-9. Scene 08 (test)

Figure 0-10. Scene 09 (test)

Figure 0-11. Scene 10 (test)

Figure 0-12. Scene 11 (test)

Figure 0-13. Scene 12 (test)

Figure 0-14. Scene 13 (test)

Figure 0-15. Scene 14 (test)

Figure 0-16. Scene 15 (test)

Figure 0-17. Scene 16 (test)

Figure 0-18. Scene 17 (test)

Figure 0-19. Scene 18 (test)

Figure 0-20. Scene 19 (test)

Figure 0-21. Scene 20 (test)

Figure 0-22. Scene 21 (test)

Appendix E – Numerical Results

Selection (Cylinder / Ray / Sphere)    Placement (Cylinder / Ray / Sphere)    Total (Cylinder / Ray / Sphere)
4.195  4.543  4.040    2.956  3.162  3.198    7.151   7.706   7.238
5.424  6.719  5.324    4.135  5.403  4.004    9.559   12.122  9.327
4.537  4.286  4.510    4.214  2.972  2.487    8.751   7.258   6.997
5.016  4.201  5.228    3.504  3.012  3.857    8.520   7.212   9.085
5.837  5.132  5.557    6.925  5.536  4.865    12.761  10.667  10.423
6.790  5.384  5.479    4.217  5.403  3.674    11.007  10.787  9.153
4.979  7.416  5.046    3.863  5.819  3.699    8.843   13.235  8.745
5.496  6.823  7.211    3.190  4.986  6.927    8.686   11.809  14.138
4.228  6.072  5.758    3.092  3.569  3.086    7.320   9.641   8.844
5.377  6.568  4.782    3.760  4.473  3.136    9.137   11.041  7.917
6.330  8.473  6.888    6.829  5.834  4.642    13.159  14.307  11.530
6.690  7.709  4.738    3.522  3.304  3.106    10.212  11.013  7.844
7.357  7.053  9.382    3.505  3.762  3.448    10.862  10.816  12.830
7.798  5.485  5.592    2.725  2.858  2.588    10.523  8.343   8.180
4.283  5.780  6.041    2.724  2.415  2.847    7.007   8.195   8.888
6.634  4.814  5.380    3.508  2.347  2.692    10.142  7.161   8.072
7.480  7.855  7.097    5.807  4.525  4.223    13.287  12.380  11.319
5.788  7.176  6.123    5.182  6.564  3.641    10.970  13.740  9.764
5.089  4.821  8.985    2.867  3.572  4.666    7.957   8.393   13.651
5.525  5.012  8.235    4.088  4.150  6.547    9.613   9.162   14.782
6.145  6.313  7.114    5.087  4.920  4.222    11.232  11.233  11.336
7.428  4.629  7.238    6.351  4.189  4.098    13.779  8.818   11.336
6.205  7.107  4.845    3.439  3.172  2.446    9.644   10.280  7.291
4.978  4.844  5.143    3.012  2.622  2.730    7.990   7.466   7.873

Table 4. Mixed scenes

Selection (Cylinder, Ray, Sphere) | Placement (Cylinder, Ray, Sphere) | Total (Cylinder, Ray, Sphere)

4.142   6.021   4.604 | 3.570  3.515  3.050 |  7.712   9.536   7.654
5.849   7.946   5.448 | 4.209  5.309  3.835 | 10.059  13.256   9.283
4.086   5.136   4.824 | 4.709  3.029  2.524 |  8.795   8.164   7.348
4.729   5.164   5.637 | 3.229  3.330  3.831 |  7.958   8.494   9.468
5.353   6.586   5.708 | 7.854  5.873  4.165 | 13.207  12.459   9.873
6.391   6.124   6.476 | 4.277  5.490  3.376 | 10.668  11.614   9.852
5.010   8.551   5.219 | 4.337  6.414  3.433 |  9.347  14.965   8.652
6.594   9.238   8.073 | 3.347  5.105  6.395 |  9.942  14.343  14.468
5.244   8.071   7.617 | 2.635  3.559  2.830 |  7.879  11.630  10.447
5.508   8.444   4.521 | 3.463  4.382  2.767 |  8.971  12.826   7.288
7.131   9.805   7.132 | 8.567  6.911  3.726 | 15.698  16.715  10.858
6.787  10.018   4.485 | 4.605  3.389  3.314 | 11.391  13.406   7.799
7.185  10.177  10.310 | 3.305  3.934  3.446 | 10.489  14.110  13.756
8.993   7.801   4.908 | 2.546  2.646  2.440 | 11.539  10.447   7.348
4.945   7.991   5.960 | 2.929  2.556  2.905 |  7.874  10.547   8.864
6.351   5.789   4.988 | 3.426  2.286  2.877 |  9.777   8.075   7.865
7.289  11.255   7.899 | 5.760  5.156  3.935 | 13.049  16.411  11.834
6.183   9.891   6.692 | 4.789  7.539  3.257 | 10.971  17.430   9.949
5.068   5.616   9.240 | 2.701  3.339  2.480 |  7.768   8.955  11.719
5.299   6.715   6.215 | 4.113  4.329  6.771 |  9.412  11.045  12.986
6.216   7.461   5.716 | 4.649  4.883  3.966 | 10.865  12.344   9.682
9.002   5.765   7.966 | 7.375  4.706  4.282 | 16.376  10.471  12.249
6.206  10.132   5.232 | 3.605  3.290  2.503 |  9.811  13.422   7.734
4.612   5.598   5.964 | 3.448  2.478  2.731 |  8.060   8.077   8.695

Table 5. Occluded scenes


Selection (Cylinder, Ray, Sphere) | Placement (Cylinder, Ray, Sphere) | Total (Cylinder, Ray, Sphere)

4.249  3.066   3.477 | 2.292  2.809  3.346 |  6.541   5.876   6.823
5.033  5.491   5.199 | 4.027  5.497  4.172 |  9.060  10.988   9.372
5.128  3.437   4.196 | 4.108  2.914  2.450 |  9.236   6.352   6.646
5.183  3.237   4.820 | 3.720  2.694  3.882 |  8.903   5.931   8.702
6.154  3.677   5.407 | 6.670  5.198  5.566 | 12.824   8.875  10.973
7.206  4.645   4.482 | 4.439  5.316  3.973 | 11.645   9.960   8.454
4.915  6.280   4.873 | 3.714  5.224  3.965 |  8.629  11.504   8.837
4.380  4.408   6.349 | 3.338  4.867  7.459 |  7.717   9.274  13.808
3.759  4.074   3.898 | 3.287  3.578  3.343 |  7.046   7.652   7.241
4.880  4.692   5.042 | 3.977  4.565  3.504 |  8.857   9.257   8.547
5.593  7.142   6.644 | 6.189  4.756  5.558 | 11.782  11.899  12.202
6.921  5.401   4.991 | 2.733  3.219  2.898 |  9.653   8.621   7.889
7.530  3.930   8.453 | 3.706  3.591  3.450 | 11.236   7.521  11.903
6.603  3.169   6.277 | 2.904  3.071  2.736 |  9.507   6.239   9.013
3.622  3.569   6.112 | 2.520  2.273  2.797 |  6.141   5.842   8.908
6.916  3.839   5.772 | 3.591  2.408  2.508 | 10.506   6.247   8.280
7.671  4.455   6.295 | 5.854  3.895  4.510 | 13.525   8.349  10.805
5.394  4.800   5.555 | 5.575  5.712  4.024 | 10.969  10.512   9.579
5.111  4.026   8.730 | 3.034  3.805  6.853 |  8.145   7.832  15.583
5.751  3.309  10.256 | 4.064  3.971  6.323 |  9.815   7.280  16.579
6.074  5.165   8.512 | 5.524  4.957  4.479 | 11.598  10.122  12.991
5.855  3.493   6.510 | 5.326  3.671  3.914 | 11.181   7.164  10.424
6.204  4.082   4.458 | 3.273  3.055  2.390 |  9.476   7.138   6.848
5.344  4.090   4.322 | 2.576  2.767  2.730 |  7.920   6.856   7.052

Table 6. Non-occluded scenes
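
Tables 4–6 share the same layout: each row gives (presumably for one participant, since there are 24 rows and 24 questionnaire respondents) the selection time, the placement time, and their sum for the cylinder, ray and sphere techniques; the Total columns equal Selection + Placement to within rounding. Purely as an illustration of that layout, and not as part of the experimental software, the following Python sketch shows how per-technique means could be computed; the column names are assumptions made for this example, and the only data used are the first two rows of Table 4.

    # Illustrative sketch only: aggregate per-technique means from rows laid out
    # as in Tables 4-6 (Selection, Placement, Total; each split into Cylinder, Ray, Sphere).
    # The column names below are assumptions made for this example.
    from statistics import mean

    COLUMNS = ["sel_cyl", "sel_ray", "sel_sph",
               "pla_cyl", "pla_ray", "pla_sph",
               "tot_cyl", "tot_ray", "tot_sph"]

    rows = [
        [4.195, 4.543, 4.040, 2.956, 3.162, 3.198, 7.151, 7.706, 7.238],
        [5.424, 6.719, 5.324, 4.135, 5.403, 4.004, 9.559, 12.122, 9.327],
    ]

    # Column-wise means, e.g. the mean selection time with the ray.
    means = {name: mean(row[i] for row in rows) for i, name in enumerate(COLUMNS)}
    print(means["sel_ray"])

    # Check the relation implied by the layout: Total = Selection + Placement.
    for row in rows:
        for k in range(3):
            assert abs(row[k] + row[k + 3] - row[k + 6]) < 0.01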


Appendix F – Results from Questionnaire

ID  Age  Nationality | Perform quicker (C R S) | Easy to use (C R S) | Select quicker (C R S) | Enjoyable (C R S) | Menu is easy to use

 1  40  Romanian  | 2 4 6 | 3 5 6 | 3 5 6 | 3 5 6 | 6
 2  37  Belgian   | 6 2 2 | 5 2 2 | 5 2 2 | 5 5 5 | 5
 3  50  French    | 2 4 4 | 3 3 5 | 3 4 5 | 3 3 6 | 5
 4  54  Irish     | 6 2 3 | 6 5 5 | 6 2 4 | 6 6 6 | 5
 5  54  English   | 2 2 5 | 5 2 5 | 2 2 5 | 5 5 5 | 5
 6  52  French    | 6 3 3 | 5 4 4 | 6 3 4 | 5 4 4 | 6
 7  48  Irish     | 3 3 5 | 5 4 5 | 3 3 5 | 5 5 5 | 6
 8  61  German    | 5 2 2 | 5 5 2 | 5 5 2 | 5 5 5 | 4
 9  46  English   | 5 5 2 | 6 6 4 | 5 6 3 | 6 6 6 | 6
10  53  Irish     | 2 2 6 | 2 4 6 | 1 1 6 | 4 4 6 | 5
11  56  Dutch     | 3 5 4 | 5 5 5 | 2 4 5 | 5 5 5 | 5
12  51  English   | 5 2 3 | 5 3 4 | 4 2 3 | 4 4 4 | 5
13  25  Slovak    | 6 5 1 | 5 5 1 | 5 5 1 | 5 5 2 | 3
14  44  French    | 5 3 4 | 4 4 4 | 5 3 4 | 5 3 4 | 4
15  26  Moroccan  | 5 2 2 | 4 3 2 | 6 2 2 | 5 5 4 | 5
16  25  French    | 3 5 3 | 5 5 4 | 5 5 3 | 5 5 5 | 6
17  29  German    | 1 1 5 | 2 3 5 | 1 2 5 | 2 4 5 | 5
18  22  French    | 4 4 3 | 4 5 5 | 4 6 5 | 6 2 5 | 5
19  26  French    | 4 4 3 | 5 2 2 | 5 3 3 | 4 3 4 | 5
20  29  Swedish   | 2 6 2 | 3 4 3 | 2 6 2 | 4 5 5 | 3
21  20  French    | 2 3 6 | 5 4 6 | 1 3 6 | 6 6 6 | 6
22  24  English   | 3 2 5 | 5 5 5 | 3 4 6 | 6 6 6 | 6
23  26  Austrian  | 2 5 4 | 3 5 5 | 3 5 5 | 3 6 5 | 5
24  25  Turkish   | 3 5 3 | 4 4 3 | 3 5 3 | 5 5 5 | 5

(C = cylinder, R = ray, S = sphere)

Table 7. Results from the questionnaire
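
In Table 7 each statement was rated separately for the cylinder (C), ray (R) and sphere (S); the recorded ratings range from 1 to 6. As an illustration only — the dictionary name and layout below are assumptions made for this example, not part of the thesis software — the following Python sketch derives a per-technique median from the "perform quicker" ratings of the first four subjects.

    # Illustrative sketch only: per-technique medians for one questionnaire item.
    # The ratings are those recorded in Table 7 ("perform quicker", subjects 1-4).
    from statistics import median

    perform_quicker = {      # subject ID: (cylinder, ray, sphere)
        1: (2, 4, 6),
        2: (6, 2, 2),
        3: (2, 4, 4),
        4: (6, 2, 3),
    }

    for idx, technique in enumerate(("cylinder", "ray", "sphere")):
        ratings = [triple[idx] for triple in perform_quicker.values()]
        print(technique, median(ratings))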


Appendix G – Responses To The Questionnaire

Each response begins with the ID number of the subject, followed by the rating that subject gave to the statement and, where provided, the subject's written comment.

1. I performed quicker with the ray (than with the cylinder and the sphere)

01:4

07:3

10:2

06:3 – Difficult when a ball is behind another one in perspective.

12:2 – Required more accuracy; hand-shake became important issue

09:5 – I liked the ray best for selecting individual objects and moving them.

04:2 – Sometimes difficult to find the first spot on P01

08:2

02:2

11:5

03:4 – It was quicker with isolated points but, I think, more difficult when the point to be selected was among a cluster of other points.

05:2

17:1

20:6

15:2 – In clustering configurations, selecting the wanted object needs an accurate position of the hand.

14:3

19:4

13:5

18:4

24:5 – It was more difficult if two balls were overlapped

21:3 – It becomes hard when spheres are close.

16:5 – I think the ray is the best one; however, a selection mechanism should be added for when there are several objects close together at the same time.

23:5 – But it depends if (P01) is in the background or not. If one object is in front of (P01), the sphere can be used easily for the selection of (P01).

22:2 – It was hard to select an object using the ray when there were many together.


2. The test with the transparent sphere was enjoyable.

01:6

07:5

10:6

06:4 – Sometimes difficult to select, but the menu simplifies the selection.

12:4 – Seemed to have more control due to the depth effect.

09:6

04:6

08:5

02:5

11:5

03:6 – That’s the one I preferred.

05:5

17:5

20:5

15:4 – If you like games with expert level rather than beginner’s, you’ll enjoy it.

14:4

19:4

13:2

18:5

24:5

21:6

16:5

23:5

22:6 – It was enjoyable, but it would be easier standing so that you can move about more when reaching a far away object.


3. The ray is easy to use

01:5

07:4

10:4

06:4 – No need to be nervous.

12:3 – Required accuracy with angle

09:6

04:5

08:5

02:2

11:5

03:3 – OK for isolated balls; not when they are close to each other.

05:2

17:3

20:4 – It seemed easier to “hook” balls with the ray when they were “single”. However, when the desired object was overlapped by other objects, it was difficult to select only the desired one.

15:3 – It’s not difficult, but it needs some accuracy.

14:4

19:2

13:5

18:5

24:4

21:4

16:5

23:5 – Except for the comment (Question 1)

22:5 – It is simple to use, there is a good sense of depth. Except that it is hard to reach an object when they are clustered together.


4. I performed quicker with the transparent sphere (than with the cylinder and the ray)

01:5

07:5

10:6

06:3 – Quicker compared with the ray but not with the cylinder.

12:3 – Difficult to select when only the target is illuminated, i.e. no menu is available.

09:2

04:3 – Quicker than ray, slower than cylinder

08:2

02:2

11:4

03:4 – I preferred the sphere but I am not quite sure that I performed quicker.

05:5

17:5

20:2

15:2 – In clustering configurations, selecting the wanted object requires moving the sphere (in depth).

14:4

19:3

13:1

18:3

24:3

21:6

16:3

23:4 – It was quicker with the ray.

22:5 – Easy to select objects


5. The test with the ray was enjoyable.

01:5

07:5

10:4

06:4

12:4

09:6

04:6

08:5

02:5

11:5

03:3 – Not so easy to handle in front of points.

05:5

17:4

20:5

15:5 – You feel like making barbecue!

14:3

19:3

13:5

18:2

24:5

21:6

16:5

23:6

22:6


6. The transparent sphere is easy to use.

01:6

07:5

10:6

06:4

12:4

09:4

04:5

08:2

02:2

11:5

03:5

05:5

17:5

20:3 – Perhaps it is more difficult because the acquisition area is large → often several balls are selected → the user has to use the menu (more time consuming than if an individual object can be hooked from the start).

15:2 – It’s not difficult, but compared to the other methods, it needs some effort.

14:4

19:2

13:1

18:5

24:3

21:6

16:4

23:5

22:5 – The sphere was easy to use, but it was harder when there was only one object in the sphere as I was unsure how long to hold the button for, so it seemed quicker to select several objects and select from the list.


7. I performed quicker with the transparent cylinder (than with the ray and the sphere)

01:2

07:3

10:2

06:6 – It is quicker to select the second ball in perspective (distance from me)

12:5 – Seemed less reactive to shake. Allowed access to the drop-down menu, which appeared quicker.

09:5 – Between the sphere and cylinder, I preferred the cylinder. I felt it gave me a reference for spatial depth.

04:6

08:5

02:6

11:3

03:2 – Might be because it was the first series but I did not really enjoy it.

05:2

17:1

20:2

15:5 – Even in clustering configurations, selecting the wanted object requires only reading the menu. Thanks to the orientation of the cylinder.

14:5

19:4

13:6

18:4

24:3

21:2 – I almost had several spheres for each selection → lose time.

16:3 – The problem is that the list for choosing the aircraft can turn out to be rather long.

23:2

22:3


8. The floating menu is easy to see.

01:6

07:6

10:5

06:6

12:5

09:6 – Very useful when multiple objects are adjacent or superimposed

04:5 – Sometimes … if there is a cluster…. P01

08:4

02:5

11:5

03:5 – Yes, it is clear.

05:5

17:5 – Use the same order all the time, e.g. increasing number order.

20:3

15:5 – Despite my myopia, I’ve not found a problem while choosing the objects.

14:4

19:5

13:3

18:5

24:5

21:6

16:6

23:5

22:6 – I think it is very good, quick, easy and simple to use.


9. I selected objects quicker with the transparent cylinder (than with the ray and the sphere)

01:3 – Cylinder is too big, you will have lots of objects selected.

07:3

10:1

06:6

12:4

09:5 – Especially when there were multiple objects.

04:6

08:5

02:5

11:2

03:3

05:2 – The cylinder appears to have greater depth and thus more arm movement is needed for capture.

17:1

20:2 – Extra workload caused by changing focus from the balls to the menu (which is always at the front of the screen). Sometimes the menu is covered by other balls. Also, translating the ball symbol to the text in the menu causes some extra mental workload.

15:6 – Easy for the selection task.

14:5

19:5

13:5

18:4

24:3

21:1

16:6 – That seems logical to me, since the selection volume is larger than that of the sphere.

23:3

22:3 – I think using the sphere was easier but the ray was also quite easy to.


10. I selected objects quicker with the ray (than with the cylinder and the sphere)

01:5

07:3

10:1

06:3 – Not if two or more objects are close in perspective; you need to orient the ray so as not to select the close ones.

12:2

09:6 – Only when there were not multiple objects highlighted.

04:2

08:5

02:2

11:4

03:4 – For isolated points I think so.

05:2

17:2

20:6

15:2

14:3

19:3

13:5

18:6

24:5

21:3 – It is quicker as long as spheres are not close and as two spheres are not the same.

16:5

23:5 – Actually it depends on the configuration of 3D objects and it depends on the view: for objects in the background, the sphere is easier to use than the ray.

22:4 – It was easy to select objects using the ray when it was by itself, but took longer when there were other objects nearby.


11. The test with the transparent cylinder was enjoyable.

01:3

07:5

10:4

06:5

12:4

09:6

04:6

08:5

02:5

11:5

03:3

05:5

17:2

20:4

15:5 – It’s useless to hide, criminal!

14:5

19:4

13:5

18:6

24:5

21:6

16:5

23:3

22:6


12. I selected objects quicker with the transparent sphere (than with the ray and the cylinder)

01:6

07:5

10:6

06:4

12:3

09:3

04:4 – Quicker than ray, slower than cylinder

08:2

02:2

11:5

03:5

05:5

17:5

20:2

15:2

14:4

19:3

13:1

18:5

24:3

21:6

16:3

23:5

22:6 – As it was a sphere, it remained a more consistent shape than the cylinder, so it felt easier to use.


13. The transparent cylinder is easy to use.

01:3

07:5

10:2

06:5 – I use binoculars, though not for this distance to the screen, so it took me a certain time to get used to it the first time with the sphere, mainly in the perspective (distance from me).

12:5

09:6

04:6 – 3D scenes kept slipping when having to stretch forward.

08:5

02:5 – For me, it is obvious that the cylinder was the best method to use.

11:5

03:3 – The cylinder shape does not seem “natural” for this kind of thing. I think the sphere should be more appropriate.

05:5 – I had more occasions to press button several times using the cylinder compared with the sphere.

17:2

20:3

15:

14:

19:5

13:5

18:4

24:4

21:5

16:5

23:3

22:5 – It was easy to use. The drop-down menu is good. But as it is a long shape, it was harder to imagine where the sides of the cylinder were and what was included in it.


14. Other comments

01:

07: As I became more used to the ray and the cylinder, I tried to position P01 onto the ray to make it easier to connect with the second target. Depending on the distance the target was from me, I would try to position P01 close to me if the target was further away than the original P01 position. Hands (wand) close to the chest to select, to allow me to extend my arms to connect with the target.

10:

06:

12:

09:I would like to use the ray with an option to highlight objects that are spatially behind others.

04:Good fun!

08:

02:

11:

03:The ray is OK for isolated points. Otherwise, the sphere seems to be the best compromise for selecting objects, either isolated or in a cluster (with the list in that case).

05:

17:The 3D effect was very weak; use of a texture perhaps increases it.

-Matching the balls with the center and not with the sphere is unenjoyable.

- Matching menu – similar to selection menu.

20:Perhaps the ray in combination with a selection mechanism when several balls interfere could be a good idea.

15:As far as performance is concerned, I think the method has no influence on reaching the target.

14:The ray makes it possible to use head-driven movement of the scene, whereas the cylinder is well suited to a seated position without head-driven movement of the scene.

19:

13:I enjoyed the transparent cylinder the most; it's easier to use and gives a better perception of space and distances.

18:

24:The most difficult part of the test was to understand whether the selected object was near to me or far from me.

21:

16:The advantage of the sphere over the cylinder is that fewer aircraft are selected, but it requires moving forward/backward for the selection. The problem with the cylinder is that it can cancel out the benefit of 3D when all the aircraft are concentrated (one might as well have the list of all the aircraft from the start). It would be interesting to test picking the aircraft without prior selection, directly from the full list.

23:Nice work!

22:Would be good to stand up as I felt restricted in the forward and backward movement that I had and I spent time thinking about the depth when I selected each object.


REFERENCES

[Arns 2002] Arns, L. L. (2002). A new taxonomy for locomotion in virtual environments, Iowa State University. PhD Thesis.

[Azuma 1996] Azuma, R., Daily, M., Krozel, J. (1996). Advanced Human-Computer Interfaces for Air Traffic Management and Simulation. Proceedings of the 1996 AIAA Flight Simulation Technologies Conference, San Diego, CA.

[Azuma 2000] Azuma, R., Neely III, H., Daily, M., Geiss, R. (2000). "Visualization Tools for Free Flight Air-Traffic Management." IEEE Computer Graphics and Applications 20(5): 32-36.

[BARCO 2005] BARCO (2005). Accessed 2005-03-15. http://www.barco.com/

[Bolt 1980] Bolt, R. (1980). Put-that-there. In Proc. of SIGGRAPH 1980, ACM Press.

[Bowman 1999a] Bowman, D. (1999). Interaction Techniques for Common Tasks in Immersive Virtual Environments: Design, Evaluation and Application, Georgia Institute of Technology. PhD Thesis.

[Bowman 1997a] Bowman, D., & Hodges, L. (1997). An Evaluation of Techniques for Grabbing and Manipulating Remote Objects in Immersive Virtual Environments. Proceedings of the ACM Symposium on Interactive 3D Graphics.

[Bowman 1997b] Bowman, D., Koller, D., & Hodges, L. (1997). Travel in Immersive Virtual Environments: An Evaluation of Viewpoint Motion Control Techniques. Proceedings of the Virtual Reality Annual International Symposium.

[Bowman 2000] Bowman, D., Kruijff, E., LaViola, J., and Poupyrev, I. (2000). The Art and Science of 3D Interaction. IEEE Virtual Reality 2000, New Brunswick, New Jersey. Accessed 22 March 2005. http://www.mic.atr.co.jp/~poup/3dui/TUT3DUI/

[Bowman 2004] Bowman, D., Kruijff, E., LaViola, J., Mine, M., and Poupyrev, I. (2004). 3D User Interfaces: Theory and Practice, Addison Wesley.

[Bowman 2003] Bowman, D., North, C., Chen, J., Polys, N., Pyla, P., Yilmaz, U. (2003). Information-Rich Virtual Environments: Theory, Tools, and Research Agenda. Proceedings of ACM Virtual Reality Software and Technology, Osaka, Japan.

[Bowman 1998b] Bowman, D. A., Hodges, L. F., Bolter, J. (1998). "The Virtual Venue: User-Computer Interaction in Information-Rich Virtual Environments." Presence 7(5): 478-493.

[Bowman 1998a] Bowman, D. A., Koller, D., Hodges, L. F. (1998). "A Methodology for the Evaluation of Travel Techniques for Immersive Virtual Environments." Virtual Reality: Research, Development, and Applications 3(2): 120-131.

[Brown 1994a] Brown, M. (1994). Display for Air Traffic Control: 2D, 3D and VR. A preliminary investigation, University of London.

[Brown 1994b] Brown, M. (1994). On the evaluation of 3D Display Technologies for Air Traffic Control, University of London.

[Burnett 1991] Burnett, M. S., Barfield, W. (1991). Perspective versus plan view air traffic control (ATC) displays: survey and empirical results. Proceedings of the Sixth International Symposium on Aviation Psychology, Columbus, Ohio State University.

[Chance 1998] Chance, S. S., Gaunet, F., Beall, A., Loomis, J. M. (1998). "Locomotion Mode Affects the Updating of Objects Encountered During Travel: The Contribution of Vestibular and Proprioceptive Inputs to Path Integration." Presence: Teleoperators & Virtual Environments 7(2): 168-178.

[Cruz-Neira 1993] Cruz-Neira, C., Sandin, D. J., DeFanti, T. A. (1993). Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE. SIGGRAPH 1993 Proceedings.

[Dang 2005a] Dang, N. T. (2005). Selection-By-Volume Approach: Using Geometric Shape and 2D Menu System for 3D Object Selection. IEEE VR 2005, Bonn, Germany.

[Dang 2005b] Dang, N. T. (2005). Transparent Sphere: a new three-dimensional interaction technique. HCI International 2005, Las Vegas, USA.

[Dang 2003c] Dang, N. T., Le, H. H., Tavanti, M., Duong, V. (2003). A Multidisciplinary Framework for Empirical Analysis of the Applicability of 3D Stereoscopic in Air Traffic Control. Proceedings of the 2003 IEEE International Symposium on Computational Intelligence in Robotics and Automation, Kobe, Japan.

[Dang 2003a] Dang, N. T., Le, H. H., Tavanti, M. (2003). Empirical Analysis of the Applicability of 3D Stereoscopic in Air Traffic Control. In Proceedings of the 6th IEEE International Conference on Intelligent Transportation Systems, Shanghai, China.

[Dang 2003d] Dang, N. T., Le, H. H., Tavanti, M. (2003). Visualization and Interaction on Flight Trajectory in a 3D Stereoscopic Environment. Proceedings of the AIAA/IEEE 22nd Digital Avionics Systems Conference, Indianapolis, Indiana.

[Darken 1999] Darken, R., Cevik, H. (1999). Map Usage in Virtual Environments: Orientation Issues. Proceedings of Virtual Reality '99.

[Darken 1997] Darken, R., Cockayne, W., Carmein, D. (1997). The Omni-Directional Treadmill: A Locomotion Device for Virtual Worlds. UIST '97.

[Darken 2001] Darken, R., Peterson, B. (2001). Spatial Orientation, Wayfinding and Representation. Handbook of Virtual Environment Technology, K. Stanney (ed.). Mahwah, NJ, Lawrence Erlbaum Associates.

[Darken 1996a] Darken, R., Sibert, J. (1996). "Navigating in Large Virtual World." The International Journal of Human-Computer Interaction 8(1): 49-72.

[Darken 1996b] Darken, R., Sibert, J. (1996). Wayfinding Strategies and Behaviors in Large Virtual Worlds. Proceedings of CHI '96.

[Dorighi 1994] Dorighi, N., Ellis, S. R., Grunwald, A., Stassart, P. (1994). Advanced Display and Manipulative Interface for Air Traffic Management. http://humanfactors.arc.nasa.gov/ihh/spatial/research/adsp_air_traffic_management.html

[DVRL 2005] DVRL (2005). Data Visualization Research Lab, The Center for Coastal and Ocean Mapping at the University of New Hampshire. Accessed 22 March 2005. www.ccom.unh.edu/vislab/HCI/Powerpoints/Navigation.ppt

[Ellis 1987] Ellis, S. R., McGreevy, M. W. (1987). "Perspective Traffic Display Format and Airline Pilot Traffic Avoidance." Human Factors 29(2): 371-382.

[Esposito 1996] Esposito, C. (1996). User Interfaces for Virtual Reality Systems. Human Factors in Computing Systems, CHI '96 Conference Tutorial Notes.

[Fakespace 2005] Fakespace (2005). Fakespace Labs, Mountain View, California. Accessed 2005-03-15. http://www.fakespacelabs.com

[Forsberg 1996] Forsberg, A., Herndon, K., Zeleznik, R. (1996). Aperture based selection for immersive virtual environments. Proceedings of the ACM Symposium on User Interface Software and Technology.

[Gabbard 1997] Gabbard, J. L. (1997). A Taxonomy of Usability Characteristics for Virtual Environments. Department of Computer Science, Virginia Tech. MSc Thesis.

[Hix 1993] Hix, D., Hartson, H. (1993). Developing User Interfaces: Ensuring Usability through Product & Process, John Wiley and Sons.

[Immersion 2005] Immersion (2005). San Jose, California. Accessed 2005-03-15. http://www.immersion.com/

[INO 2003] INO (2003). Activity Report 2003. Bretigny sur Orge, EUROCONTROL Experimental Centre.

[Intersense 2005] Intersense (2005). Accessed 2005-03-15. http://www.isense.com/

[Iwata 1999] Iwata, H. (1999). Walking about Virtual Environments on an Infinite Floor. Proceedings of IEEE VR '99.

[Jacoby 1992] Jacoby, R., Ellis, S. (1992). Using Virtual Menus in a Virtual Environment. Proceedings of Visual Data Interpretation, 1668.

[Kim 2000] Kim, N., Kim, G. J., Park, C-M., Lee, I., Lim, S. H. (2000). Multimodal Menu Presentation and Selection in Immersive Virtual Environments. VR, Pohang University of Science and Technology, Korea: 281.

[Kruger 1995] Kruger, W., Bohn, C. A., Frohlich, B., Schuth, H., Strauss, W., and Wesche, G. (1995). "The Responsive Workbench: A Virtual Work Environment." IEEE Computer 28(7): 42-48.

[Lange 2003] Lange, M., Hjalmarsson, J., Cooper, M., Ynnerman, A., Duong, V. (2003). 3D Visualization and 3D Voice Interaction in Air Traffic Management. SIGGRAD.

[LaViola 2000] LaViola, J. (2000). "A Discussion of Cybersickness in Virtual Environments." SIGCHI Bulletin 32(1).

[LaViola 1999] LaViola, J., Zeleznik, R. (1999). Flex and Pinch: A Case Study of Whole-Hand Input Design for Virtual Environment Interaction. Proceedings of the International Conference on Computer Graphics and Imaging '99.

[Le 2005] Le, H. H. (2005). 3D Visualization in ATC: Design Principles for Controller's Working Position. Ecole Pratique des Hautes Etudes, Paris. PhD Thesis.

[Liang 1994] Liang, J., Green, M. (1994). "JDCAD: A highly interactive 3D modeling system." Computers & Graphics 18(4): 499-506.

[MOVES 2005] MOVES (2005). The MOVES Institute, Naval Postgraduate School. Accessed 22 March 2005. http://www.movesinstitute.org/

[Nielsen 1990] Nielsen, J., Molich, R. (1990). Heuristic evaluation of user interfaces. Proc. ACM CHI '90 Conf., Seattle, WA.

[Olwal 2003] Olwal, A., Feiner, S. (2003). The Flexible Pointer: An Interaction Technique for Selection in Augmented and Virtual Reality. ACM Symposium on User Interface Software and Technology (UIST '03), Vancouver, BC.

[Pausch 1995] Pausch, R., Burnette, T., Brockway, D., & Weiblen, M. (1995). Navigation and Locomotion in Virtual Worlds via Flight into Hand-held Miniatures. Proceedings of the 22nd Annual Conference on Computer Graphics and Interactive Techniques.

[Persiani 2000] Persiani, F., Liverani, A. (2000). Semi-immersive synthetic environment for cooperative air traffic control. In Proceedings of the 22nd International Congress of Aeronautical Sciences, Harrogate, UK.

[Pierce 1999a] Pierce, J., Stearns, B., Pausch, R. (1999). Two Handed Manipulation of Voodoo Dolls in Virtual Environments. Proceedings of the Symposium on Interactive 3D Graphics.

[Pierce 1999b] Pierce, J. S., Stearns, B. C., Pausch, R. (1999). Voodoo Dolls: Seamless Interaction at Multiple Scales in Virtual Environments. Proceedings of the 1999 Symposium on Interactive 3D Graphics.

[Pol 1999] Pol, R. V. D., Ribarsky, W., Hodges, L., Post, F. (1999). Interaction in Semi-Immersive Large Display Environments. Virtual Environments '99, Proceedings of the Eurographics Workshop, Wien.

[Polson 1992] Polson, P. G., Lewis, C., Rieman, J., Wharton, C. (1992). "Cognitive walkthroughs: A method for theory-based evaluation of user interfaces." International Journal of Man-Machine Studies 36: 741-773.

[Poupyrev 1996] Poupyrev, I., Billinghurst, M., Weghorst, S., & Ichikawa, T. (1996). Go-Go Interaction Technique: Non-Linear Mapping for Direct Manipulation in VR. Proceedings of the ACM Symposium on User Interface Software and Technology.

[Poupyrev 1998] Poupyrev, I., Weghorst, S., Billinghurst, M., Ichikawa, T. (1998). "Egocentric Object Manipulation in Virtual Environments: Empirical Evaluation of Interaction Techniques." Computer Graphics Forum 17(3): 41-52.

[Preece 2002] Preece, J., Rogers, Y., and Sharp, H. (2002). Interaction Design: Beyond Human-Computer Interaction, John Wiley.

[Rieman 1995] Rieman, J., Franzke, M., Redmiles, D. (1995). Usability evaluation with the cognitive walkthrough. Proceedings of CHI '95.

[Rolland 2001] Rolland, J. P., Davis, L., Baillot, Y. (2001). A Survey of Tracking Technology for Virtual Environments. Augmented Reality and Wearable Computers, Barfield & Caudell (eds.). NJ.

[Shneiderman 1997] Shneiderman, B. (1997). Designing the User Interface: Strategies for Effective Human-Computer Interaction, Addison-Wesley.

[Slater 1995] Slater, M., Usoh, M., Steed, A. (1995). "Taking Steps: The Influence of a Walking Metaphor on Presence in Virtual Reality." ACM Transactions on Computer Human Interaction 2(3): 201-209.

[Stanney 1998] Stanney, K. M., Mourant, R. R. and Kennedy, R. S. (1998). "Human Factors Issues in Virtual Environments: A Review of the Literature." Presence: Teleoperators and Virtual Environments 7(4).

[Stereographics 2005] Stereographics (2005). Accessed 2005-03-15. http://www.stereographics.com/

[Stoakley 1995] Stoakley, R., Conway, M., & Pausch, R. (1995). Virtual reality on a WIM: interactive worlds in miniature. Proceedings of CHI '95: the 1995 Conference on Human Factors in Computing Systems.

[Tavanti 2004b] Tavanti, M. (2004). On the Relative Utility of 3D Interfaces, Uppsala University. PhD Thesis.

[Tavanti 2004c] Tavanti, M., Dang, N. T., Le, H. H. (2004). Usability Inspection of a 3D Interaction Metaphor. In Proceedings of the 2nd International Conference RIVF'04 Research Informatics Vietnam-Francophony, Hanoi, Vietnam.

[Tavanti 2003] Tavanti, M., Le, H. H., Dang, N. T. (2003). Three-dimensional Stereoscopic visualization for Air Traffic Control Interfaces: a preliminary study. Proceedings of the AIAA/IEEE 22nd Digital Avionics Systems Conference, Indianapolis, Indiana.

[Tavanti 2004a] Tavanti, M., Le, H. H., Dang, N. T. (2004). "3D for ATC displays: ask controllers, they know better." Submitted to the AIAA Journal of Aerospace Computing, Information, and Communication.

[Tham 1993] Tham, M., Wickens, C. D. (1993). Evaluation of perspective and stereoscopic displays as alternative to plan view displays in air traffic control. Savoy, IL, Aviation Research Laboratory.

[Usoh 1999] Usoh, M., Arthur, K., Whitton, M. C., Bastos, R., Steed, A., Slater, M., Brooks, F. P., Jr. (1999). Walking > Walking-in-Place > Flying, in Virtual Environments. SIGGRAPH 1999 Proceedings.

[Ware 1993] Ware, C., Arthur, K., Booth, K. S. (1993). Fish Tank Virtual Reality. INTERCHI '93 Proceedings.

[Ware 1989] Ware, C., Jessome, D. R. (1989). Using the Bat: A Six-Dimensional Mouse for Object Placement. Proceedings of Graphics Interface '88, Edmonton, Alberta, Canada.

[Ware 1990] Ware, C., Osborne, S. (1990). Exploration and Virtual Camera Control in Virtual Three Dimensional Environments. Proceedings of the 1990 Symposium on Interactive 3D Graphics, Utah, United States.

[Wickens 1989] Wickens, C. D., Todd, S., Seidler, K. (1989). Three-Dimensional Displays: Perception, Implementation, and Applications, University of Illinois at Urbana-Champaign Aviation Research Laboratory.

[Wloka 1995] Wloka, M., Anderson, B. (1995). Resolving Occlusion in Augmented Reality. Proceedings of the 1995 Symposium on Interactive 3D Graphics.

[Zeltzer 1997] Zeltzer, D., Drucker, S. (1997). A Virtual Environment System for Mission Planning. Proceedings of the IMAGE VI Conference, Scottsdale, AZ.

[Zhai 1994] Zhai, S., Buxton, W., Milgram, P. (1994). The "Silk Cursor": Investigating Transparency for 3D Target Acquisition. CHI 1994 Proceedings.