
Interactive Virtual Building Walkthrough Using Oculus Rift and Microsoft Kinect

Will Woodard and Somsak Sukittanon University of Tennessee at Martin

Department of Engineering Martin, TN USA

[email protected], [email protected]

Abstract— Virtual walkthroughs of buildings and venues have been in use for as long as three-dimensional modeling has existed. They portray designs in a more personal way than standard two-dimensional drawings, so those with less technical drawing experience can envision what a future project’s final design might look like. Most virtual walkthroughs are rendered directly as a movie, so they incorporate very little user input and are limited to the route the creator wishes the end user to experience. This paper presents a fully immersive experience for the user in the virtual space, using an Oculus Rift headset for head tracking and 3D viewing. The hardware also includes Microsoft Kinect integration, for realistic environment interaction through limb tracking. For 3D modeling software, Rhinoceros 3D was chosen for its ability to texture map complex objects and its long list of compatible file types. The engine chosen is Unity3D, for its ability to attach C# scripts to objects for user interaction and its Kinect and Oculus Rift compatibility. Hardware and software with broad compatibility were chosen deliberately, to minimize rework if a sudden change of software or hardware becomes necessary, so that existing work can integrate seamlessly into the new platform. The subject of this prototype walkthrough is the Johnson Engineering and Physical Sciences Building at UT Martin, which will in time expand to a full campus walkthrough.

I. INTRODUCTION

For many years people have tried to come up with a way for the world to have experiences in virtual reality. The motivations differ, and the applications span many fields [1-4], both recreational and commercial. Examples include, but are not limited to, mental therapies [5-6] (PTSD, phobias), control of robots and rovers in space [7], and training in the medical field [3]. NASA has been working to use virtual reality so that people can experience walking and exploring space, specifically Mars [7]. Another field adopting virtual reality is architecture. Jon Brouchoud [8] has developed a way for people to take 3-D walkthroughs of buildings, in which customers stand in the unfinished building and see what it will look like after construction. Other systems let architects verify, through the headset, that a building’s materials and construction match the design before work begins. This paper presents an interactive virtual walkthrough of buildings and venues using virtual reality glasses. The scope of this work includes the modeling and texture mapping of a virtual university campus with commercial modeling software, as well as integrating this model into a 3-D engine capable of rendering a realistic environment for the end user. The virtual walkthrough is designed with immersion and user experience in mind. To be as immersive as possible, the project has to provide 1) three-dimensional viewing, 2) full user-navigation integration, and 3) an interactive environment. Keeping these three pillars in mind throughout the design process allows immersion to be part of the final design.

II. SYSTEM DESIGN

The process for creating the virtual world is as follows: (A) Modeling and Texturing the Building, (B) Environment Creation, (C) Interactive Scripting, and (D) Hardware Integration. To model and texture map the building, the Rhinoceros modeling software was chosen, though any modeling software will suffice. To create the environment, Unity3D was chosen. This piece of the project involves terrain creation, player character creation, and interactive object creation that is used further into development. The interactive scripting is done in C# and JavaScript, integrated with Unity3D and linked to each object the scripting involves. The hardware incorporation uses the SDKs included with the Oculus Rift Developer Kit 2 and the Microsoft Kinect.

A. Modeling and Texturing the Building

The modeling process is one of the most time-consuming parts of the design process, so it is of utmost importance to start it correctly and reduce future rework. That said, it is also one of the most rewarding parts. To model a building, one must first have its floor plans. Rhinoceros was chosen for its ability to import PDF files as lines, which streamlines the process when digital copies of the floor plans are available. Once the plans are imported onto the plane and scaled accordingly, one creates surfaces from the outlines of walls, beams, and columns for each floor of the building. After each surface has been created, it is a simple task to select all surfaces and extrude each into a solid, vertically, by the wall height of that floor. Repeating this for each applicable floor yields an accurate representation of the building. Adding stairs, floors, and ceilings is then a trivial matter; doors are covered in the interactive scripting section of this paper.
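The outline-to-solid step above can be sketched with RhinoCommon, Rhino’s .NET SDK. This is only an illustration of the idea, not the authors’ exact workflow; the command name and the wall height are example values.

```csharp
using Rhino;
using Rhino.Commands;
using Rhino.Geometry;

// Sketch: extrude every closed planar wall outline in the document into a
// capped solid representing one floor's walls. Wall height is an example.
public class ExtrudeWallOutlinesCommand : Command
{
    public override string EnglishName => "ExtrudeWallOutlines";

    protected override Result RunCommand(RhinoDoc doc, RunMode mode)
    {
        const double wallHeight = 3.0; // one storey, in model units (assumed)

        foreach (var obj in doc.Objects)
        {
            var curve = obj.Geometry as Curve;
            if (curve == null || !curve.IsClosed || !curve.IsPlanar())
                continue; // only closed planar outlines become walls

            // Extrusion.Create builds a capped solid from a planar profile.
            var solid = Extrusion.Create(curve, wallHeight, cap: true);
            if (solid != null)
                doc.Objects.AddExtrusion(solid);
        }

        doc.Views.Redraw();
        return Result.Success;
    }
}
```

Running this once per floor, on that floor’s imported outlines, mirrors the select-and-extrude step described above.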



Fig. 1. (a) Solid model of the Johnson EPS Building; note the object parenting in the right panel used to organize multiple floors. (b) Textures created for the walls (left) and floors (right) of the building. (c) A terrain, the building file, and a first-person character imported into Unity.

As shown in Figure 1a, once a solid representation of the building has been created, a texture must be applied to the surface of each object. Most buildings are very rectilinear, so the box-mapping feature in Rhinoceros was judged the most effective way to wrap textures onto the solid objects. The textures are derived from photographs of the actual building, digitally edited so that no seams are visible where the images tile. To texture map the building, it is important to group objects by the material of their real-world counterparts. For example, selecting all the wall objects and giving the selection a box mapping of the left image in Figure 1b creates realistic brick tiling on only the selected objects. This process can be repeated for floors, ceilings, doors, windows, and any other objects in the building of choice.
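In RhinoCommon this kind of box mapping can be sketched roughly as below. The tile size is an example value, and the SetTextureMapping call is assumed from newer RhinoCommon versions, so treat this as an illustration of the grouping-plus-box-mapping idea rather than the authors’ exact steps.

```csharp
using Rhino;
using Rhino.DocObjects;
using Rhino.Geometry;

// Sketch: apply one shared box mapping to every selected wall object so
// the brick texture tiles consistently across all walls. Tile size is an
// example; SetTextureMapping is assumed from newer RhinoCommon.
public static class WallBoxMapping
{
    public static void Apply(RhinoDoc doc, double tileSize = 1.0)
    {
        // A single mapping box shared by all walls keeps brick courses aligned.
        var mapping = TextureMapping.CreateBoxMapping(
            Plane.WorldXY,
            new Interval(0, tileSize),  // x extent of one texture tile
            new Interval(0, tileSize),  // y extent
            new Interval(0, tileSize),  // z extent
            capped: true);

        foreach (RhinoObject obj in doc.Objects.GetSelectedObjects(false, false))
            obj.SetTextureMapping(1, mapping); // mapping channel 1 (assumed)
    }
}
```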

B. Environment Creation

The environment creation process is done primarily in Unity3D, a game engine used primarily for video game creation, so a tour within Unity is best approached like building a game. As shown in Figure 1c, first there must be a terrain for the building to rest on; otherwise, exiting the building would lead to an infinite fall. The next step is to import the model from Rhinoceros into Unity, which is easily done by saving the Rhino project as a .fbx file with textures included in the save file. Once saved, the file can be added to the assets of the Unity project and placed in the scene, making sure to activate the “Import Mesh Colliders” box in the asset settings so that the player cannot walk through the walls. The texture files must also be present as assets in the same directory as the model file; otherwise Unity cannot link the correct textures when the model is added to the scene.
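The collider step can also be automated with an editor script, so every imported model receives mesh colliders without ticking the box by hand. A minimal sketch against Unity’s editor API (the class name is arbitrary, and the script must live in an Editor folder):

```csharp
using UnityEditor;

// Editor script: automatically enable collider generation on every model
// imported into the project, so the player cannot walk through walls.
// Place this file in an Editor/ folder of the Unity project.
public class BuildingModelImporter : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        var importer = (ModelImporter)assetImporter;
        importer.addCollider = true; // same effect as ticking the collider box
    }
}
```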

Once the files have been added to the scene, a character must be created to take the tour. Unity ships standard assets for first-person and third-person characters, and the first-person asset is the best fit here. Once it is imported into the scene, a person can use the WASD keys to move and the Space key to jump within the project. There are other small additions to the scene that could be considered simply for the sake of world building, such as light probes for illumination and skyboxes for a non-gray sky and atmospheric effect, but they require no further explanation, as they are straightforward to implement.

C. Interactive Scripting

Interactive scripting is added in order to create more than a static representation of a building. This interaction currently covers doors and light fixtures, with plans to expand to elevators and whiteboards. To make a door or light interactive in Unity, one must first create a mesh collider slightly larger than the object to be interacted with. Once the collider is created and added as a child of that object, a script in either C# or JavaScript checks whether the character is inside the collider and whether a designated key has been pressed. When both criteria are met, the object rotates about an axis: in the door case, a 90-degree rotation about its perceived hinge; in the light-switch case, a 45-degree rotation plus a light probe being disabled or enabled. Future work on elevators would move the elevator object and the character up a story based on similar parameters, with smooth movement like that of the door scripting in Figure 2.

Fig. 2. Example of a script that opens and closes a door upon a keystroke while the player is near the door.
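The behavior in Figure 2 can be sketched along the following lines. The key binding, tag check, and hinge placement are assumptions rather than the authors’ exact script, and the enlarged collider around the door is assumed to be marked as a trigger.

```csharp
using UnityEngine;

// Sketch of a door-interaction script of the kind shown in Fig. 2: when
// the player is inside the door's trigger collider and presses E, the
// door swings smoothly through 90 degrees about its hinge (local Y axis).
public class DoorInteraction : MonoBehaviour
{
    public float openAngle = 90f; // degrees about the hinge
    public float speed = 120f;    // degrees per second
    private bool playerInRange;
    private bool open;
    private float current;        // current swing angle

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) playerInRange = true;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) playerInRange = false;
    }

    void Update()
    {
        // Both criteria from the text: proximity and a designated keystroke.
        if (playerInRange && Input.GetKeyDown(KeyCode.E))
            open = !open;

        // Rotate smoothly toward the target angle about the hinge.
        float target = open ? openAngle : 0f;
        float step = Mathf.MoveTowards(current, target, speed * Time.deltaTime);
        transform.Rotate(0f, step - current, 0f);
        current = step;
    }
}
```

The light-switch case described above follows the same pattern, with a 45-degree rotation and a call that enables or disables the associated light.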

D. Hardware Integration

In terms of hardware, the Oculus Rift DK2 and first-generation Kinect were chosen because their SDKs are compatible with virtually any engine, including, of course, Unity. To gain Oculus Rift compatibility in a Unity scene, one must import the most recent Oculus Rift SDK for Unity into the project and select a prefabricated camera view and Oculus object to build on. The stereoscopic view comes from two camera views that are children of an OVRCameraController object, so replacing the camera on the character object with this object is enough to give Oculus compatibility. The character motor script included with the first-person character was linked to the original camera view, so linking it to either the left or right OVR camera re-enables the character controller. To get Kinect compatibility, the most recent Kinect suite from Microsoft must be installed and the most recent Kinect SDK for Unity added as an asset to the project. For limb tracking from the Kinect, an object must be added as a child of the first-person character, with the included Kinect Manager script attached as a component. An Avatar Controller script must also be attached to the avatar in use, with each limb to be mirrored linked to the script. Figure 3 shows the final result.

Fig. 3. Example of Oculus Rift stereoscopic vision and Kinect limb tracking from within Unity.

III. CONCLUSION

The goal of this project was not just to create a virtual walkthrough of a building, but to design a process for creating such a walkthrough that can be altered to fit the needs of other projects. Using readily available tools and software packages, a working product can be made through this process in a very short amount of time. The process was created with scalability in mind, so a one-building walkthrough can grow into a campus-wide or city-wide walkthrough, given the necessary time and manpower. Goals for the future of this project include automating the modeling process, creating more intricate interactions with the created environments, and gesture-controlled interaction utilizing limb tracking.

IV. REFERENCES

[1] J. Blaha and M. Gupta, “Diplopia: A virtual reality game designed to help amblyopics,” IEEE Virtual Reality (VR), pp. 163-164, 2014.
[2] K. Kim et al., “Augmented reality tour system for immersive experience of cultural heritage,” International Conference on VRCAI, pp. 323-324, 2009.
[3] J. Falah et al., “Virtual Reality medical training system for anatomy education,” Science and Information Conference (SAI), pp. 752-758, 27-29 Aug. 2014.
[4] K. Oudatzi, “Virtual reality in restoration of historic buildings: 3D model projection of the restoration project of Alaca Imaret Câmi with intuitive and interactive application through hyper realism technology,” International Conference on VSMM, 2010.
[5] A. Rizzo et al., “Development and Clinical Results from the Virtual Iraq Exposure Therapy Application for PTSD,” Virtual Rehabilitation International Conference, pp. 8-15, 2009.
[6] www.psychologytoday.com/articles/199411/virtual-therapy
[7] J. Norris and S. Davidoff, “NASA Telexploration Project demo,” IEEE Virtual Reality (VR), pp. 183-184, 2014.
[8] http://jonbrouchoud.com