
Design and Development of a Portable Virtual Reality Headset

ABSTRACT This paper describes the design and development of the hardware and software of a portable Virtual Reality headset. The portability of the device is achieved using a small, credit-card-sized yet powerful computer, the Raspberry Pi®. It is responsible for executing the main software and displaying the rendered images on the screen. The Inertial Measurement Unit (IMU) present in the device tracks the user’s head movements and communicates with the Raspberry Pi through the Arduino® platform. This work focuses on developing a dedicated hardware platform for Virtual Reality purposes. Throughout the design phase of the project, cost was kept minimal without compromising the performance of the system.

Keywords Virtual Reality; Augmented Reality; Head Tracking; Head Mounted Display; Portability.

1. INTRODUCTION In the past few years, researchers have been working on Virtual Reality (VR) and Augmented Reality (AR) systems [1, 2]. These systems create a visual sensation for the user in the form of virtual or mixed reality. The rendered images are updated on the screen of the Head Mounted Display (HMD) [2] according to the movement of the user. The HMDs available in the market are bulky and are wired to the user’s computer, which executes the software. These systems require a processor with good computational power and a Graphics Processing Unit (GPU). Due to the above-mentioned constraints, the user cannot move around while wearing the headset. In this paper, we discuss an alternative approach to the hardware and software design, aimed at making the system portable and compact, and enabling the user to move while wearing the headset.

2. RELATED WORK In 2012, Oculus launched the first development version of its VR headset, named DK1 [3]. This product had a great impact on the VR industry. Various commercial products built on this technology [4, 5, 6] use a personal computer (PC) for processing the data and displaying it on the HMD. Thus, one of the major limitations of the existing technology is the lack of portability. Some of the devices available in the market use a phone’s internal processing capabilities and display for VR purposes; the processing speed and graphics quality are then limited by the phone’s hardware.

3. HARDWARE DESIGN Figure 1 shows the block diagram of the proposed portable VR headset. It consists of three major components, namely the tracking unit, the computing unit, and the display unit. Research shows that the maximum angular velocity and linear acceleration of the user’s head do not exceed 230 deg/s and 2g (when the user is not jumping), respectively [7]. The MPU6050 is used as the IMU in the design. It is programmed for measurement ranges of ±250 deg/s angular velocity and ±2g linear acceleration. The Digital Motion Processor (DMP) present on the MPU6050 reduces the computational load on the main processor used in the device. The Arduino and Raspberry Pi hardware platforms were used for the computations. These are small form-factor computers with reasonably good computational capabilities and are suitable for our requirements. The processed data is displayed on the screen present in the VR headset. A pair of lenses is used for proper visualization of the rendered image on the screen. In Figure 1, the tracking unit tracks the movement of the user’s head. The sensor data consists of the coordinates of the user’s orientation. This data is fed to the computing unit, which generates the rendered images in real time. The rendered images are then displayed on the screen, thereby creating stereo visual sensations.

Figure 2 shows the layout of the proposed system. The details of the modules are given in the sections below.

Figure 2. Product Layout


Figure 1. Hardware Design


3.1 MPU6050 The MPU6050 is a motion-tracking unit that contains a 3-axis gyroscope, a 3-axis accelerometer, and a DMP [8]. Real-time motion tracking is achieved by fusing the data from these sensors. The sensor package measures 4 mm × 4 mm × 0.9 mm. The MPU6050 has 16-bit analog-to-digital converters (ADCs) for digitizing the gyroscope and accelerometer data. The device can be programmed as per the requirements stated earlier. The MPU6050 operates over a supply range of 2.375 V to 3.46 V. It communicates with the host microcontroller (Arduino) over the I2C interface.
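As an illustration of the I2C link described above, the following minimal Arduino sketch (ours, not taken from the paper) wakes the sensor and reads back its WHO_AM_I register; the bus address 0x68 and the register addresses 0x6B and 0x75 are from the MPU-6050 datasheet [8].

#include <Wire.h>

const uint8_t MPU_ADDR       = 0x68;  // default I2C address (AD0 pulled low)
const uint8_t REG_PWR_MGMT_1 = 0x6B;  // power management register
const uint8_t REG_WHO_AM_I   = 0x75;  // identity register, reads back 0x68

void setup() {
  Serial.begin(115200);
  Wire.begin();

  // Wake the sensor: it powers up in sleep mode with the sleep bit set.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(REG_PWR_MGMT_1);
  Wire.write(0x00);
  Wire.endTransmission();

  // Read WHO_AM_I to confirm the sensor is present on the bus.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(REG_WHO_AM_I);
  Wire.endTransmission(false);            // repeated start, keep the bus
  Wire.requestFrom(MPU_ADDR, (uint8_t)1);
  uint8_t id = Wire.read();

  Serial.print("WHO_AM_I = 0x");
  Serial.println(id, HEX);                // expect 0x68
}

void loop() {}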

3.2 Arduino Arduino® is an open-source hardware platform. It uses an 8-bit ATmega328 microcontroller and executes instructions at a clock speed of 16 MHz. The board accepts an input voltage of 7 V to 12 V at its DC power jack, and it can also be powered through the on-board USB port. It receives the data from the IMU over the I2C bus. The processed data is sent to the Raspberry Pi (the main computing unit) over the serial bus (USB). The Arduino platform has well-developed libraries for the MPU6050, which makes the module easy to use for quick prototyping [9, 12].

3.3 Raspberry Pi 2 Raspberry Pi 2® is a credit-card-sized single-board computer. In our research, we used the Raspberry Pi 2 Model B. This version of the Raspberry Pi is based on the Broadcom BCM2836 SoC, with a 900 MHz quad-core ARM Cortex-A7 processor and 1 GB of RAM that is shared with the GPU. It can drive a display unit with a resolution of up to 1920 × 1200 pixels through its HDMI port. A microSD card on the board holds the Operating System (OS). The dimensions and weight of the computer are approximately 90 mm × 60 mm and 45 g, respectively. The power rating of the device is 5 V at approximately 1 A [10]. The installed OS is Raspbian, a Linux-based distribution.
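The paper does not specify the framing used on the serial (USB) link between the Arduino and the Raspberry Pi. Assuming, purely for illustration, that the Arduino streams one plain-text "yaw pitch roll" line per sample at 115200 baud and enumerates as /dev/ttyACM0, the receiving side on the Raspberry Pi could be sketched in C++ as follows (termios is used because the prototype runs Raspbian Linux).

#include <cstdio>
#include <cstdlib>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

// Open the Arduino's USB-serial device in raw mode at 115200 baud.
static int open_tracker(const char* dev) {
  int fd = open(dev, O_RDONLY | O_NOCTTY);
  if (fd < 0) { perror("open"); exit(1); }
  termios tio{};
  tcgetattr(fd, &tio);
  cfmakeraw(&tio);                // raw byte stream, no line editing
  cfsetispeed(&tio, B115200);
  cfsetospeed(&tio, B115200);
  tcsetattr(fd, TCSANOW, &tio);
  return fd;
}

int main() {
  int fd = open_tracker("/dev/ttyACM0");   // assumed device name
  char line[64];
  size_t n = 0;
  float yaw, pitch, roll;

  while (true) {
    char c;
    if (read(fd, &c, 1) != 1) continue;
    if (c == '\n' || n == sizeof(line) - 1) {
      line[n] = '\0';
      n = 0;
      if (sscanf(line, "%f %f %f", &yaw, &pitch, &roll) == 3) {
        // Latest head orientation; hand it to the renderer (Section 4.2).
        printf("yaw=%.1f pitch=%.1f roll=%.1f\n", yaw, pitch, roll);
      }
    } else {
      line[n++] = c;
    }
  }
}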

3.4 Display Unit One of the requirements of the design is that the screen must cover the field of view of the user to achieve proper visualization of the image. A seven-inch screen was a good choice, as it covers most of the user’s field of vision when placed at a distance of 4-5 cm from the eyes. The Adafruit 7” display [11], with HDMI and power input ports, is used in our design. The driver board for this display is mounted behind the screen itself, which keeps the system compact. The resolution of the screen is 800 × 480 pixels, which is sufficient for VR purposes. The peak current rating of the display is 600 mA, which can be reduced further by lowering the intensity of the backlight.

3.5 Lens and Casing The lenses used in the design are bi-convex lenses with a focal length of 40 mm. The diameter of the lenses is sufficient to cover the user’s field of view when they are placed close to the eyes. The casing of the device is designed to accommodate all the electrical and mechanical components, and it has been designed and fabricated in-house using acrylic sheets. It holds the screen and lenses at the appropriate positions for focusing and for covering the user’s field of view. Care has been taken to ensure that no part of the casing blocks the display.

4. SOFTWARE DESIGN The overall software flow is shown in Figure 3. The data from the accelerometer and the gyroscope corresponds to the orientation of the device. Libraries for the MPU6050 were used to acquire the data [12]. Preliminary calculations are done on the Arduino platform and the processed data is sent to the Raspberry Pi. Processes such as image rendering are executed on the Raspberry Pi using OpenGL ES [13]. The details of the software modules are explained in the following sections.

Figure 3. Software Design

4.1 Tracker Software running on the Arduino interprets the data from the MPU6050 sensor. The sensor contains a 1024-byte FIFO buffer in which values are stored as they are acquired. Once the buffer is full, an interrupt is sent to the Arduino, after which the Arduino reads the data from the buffer. The raw data is converted into the corresponding yaw, pitch, and roll values, which are later used for rendering.
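A condensed sketch of such tracker firmware, modelled on the i2cdevlib MPU6050 DMP example cited as [12], is given below. The library calls follow i2cdevlib's API; details such as the interrupt pin and the plain-text serial output format are our assumptions and are not taken from the paper.

#include "I2Cdev.h"
#include "MPU6050_6Axis_MotionApps20.h"
#include "Wire.h"

MPU6050 mpu;
uint16_t packetSize;                 // size of one DMP FIFO packet
uint8_t  fifoBuffer[64];
volatile bool dataReady = false;

void dmpDataReady() { dataReady = true; }   // ISR tied to the MPU6050 INT line

void setup() {
  Wire.begin();
  Serial.begin(115200);
  mpu.initialize();
  if (mpu.dmpInitialize() == 0) {           // 0 means the DMP firmware loaded
    mpu.setDMPEnabled(true);
    packetSize = mpu.dmpGetFIFOPacketSize();
    attachInterrupt(digitalPinToInterrupt(2), dmpDataReady, RISING);  // INT on pin 2 (assumed)
  }
}

void loop() {
  if (!dataReady || mpu.getFIFOCount() < packetSize) return;
  dataReady = false;

  Quaternion  q;
  VectorFloat gravity;
  float ypr[3];                             // yaw, pitch, roll in radians

  mpu.getFIFOBytes(fifoBuffer, packetSize); // drain one packet from the FIFO
  mpu.dmpGetQuaternion(&q, fifoBuffer);
  mpu.dmpGetGravity(&gravity, &q);
  mpu.dmpGetYawPitchRoll(ypr, &q, &gravity);

  // Stream the orientation (in degrees) to the Raspberry Pi over USB serial.
  Serial.print(ypr[0] * 180.0 / M_PI); Serial.print(' ');
  Serial.print(ypr[1] * 180.0 / M_PI); Serial.print(' ');
  Serial.println(ypr[2] * 180.0 / M_PI);
}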

4.2 Viewport Update A viewport is the rectangular portion of the screen in which the image is displayed [14]. As soon as the coordinates given by the tracker are updated, the viewport is updated as shown in Figure 4.

Figure 4. Result of camera or tracker movement

Suppose there is a virtual camera in a virtual world whose motion is driven by the tracker; in other words, the virtual camera mimics the tracker. If the tracker is rotated in the clockwise direction, the whole scene rotates in the anti-clockwise direction with respect to the tracker, and hence with respect to the virtual camera. Since the scene itself is static, the virtual camera has to be rotated in the clockwise direction to reflect the change. The output of both viewports is the same, as shown in Figure 4.
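One way to realise this update, sketched here under an assumed yaw-pitch-roll Euler convention (the paper does not state its exact convention), is to build the head rotation from the tracker angles and apply its transpose to the scene as the view rotation, stored column-major as OpenGL ES expects.

#include <cmath>

// Build the view rotation from tracker angles (radians). The head rotation is
// R = Ry(yaw) * Rx(pitch) * Rz(roll); the scene is rotated by its inverse,
// which for a pure rotation is the transpose.
void viewRotation(float yaw, float pitch, float roll, float m[16]) {
  float cy = std::cos(yaw),   sy = std::sin(yaw);
  float cp = std::cos(pitch), sp = std::sin(pitch);
  float cr = std::cos(roll),  sr = std::sin(roll);

  // Rows of R = Ry(yaw) * Rx(pitch) * Rz(roll), written out explicitly.
  float R[3][3] = {
    { cy*cr + sy*sp*sr,  sy*sp*cr - cy*sr,  sy*cp },
    { cp*sr,             cp*cr,            -sp    },
    { cy*sp*sr - sy*cr,  cy*sp*cr + sy*sr,  cy*cp },
  };

  // Store R^T in column-major order: m[4*col + row] = R^T[row][col] = R[col][row].
  for (int col = 0; col < 4; ++col)
    for (int row = 0; row < 4; ++row)
      m[4*col + row] = (col < 3 && row < 3) ? R[col][row]
                                            : (col == row ? 1.0f : 0.0f);
}

The resulting matrix would be combined with the projection matrix and uploaded, for example via glUniformMatrix4fv, before each frame is drawn.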

4.3 Pincushion Transformation The screen of the HMD is kept very close to the eyes (approximately 4-5 cm away). The user cannot focus on the image at this separation, so a pair of lenses is used. Viewing the display through the lenses introduces a pincushion distortion in the image that is clearly noticeable to the user. To eliminate this distortion, the inverse transformation, a barrel distortion [15], is applied to the image before it is displayed on the screen.


Figure 5. Distortion and correction due to lens

Figure 5 shows the above-mentioned effect. The top image shows the pincushion distortion that is noticed when a rectangular image is seen through the lens. The bottom image shows how a barrel-distorted (pre-corrected) image appears as a rectangular image when seen through the lens.
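A minimal GLSL ES 2.0 fragment shader implementing such a radial pre-distortion, in the spirit of the barrel-distortion model described in [15], is sketched below as a C++ string. It assumes the scene has first been rendered to a texture that is then drawn to the screen through a full-screen quad; the coefficient uniforms and their names are illustrative, not the authors' values.

// Fragment shader for the pre-distortion pass (assumed pipeline, not the
// paper's exact implementation).
static const char* kBarrelFrag = R"(
precision mediump float;
varying vec2 vTexCoord;        // texture coordinate of the full-screen quad
uniform sampler2D uScene;      // rendered (undistorted) image
uniform vec2  uCenter;         // lens centre in texture coordinates
uniform float uK1;             // radial distortion coefficients (tuned to the lens)
uniform float uK2;

void main() {
  vec2  d  = vTexCoord - uCenter;
  float r2 = dot(d, d);
  // Push the sampling radius outward so the displayed image is barrel
  // distorted; the lens's pincushion distortion then cancels it.
  vec2 src = uCenter + d * (1.0 + uK1 * r2 + uK2 * r2 * r2);
  if (src.x < 0.0 || src.x > 1.0 || src.y < 0.0 || src.y > 1.0) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);   // outside the rendered image
  } else {
    gl_FragColor = texture2D(uScene, src);
  }
}
)";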

4.4 Duplication and Stereo Visualization After the distortion correction is applied, we obtain a single image that is visible on the screen. However, a separate image is needed for each eye. To accomplish this, the image is duplicated and the copies are displayed side by side on the screen. When seen through the lenses, these copies initially do not appear as a single image. After adjusting the distance between the duplicate images on the screen according to the Interpupillary Distance (IPD) of the user, the two images superimpose on each other, creating the appearance of a single image when seen through the lenses. However, this image still looks flat, and its depth cannot be perceived. To add depth information to the scene, we render it from two viewpoints that mimic the left and right eyes. When displayed, this creates a stereo perception for the user.
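A sketch of this side-by-side rendering on the 800 × 480 screen is given below; drawScene(), the matrix layout, and the default IPD value are our placeholders, not the paper's API.

#include <cstring>
#include <GLES2/gl2.h>

void drawScene(const float eyeView[16]);     // application's scene renderer (assumed)

const int   kScreenW = 800, kScreenH = 480;  // Adafruit 7" panel (Section 3.4)
const float kIPD     = 0.064f;               // ~64 mm; adjusted per user

// Render one frame side by side: left half of the screen for the left eye,
// right half for the right eye, with the camera shifted by +/- IPD/2.
void renderStereoFrame(const float view[16]) {
  for (int eye = 0; eye < 2; ++eye) {
    glViewport(eye * kScreenW / 2, 0, kScreenW / 2, kScreenH);

    float eyeView[16];
    std::memcpy(eyeView, view, sizeof(eyeView));
    // Column-major: element 12 is the x translation. Shifting the camera by
    // +offset moves the world by -offset in eye space.
    float offset = (eye == 0 ? -0.5f : 0.5f) * kIPD;
    eyeView[12] -= offset;

    drawScene(eyeView);                      // draw with this eye's view matrix
  }
}

Rendering the scene twice with offset cameras, rather than simply duplicating one image, is what provides the depth cue described above.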

5. PROTOTYPE AND RESULTS Figure 6 shows the functional prototype of the product that has been developed and fabricated in-house.

Figure 6. Functional Prototype of Device

In this functional prototype, we used the Raspberry Pi as the main processing unit, running a Linux OS (Raspbian). This computer does not support desktop OpenGL; hence, the software was developed in C++ with the OpenGL ES libraries.

Figure 7. Table top Set-up of the device

Figure 7 shows the table-top setup of the device with the output as seen on a computer screen.

6. DISCUSSION AND FUTURE WORK The key motivation for this work was to design and develop dedicated hardware and software for VR purposes in order to make portability possible. Available devices use either a phone’s processor and display or a personal computer’s hardware. The next version of the device will be developed with a better display to address the problem of aliasing. The casing design will also be optimized for shape and mass. We further plan to incorporate cameras in the device to make AR possible. The device will capture images from the cameras and place them in the background of the real-time rendering. This will enable superimposing the real and virtual world images, thereby giving an entirely new viewing experience.

7. REFERENCES [1] K. S. Hale and K. M. Stanney, Handbook of Virtual Environments, 2nd edition, CRC Press, 2015.

[2] Steven M. LaValle, Virtual Reality, URL: http://msl.cs.uiuc.edu/vr/, 2015.

[3] https://www.oculus.com/en-us/, 2015.

[4] http://www.htcvive.com/us/, 2015.

[5] https://www.playstation.com/en-in/explore/ps4/features/playstation-vr/, 2015.

[6] https://www.visusvr.com/, 2015.

[7] William R. Bussone, Linear and Angular Head Accelerations in Daily Life, URL: http://scholar.lib.vt.edu/theses/available/etd-08182005-222028/unrestricted/thesis.pdf, 2005.

[8] InvenSense, “MPU-6000 and MPU-6050 Product Specification Revision 3.4”, URL: http://store.invensense.com/datasheets/invensense/MPU-6050_DataSheet_V3%204.pdf, 2013.

[9] Arduino, “Arduino Uno”, URL: https://www.arduino.cc/en/Main/ArduinoBoardUno, 2015.

[10] Raspberry Pi, “Raspberry Pi Hardware” URL: https://www.raspberrypi.org/documentation/hardware/raspberrypi/, 2015.


[11] Adafruit, “HDMI 7" 800x480 Display Backpack - Without Touch” URL: https://www.adafruit.com/products/2406, 2015.

[12] https://github.com/jrowberg/i2cdevlib/tree/master/Arduino/MPU6050.

[13] The Standard for Embedded Accelerated 3D Graphics, https://www.khronos.org/opengles/, 2015.

[14] World Windows, Viewports & Clipping, URL: http://webserver2.tecgraf.puc-rio.br/ftp_pub/lfm/L1J_WindowViewport.pdf, 2015.

[15] K.T. Gribbon, C.T. Johnston, and D.G. Bailey, “A Real-time FPGA Implementation of a Barrel Distortion Correction Algorithm with Bilinear Interpolation”, URL: http://sprg.massey.ac.nz/pdfs/2003_IVCNZ_408.pdf, 2003.

[16] The code used to generate the object shown in Figure 7 is the example code in the Raspbian OS, modified for our requirements with the copyright notice included: pi@raspberrypi /opt/vc/src/hello_pi/hello_triangle $ ./hello_triangle.bin

[17] Image in figure 2. Courtesy: http://orig09.deviantart.net/dd96/f/2006/347/c/2/kuttu_and_chinnu_09_by_anoop_pc.jpg

[18] Image in figure 4. Courtesy: http://orig03.deviantart.net/e1ea/f/2008/234/e/2/day_and_night_cartoon_scene_by_p0larboy.jpg

[19] Image in figure 6. Courtesy: https://s-media-cache-ak0.pinimg.com/736x/d8/9c/bf/d89cbff0d4377fe4c77255de04f64eb5.jpg