
Immersive Displays
The other senses…

1962…

Classic Human Sensory Systems
Sight (Visual)
Hearing (Aural)
Touch (Tactile)
Smell (Olfactory)
Taste (Gustatory)

Relevance to VR
#1 – Sight
#2 – Hearing
#3 – Touch
#4 – Smell
#5 – Taste

Senses 1, 2, and 3 are well studied, but plenty of research remains

Senses 4 and 5 are incredibly difficult to render, but some examples exist

Other Relevant Sensors
Temperature sensors
Proprioceptive sensors (gravity)
◦ Stretch sensors found in muscles, skin, and joints
Vestibular (inner ear) sensors

Which can we control in VR?
◦ Cue conflicts cause nausea, vomiting

Audio (Sound Rendering)
Easiest way to improve a VR system
◦ Think of watching a movie without sound
Easy to use (sound APIs)
Cheap to produce great results (headphones, <$100)

Audio Displays
An arrangement of speakers
◦ Spatially fixed – loudspeakers (many types)
◦ Head-mounted – headphones (many types)
Speaker quality limits the range of frequencies and loudness you can reproduce
◦ Amplifiers are very important for good results

Immersive Audio
Our hearing system can sense the 3D source of a sound
◦ A VR system should be able to produce what the ears would hear from a 3D source
Binaural recordings in real life (like stereoscopic video)
3D sound rendering in the virtual world (like stereoscopic rendering)
◦ Works best with headphones

Head-Related Transfer Function (HRTF)
In the frequency domain, at frequency f:
◦ H(f) = Output(f) / Input(f) (applied in the time domain in the sketch below)
The HRTF depends on the spatial position X, Y, Z of the source, or, in the far field, only on its direction
Complex HRTFs are caused by the pinnae of the ears
◦ Unique to each person
Each person learns their own HRTF from childhood, which is how we sense a 3D source
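In the time domain, the HRTF becomes a pair of head-related impulse responses (HRIRs), one per ear, and applying them is plain FIR convolution. A minimal sketch in C, assuming measured HRIR arrays are available; the tap count and function name are illustrative, not from the slides:

```c
#include <stddef.h>

#define HRIR_LEN 128  /* assumption: measured HRIRs are often 128-512 taps */

/* Direct-form FIR convolution: out[n] = sum_k hrir[k] * in[n-k].
   Call once with the left-ear HRIR and once with the right-ear HRIR
   (each measured for the source's direction) to produce the two
   headphone channels of a binaural signal. */
void render_ear(const float *in, size_t n_in,
                const float hrir[HRIR_LEN], float *out)
{
    for (size_t n = 0; n < n_in; ++n) {
        float acc = 0.0f;
        for (size_t k = 0; k < HRIR_LEN && k <= n; ++k)
            acc += hrir[k] * in[n - k];
        out[n] = acc;
    }
}
```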

3D Sound Rendering
In the API (what you program), you specify (see the sketch below):
◦ Position, velocity, and intensity of each source
◦ Position, velocity, and *orientation* of the listener
Results depend on your renderer's capabilities:
◦ HRTF of the actual listener for best results
Measured with molds or in-ear microphones
The default HRTF is an identity (basically you only get left-right distinction)
◦ Reverb (echoing) or other effects
◦ Speaker arrangement (usually defined in the OS)
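As a concrete sketch of those per-source and per-listener properties, here is how they map onto OpenAL calls (one of the APIs named on the next slide). This assumes an already-initialized context and a generated source handle; the coordinate values are illustrative:

```c
#include <AL/al.h>

/* Per-source properties: position, velocity, intensity (gain) */
void set_source_props(ALuint src)
{
    alSource3f(src, AL_POSITION, 2.0f, 0.0f, -1.0f); /* meters, illustrative */
    alSource3f(src, AL_VELOCITY, 0.0f, 0.0f,  0.0f); /* used for Doppler */
    alSourcef (src, AL_GAIN, 1.0f);                  /* intensity */
}

/* Listener properties: position, velocity, and *orientation* */
void set_listener_props(void)
{
    alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);
    alListener3f(AL_VELOCITY, 0.0f, 0.0f, 0.0f);
    /* Orientation is an "at" vector followed by an "up" vector */
    ALfloat ori[6] = { 0.0f, 0.0f, -1.0f,   /* facing -Z */
                       0.0f, 1.0f,  0.0f }; /* +Y is up  */
    alListenerfv(AL_ORIENTATION, ori);
}
```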

Sound API
OpenAL and DirectSound are popular
◦ Sort of like OpenGL and Direct3D
An API for talking to a 3D sound renderer (usually hardware)
◦ Similar to the idea of OpenGL
Allows you to load sounds (utility toolkit), specify 3D sound properties, and specify listener properties
◦ Must use single-channel sound files! Multi-channel sound files do not make sense here; the renderer "generates" the multi-channel sound

Example
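A minimal self-contained OpenAL sketch in the spirit of this slide: open the default device, upload a single-channel (mono) buffer, position it in 3D, and play it. The synthesized 440 Hz tone stands in for a sound file loaded through a utility toolkit:

```c
/* build: cc example.c -lopenal -lm */
#include <AL/al.h>
#include <AL/alc.h>
#include <math.h>
#include <stdint.h>

int main(void)
{
    /* Open the default device and create a context */
    ALCdevice  *dev = alcOpenDevice(NULL);
    ALCcontext *ctx = alcCreateContext(dev, NULL);
    alcMakeContextCurrent(ctx);

    /* Fill a MONO buffer -- multi-channel data would bypass 3D rendering */
    int16_t pcm[44100];  /* one second at 44.1 kHz */
    const double two_pi = 6.283185307179586;
    for (int i = 0; i < 44100; ++i)
        pcm[i] = (int16_t)(32000.0 * sin(two_pi * 440.0 * i / 44100.0));

    ALuint buf, src;
    alGenBuffers(1, &buf);
    alBufferData(buf, AL_FORMAT_MONO16, pcm, sizeof pcm, 44100);

    alGenSources(1, &src);
    alSourcei(src, AL_BUFFER, buf);
    alSource3f(src, AL_POSITION, 2.0f, 0.0f, -1.0f); /* off to one side */
    alSourcePlay(src);

    /* Poll until playback finishes */
    ALint state = AL_PLAYING;
    while (state == AL_PLAYING)
        alGetSourcei(src, AL_SOURCE_STATE, &state);

    alDeleteSources(1, &src);
    alDeleteBuffers(1, &buf);
    alcMakeContextCurrent(NULL);
    alcDestroyContext(ctx);
    alcCloseDevice(dev);
    return 0;
}
```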

Haptics (Touch Rendering)
Reproduction of forces exerted on the human body
◦ Striking a surface (e.g., hitting a ball)
◦ Holding an object
◦ Texture of a surface
Lack of touch rendering is the #1 problem in VR systems
◦ Enormous actuation area: the entire surface of the human body
Existing solutions are encumbering and task-specific

Categories of Haptic Displays
Passive vs. Active
◦ Passive – can stop motion but cannot create it
◦ Active – can generate motion
Fixed vs. Sourceless
◦ Fixed – mounted to the environment (e.g., a wand)
◦ Sourceless – mounted to the user (e.g., a glove)
Forces, torques, vibrations
◦ The types of output a haptic device can be capable of

Haptic Rendering
Specify forces, torques, and rotations at actuation points
◦ Most commonly just one actuation point
APIs are available
◦ From the manufacturer
◦ OpenHL?
Very similar to physics rendering, except much more difficult
◦ Requires extremely high update rates (1000 Hz for imperceptibility; see the sketch below)
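To make the 1000 Hz requirement concrete, here is a sketch of a haptic servo loop that reads the device, computes a penalty (spring) force against a virtual floor, and commands the actuators each tick. The device I/O functions are hypothetical stand-ins for a manufacturer API, stubbed here so the sketch runs; the stiffness value is illustrative:

```c
#include <stdio.h>

#define WALL_Y     0.0f   /* virtual floor plane y = 0 (meters)          */
#define STIFFNESS  800.0f /* N/m; illustrative, real devices vary widely */
#define TICKS      10     /* run ten 1 ms steps for the demo             */

/* Hypothetical device read: here, a probe descending 1 mm per tick */
static void haptic_get_position(int tick, float pos[3])
{
    pos[0] = 0.0f;
    pos[1] = 0.005f - 0.001f * (float)tick;
    pos[2] = 0.0f;
}

/* Hypothetical device write: a real API would command the actuators */
static void haptic_set_force(const float f[3])
{
    printf("force = (%.2f, %.2f, %.2f) N\n", f[0], f[1], f[2]);
}

int main(void)
{
    /* Each iteration must finish within ~1 ms to sustain 1000 Hz */
    for (int tick = 0; tick < TICKS; ++tick) {
        float p[3], f[3] = { 0.0f, 0.0f, 0.0f };
        haptic_get_position(tick, p);
        if (p[1] < WALL_Y)                        /* penetrating the floor? */
            f[1] = STIFFNESS * (WALL_Y - p[1]);   /* spring pushes back out */
        haptic_set_force(f);
        /* a real loop would sleep until the next 1 ms tick here */
    }
    return 0;
}
```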
