COSC 426: Augmented Reality
Mark Billinghurst
mark.billinghurst@hitlabnz.org
August 14th 2012
Lecture 6: AR Interaction
[Diagram: AR technology layers - components, tools, applications, experiences]
Sony CSL © 2004
Building Compelling AR Experiences
- Tracking, Display
- Authoring
- Interaction
AR Interaction
Designing an AR System = Interface Design
- Using different input and output technologies
- Objective is a high-quality user experience: ease of use and learning, performance and satisfaction
User Interface and Tool
[Diagram: Human - User Interface/Tool - Machine/Object: the Human Machine Interface]
© Andreas Dünser
Tools
User Interface
User Interface: Characteristics
- Input: mono- or multimodal
- Output: mono- or multisensorial
- Technique/Metaphor/Paradigm
© Andreas Dünser
Input
Output: sensation of movement
Metaphor: "push" to accelerate, "turn" to rotate
Human Computer Interface
[Diagram: Human - User Interface - Computer System]
Human Computer Interface = Hardware + Software
The computer is everywhere now: HCI extends to electronic devices, home automation, transport vehicles, etc.
© Andreas Dünser
More terminology
- Interaction Device = input/output of the user interface
- Interaction Style = category of similar interaction techniques
- Interaction Paradigm
- Modality (human sense)
- Usability
Back to AR
You can see spatially registered AR... how can you interact with it?
Interaction Tasks
- 2D (from [Foley]): Selection, Text Entry, Quantify, Position
- 3D (from [Bowman]): Navigation (Travel/Wayfinding), Selection, Manipulation, System Control/Data Input
- AR: 2D + 3D tasks and... more specific tasks?
[Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.
AR Interfaces as Data Browsers
- 2D/3D virtual objects are registered in 3D: "VR in the Real World"
- Interaction: 2D/3D virtual viewpoint control
- Applications: visualization, training
AR Information Browsers
- Information is registered to the real-world context
- Hand-held AR displays
- Interaction: manipulation of a window into the information space
- Applications: context-aware information displays
Rekimoto, et al. 1997
Architecture
Current AR Information Browsers
- Mobile AR: GPS + compass
- Many applications: Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, …
Junaio AR Browser from Metaio
http://www.junaio.com/
AR browsing: GPS + compass, 2D/3D object placement, photos/live video, community viewing
Web Interface
Adding Models in Web Interface
Advantages and Disadvantages
- Important class of AR interfaces: wearable computers, AR simulation, training
- Limited interactivity: modification of virtual content is difficult
Rekimoto, et al. 1997
3D AR Interfaces
- Virtual objects displayed in 3D physical space and manipulated: HMDs and 6DOF head-tracking, 6DOF hand trackers for input
- Interaction: viewpoint control, traditional 3D user interface interaction (manipulation, selection, etc.)
Kiyokawa, et al. 2000
AR 3D Interaction
AR Graffiti
www.nextwall.net
Advantages and Disadvantages
- Important class of AR interfaces: entertainment, design, training
Advantages
- User can interact with 3D virtual objects everywhere in space
- Natural, familiar interaction
Disadvantages
- Usually no tactile feedback
- User has to use different devices for virtual and physical objects
Oshima, et al. 2000
Augmented Surfaces and Tangible Interfaces
Basic principles:
- Virtual objects are projected on a surface
- Physical objects are used as controls for virtual objects
- Support for collaboration
Augmented Surfaces
Rekimoto, et al. 1998
- Front projection
- Marker-based tracking
- Multiple projection surfaces
Tangible User Interfaces (Ishii 97)
- Create digital shadows for physical objects
- Foreground: graspable UI
- Background: ambient interfaces
Tangible Interfaces - Ambient
- Dangling String (Jeremijenko 1995): ambient ethernet monitor, relies on peripheral cues
- Ambient Fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display
Tangible Interface: ARgroove
- Collaborative instrument
- Exploring physically based interaction
- Map physical actions to MIDI output: translation, rotation, tilt, shake
ARgroove in Use
Visual Feedback
- Continuous visual feedback is key
- A single virtual image provides: rotation, tilt, height
i/O Brush (Ryokai, Marti, Ishii)
Other Examples
- Triangles (Gorbet 1998): triangle-based storytelling
- ActiveCube (Kitamura 2000-): cubes with sensors
Lessons from Tangible Interfaces
- Physical objects make us smart: Norman's "Things that Make Us Smart"; encode affordances, constraints
- Objects aid collaboration: establish shared meaning
- Objects increase understanding: serve as cognitive artifacts
TUI Limitations
- Difficult to change object properties: can't tell the state of digital data
- Limited display capabilities: projection screen = 2D, dependent on physical display surface
- Separation between object and display: ARgroove
Advantages and Disadvantages
Advantages
- Natural: users' hands are used for interacting with both virtual and real objects
- No need for special-purpose input devices
Disadvantages
- Interaction is limited to the 2D surface
- Full 3D interaction and manipulation is difficult
Orthogonal Nature of AR Interfaces
Back to the Real World
- AR overcomes limitations of TUIs: enhances display possibilities, merges task/display space, provides public and private views
TUI + AR = Tangible AR
- Apply TUI methods to AR interface design
- Space-multiplexed: many devices, each with one function - quicker to use, more intuitive, but more clutter (a real toolbox)
- Time-multiplexed: one device with many functions - space efficient (the mouse)
Tangible AR: Tiles (Space-Multiplexed)
- Tiles semantics: data tiles, operation tiles
- Operations on tiles: proximity, spatial arrangements, space-multiplexed
Space-multiplexed Interface
Data authoring in Tiles
Proximity-based Interaction
Object-Based Interaction: MagicCup
- Intuitive virtual object manipulation on a table-top workspace
- Time-multiplexed
- Multiple markers: robust tracking
- Tangible user interface: intuitive manipulation
- Stereo display: good presence
Our system: main table, menu table, cup interface
Tangible AR: Time-multiplexed Interaction
- Use of natural physical object manipulations to control virtual objects
- VOMAR Demo
  - Catalog book: turn over the page
  - Paddle operation: push, shake, incline, hit, scoop
VOMAR Interface
Advantages and Disadvantages
Advantages
- Natural interaction with virtual and physical tools
- No need for special-purpose input devices
- Spatial interaction with virtual objects: 3D manipulation of virtual objects anywhere in physical space
Disadvantages
- Requires a head-mounted display
Wrap-up
- Browsing interfaces: simple (conceptually!), unobtrusive
- 3D AR interfaces: expressive, creative, require attention
- Tangible interfaces: embedded into conventional environments
- Tangible AR: combines TUI input + AR display
AR User Interface: Categorization
- Traditional desktop: keyboard, mouse, joystick (with or without 2D/3D GUI)
- Specialized/VR device: 3D VR devices, specially designed devices
- Tangible interface: using physical objects
- Hand/Touch interface: using pose and gesture of the hand and fingers
- Body interface: using movement of the body
- Speech interface: voice, speech control
- Multimodal interface: gesture + speech
- Haptic interface: haptic feedback
- Eye tracking, physiological, brain-computer interfaces, …
OSGART: From Registration to Interaction
Keyboard and Mouse Interaction
Traditional input techniques. OSG provides a framework for handling keyboard and mouse input events (osgGA):
1. Subclass osgGA::GUIEventHandler
2. Handle events:
   - Mouse up / down / move / drag / scroll-wheel
   - Key up / down
3. Add an instance of the new handler to the viewer
Keyboard and Mouse Interaction
class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
public:
    KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

    virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                        osg::Object* obj, osg::NodeVisitor* nv) {
        switch (ea.getEventType()) {
            // Possible events we can handle
            case osgGA::GUIEventAdapter::PUSH: break;
            case osgGA::GUIEventAdapter::RELEASE: break;
            case osgGA::GUIEventAdapter::MOVE: break;
            case osgGA::GUIEventAdapter::DRAG: break;
            case osgGA::GUIEventAdapter::SCROLL: break;
            case osgGA::GUIEventAdapter::KEYUP: break;
            case osgGA::GUIEventAdapter::KEYDOWN: break;
        }
        return false;
    }
};

viewer.addEventHandler(new KeyboardMouseEventHandler());
Create your own event handler class
Add it to the viewer to receive events
Keyboard Interaction
case osgGA::GUIEventAdapter::KEYDOWN: {
    switch (ea.getKey()) {
        case 'w': // Move forward 5mm
            localTransform->preMult(osg::Matrix::translate(0, -5, 0));
            return true;
        case 's': // Move back 5mm
            localTransform->preMult(osg::Matrix::translate(0, 5, 0));
            return true;
        case 'a': // Rotate 10 degrees left
            localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
            return true;
        case 'd': // Rotate 10 degrees right
            localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
            return true;
        case ' ': // Reset the transformation
            localTransform->setMatrix(osg::Matrix::identity());
            return true;
    }
} break;

Handle the W, A, S, D keys to move an object:

localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
Keyboard Interaction Demo
Mouse Interaction
- The mouse is a pointing device: use it to select objects in an AR scene
- OSG provides methods for ray-casting and intersection testing
- Returns an osg::NodePath (the path from the hit node all the way back to the root)
[Diagram: ray cast from the projection plane (screen) into the scene]
Mouse Interaction
case osgGA::GUIEventAdapter::PUSH: {
    osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
    osgUtil::LineSegmentIntersector::Intersections intersections;

    // Clear previous selections
    for (unsigned int i = 0; i < targets.size(); i++) {
        targets[i]->setSelected(false);
    }

    // Find new selection based on click position
    if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
        for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
             iter != intersections.end(); iter++) {
            if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                std::cout << "HIT!" << std::endl;
                target->setSelected(true);
                return true;
            }
        }
    }
} break;
Compute the list of nodes under the clicked position Invoke an action on nodes that are hit, e.g. select, delete
Mouse Interaction Demo
Proximity Techniques
Interaction based on:
- the distance between a marker and the camera
- the distance between multiple markers

Single Marker Techniques: Proximity
- Use the distance from camera to marker as an input parameter, e.g. lean in close to examine
- Can use the osg::LOD class to show different content at different depth ranges
Image: OpenSG Consortium
Single Marker Techniques: Proximity

// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f);   // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f);  // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f);      // Show the "near" node from 0cm to 20cm away

arTransform->addChild(lod.get());

Define depth ranges for each node
- Add as many as you want
- Ranges can overlap
Single Marker Proximity Demo
Multiple Marker Concepts
- Interaction based on the relationship between markers, e.g. when the distance between two markers decreases below a threshold, invoke an action
- Tangible User Interface
- Applications: memory card games, file operations
Multiple Marker Proximity
[Scene graph diagram: the Virtual Camera views Transform A and Transform B; each transform holds a switch (Switch A, Switch B) with two children (Model A1/A2 and Model B1/B2). While Distance > Threshold, each switch shows Model A1 / Model B1.]
Multiple Marker Proximity
[Scene graph diagram, continued: when Distance <= Threshold, the switches change to show Model A2 / Model B2.]
Multiple Marker Proximity
virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
    if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
        if (mMarkerA->valid() && mMarkerB->valid()) {
            osg::Vec3 posA = mMarkerA->getTransform().getTrans();
            osg::Vec3 posB = mMarkerB->getTransform().getTrans();
            osg::Vec3 offset = posA - posB;
            float distance = offset.length();

            if (distance <= mThreshold) {
                if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
            } else {
                if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
            }
        }
    }
    traverse(node, nv);
}
Use a node callback to test for proximity and update the relevant nodes
Multiple Marker Proximity
Paddle Interaction
- Use one marker as a tool for selecting and manipulating objects (tangible user interface)
- Another marker provides a frame of reference
  - A grid of markers can alleviate problems with occlusion
MagicCup (Kato et al.)    VOMAR (Kato et al.)

Paddle Interaction
- Often useful to adopt a local coordinate system
  - Allows the camera to move without disrupting Tlocal
  - Places the paddle in the same coordinate system as the content on the grid
  - Simplifies interaction
- osgART computes Tlocal using the osgART::LocalTransformationCallback
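A minimal sketch of the underlying math (not the osgART implementation): given marker-to-camera transforms for the base grid and the paddle, the paddle's local pose can be computed with plain OSG matrix operations. The computeLocalTransform() helper below is a hypothetical name for illustration.

#include <osg/Matrix>

// Paddle pose in the base (grid) marker's coordinate system.
// basePose, paddlePose: marker-to-camera transforms reported by the tracker.
// Tlocal = Tpaddle * inverse(Tbase)   (OSG row-vector convention)
osg::Matrix computeLocalTransform(const osg::Matrix& basePose,
                                  const osg::Matrix& paddlePose)
{
    return paddlePose * osg::Matrix::inverse(basePose);
}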
Tilt and Shake Interaction
Detect types of paddle movement:
- Tilt: gradual change in orientation
- Shake: short, sudden changes in translation
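A minimal sketch (not the course code) of how these two motions might be distinguished from successive paddle poses; the thresholds and the classifyPaddleMotion() helper are illustrative assumptions.

#include <osg/Matrix>
#include <osg/Quat>
#include <osg/Vec3d>

enum PaddleMotion { MOTION_NONE, MOTION_TILT, MOTION_SHAKE };

// Classify paddle movement from two successive paddle poses (hypothetical helper).
// Tilt  : gradual change in orientation with little translation.
// Shake : short, sudden change in translation.
PaddleMotion classifyPaddleMotion(const osg::Matrix& previous, const osg::Matrix& current)
{
    const double shakeThreshold = 30.0;   // mm of translation per frame (assumed value)
    const double tiltThreshold  = 0.05;   // radians of rotation per frame (assumed value)

    double translation = (current.getTrans() - previous.getTrans()).length();

    osg::Quat deltaRot = current.getRotate() * previous.getRotate().inverse();
    double angle; osg::Vec3d axis;
    deltaRot.getRotate(angle, axis);

    if (translation > shakeThreshold) return MOTION_SHAKE;
    if (angle > tiltThreshold)        return MOTION_TILT;
    return MOTION_NONE;
}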
Building Tangible AR Interfaces with ARToolKit
Required Code
- Calculating camera position: range to marker
- Loading multiple patterns/models
- Interaction between objects: proximity, relative position/orientation
- Occlusion: stencil buffering, multi-marker tracking
Tangible AR Coordinate Frames
Local vs. Global Interactions
Local
- Actions determined from a single camera-to-marker transform
- shaking, appearance, relative position, range

Global
- Actions determined from two relationships: marker-to-camera and world-to-camera coordinates
- Marker transform determined in world coordinates
- object tilt, absolute position, absolute rotation, hitting
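As a concrete illustration of the global case, the sketch below expresses a marker's pose in the world (base) coordinate frame using ARToolKit's matrix utilities arUtilMatInv() and arUtilMatMul(); the function name, variable names, and the idea of a dedicated base marker are assumptions for illustration, not code from the course samples.

#include <AR/ar.h>

/* Express a marker's pose in the world (base) coordinate frame.
   Both inputs are marker-to-camera transforms from arGetTransMat(). */
void getWorldTransform(double base_trans[3][4], double marker_trans[3][4],
                       double world_trans[3][4])
{
    double base_inv[3][4];

    /* T(base<-marker) = inverse( T(camera<-base) ) * T(camera<-marker) */
    arUtilMatInv(base_trans, base_inv);
    arUtilMatMul(base_inv, marker_trans, world_trans);
}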
Range-based Interaction
Sample File: RangeTest.c

/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
Loading Multiple Patterns
Sample File: LoadMulti.c
- Uses object.c to load

Object Structure
typedef struct {
    char   name[256];
    int    id;
    int    visible;
    double marker_coord[4][2];
    double trans[3][4];
    double marker_width;
    double marker_center[2];
} ObjectData_T;
Finding Multiple Transforms
Create the object list:
    ObjectData_T *object;
Read in objects - in init():
    read_ObjData( char *name, int *objectnum );
Find transforms - in mainLoop():
    for( i = 0; i < objectnum; i++ ) {
        ..Check patterns
        ..Find transforms for each marker
    }

Drawing Multiple Objects
Send the object list to the draw function:
    draw( object, objectnum );
Draw each object individually:
    for( i = 0; i < objectnum; i++ ) {
        if( object[i].visible == 0 ) continue;
        argConvGlpara(object[i].trans, gl_para);
        draw_object( object[i].id, gl_para );
    }
Proximity Based Interaction
Sample File – CollideTest.c
- Detect the distance between markers:
      checkCollisions(object[0], object[1], DIST)
- If distance < collide distance, then change the model / perform the interaction
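The slides do not reproduce the body of checkCollisions(); below is a minimal sketch of what such a test might look like, using the ObjectData_T structure above and the translation column of each marker's transform. The function body is an assumption, not the actual CollideTest.c source.

#include <math.h>

/* Hypothetical distance test between two tracked markers, based on the
   translation column of their 3x4 marker-to-camera transforms. */
int checkCollisions(ObjectData_T object0, ObjectData_T object1, double collide_dist)
{
    double dx, dy, dz, dist;

    if (!object0.visible || !object1.visible) return 0;

    dx = object0.trans[0][3] - object1.trans[0][3];
    dy = object0.trans[1][3] - object1.trans[1][3];
    dz = object0.trans[2][3] - object1.trans[2][3];
    dist = sqrt(dx*dx + dy*dy + dz*dz);

    return (dist < collide_dist) ? 1 : 0;
}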
Multi-marker Tracking
Sample File – multiTest.c
- Multiple markers to establish a single coordinate frame
- Reading in a configuration file
- Tracking from sets of markers
- Careful camera calibration
Sample File - Data/multi/marker.dat
Contains a list of all the patterns and their exact positions:

#the number of patterns to be recognized
6

#marker 1
Data/multi/patt.a
40.0
0.0 0.0
1.0000 0.0000 0.0000 -100.0000
0.0000 1.0000 0.0000   50.0000
0.0000 0.0000 1.0000    0.0000
…
MultiMarker Configuration File
- Pattern file
- Pattern width + coordinate origin
- Pattern transform relative to the global origin
Camera Transform Calculation
- Include <AR/arMulti.h>
- Link to libARMulti.lib

In mainLoop(), detect markers as usual:
    arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num)

Use the MultiMarker function:
    if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
        argSwapBuffers();
        return;
    }
Paddle-based Interaction
Tracking a single marker relative to a multi-marker set
- the paddle contains a single marker
Paddle Interaction Code
Sample File – PaddleDemo.c
Get the paddle marker location + draw the paddle before drawing the background model:

    paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

    /* draw the paddle */
    if( paddleInfo->active ){
        draw_paddle( paddleInfo );
    }

draw_paddle uses a stencil buffer to increase realism (a sketch of the general idea follows).
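The slides do not show the stencil code itself; this is a minimal OpenGL sketch of the general idea, where the real paddle's region is written into the stencil buffer so virtual content is not drawn over it. draw_paddle_shape() and draw_virtual_objects() are hypothetical helpers, not functions from PaddleDemo.c.

#include <GL/gl.h>

void draw_paddle_shape(void);      /* hypothetical: draws the paddle outline */
void draw_virtual_objects(void);   /* hypothetical: draws the virtual content */

void draw_with_paddle_occlusion(void)
{
    glEnable(GL_STENCIL_TEST);

    /* 1. Mark the paddle region in the stencil buffer (no colour output). */
    glStencilFunc(GL_ALWAYS, 1, 1);
    glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    draw_paddle_shape();
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    /* 2. Draw virtual content everywhere except the paddle region, so the
          real paddle appears to occlude the virtual objects. */
    glStencilFunc(GL_NOTEQUAL, 1, 1);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    draw_virtual_objects();

    glDisable(GL_STENCIL_TEST);
}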
Paddle Interaction Code II
Sample File – paddleDrawDemo.c
Finds the paddle position relative to the global coordinate frame:
    setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4])

Sample File – paddleTouch.c
Finds the paddle position:
    findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);
Checks for collisions:
    checkCollision(&curPaddlePos, myTarget[i].pos, 20.0)
General Tangible AR Library
command_sub.c, command_sub.h
Contains functions for recognizing a range of different paddle motions:
    int check_shake( );
    int check_punch( );
    int check_incline( );
    int check_pickup( );
    int check_push( );

E.g. to check the angle between the paddle and the base:
    check_incline(paddle->trans, base->trans, &ang)
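A possible usage pattern in mainLoop() is sketched below; the return-value convention, the degree threshold, and the drop_object() action are illustrative assumptions, not taken from command_sub.c.

double ang = 0.0;

/* If the paddle is inclined relative to the base marker set,
   read back the angle and trigger an action past a threshold. */
if (check_incline(paddle->trans, base->trans, &ang)) {
    if (ang > 20.0) {
        drop_object();   /* hypothetical application action */
    }
}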