TRANSCRIPT
A Projector Based Hand-held Display System
LEUNG Man Chuen, WONG Kin Hong
1
Outline
• Introduction
• Previous work
• System overview
• System detail
• Implementation and Experimental Results
• Limitations and discussions
• View dependent projection and application
• Conclusion
• Question and Answer Session
2
Introduction
3
Introduction
• Aim of this project:
– Build a movable hand-held display system
• Uses an ordinary cardboard as the display
• Uses a projector to project display content
• Desired result:
4
Introduction
• Motivation:
– Popular forms of graphical interfaces are usually in fixed positions
• Monitor
• Projection onto a wall / screen
– A hand-held display device gives users greater freedom of control
• Viewing angle, viewing distance
5
Introduction
• Motivation (cont'd)
– Screens of hand-held display devices are small
• Mobile phone, portable DVD player, iPod…
– Increasing the size of the screen => heavy!
– Electronic paper
• Technology not very mature yet
• Not available at low cost
– Therefore, an alternative solution is needed!
6
Introduction
• Our approach
– Use a projector-camera pair to serve as the input and output devices
– Use a white cardboard as the movable display surface
• Can be lightweight
• Flexible screen size
7
Introduction
• Contributions
– Proposed a computer vision approach to enable the projector to project display content precisely onto a movable ordinary cardboard in 3D space
• Projector-camera calibration method to find the projection matrix from a 3D point in camera coordinates to a 2D projector image coordinate
• Real-time quadrangle detection and tracking algorithm to detect and track the display in real time
• Experimental results show that projection content can be registered onto the hand-held display precisely
• This work has been published in the Proc. of CVPR 2009
8
• Contributions (cont'd)
– Proposed an application with view dependent projection
• "Hand-held 3D Model Viewer"
• Use a head-mounted camera to track the viewing position
• Project views of a 3D model according to the user's view
• Give the user a perception of a real object sitting on the display
9
Previous work
10
Previous work
• Sensor based methods
– Dynamic Shader Lamp [Bandyopadhyay et al], magnetic sensor
– Free Form Projection Display [Kondo et al], magnetic sensor
– PaperWindows [Holman et al], infrared reflective marker
– Foldable Interactive Display [Lee et al], IR emitter
– Projector Based Tracking [Lee et al], light sensors
– Disadvantages:
• Need to attach sensors or special markers onto the projection surface
• Accurate tracking systems are very expensive
– Magnetic location tracker
– Vicon Motion Capturing System, for infrared reflective markers
11
Previous work
• Vision based methods
– Portable Display Screen [Borkowski et al]
• Put the rectangular screen at a particular position for detection
• Use a Kalman filter to track the motion
• Register the image to the portable screen using homographies
• A homography is a 3x3 matrix that maps one projective plane to another, taking straight lines to straight lines
• Disadvantages:
– Hpc is calibrated on an initial plane, not updated at each time step
– Mismatch when the screen moves out of the calibration plane
12
Hps: projector-screen homography
Hcs: camera-screen homography
Hpc: projector-camera homography
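The homography chain above can be sketched numerically. This is a minimal illustration with toy matrices (not calibration values from the system): the projector-screen homography is obtained by going projector → camera → screen.

```python
import numpy as np

def compose(Hcs, Hpc):
    """Projector-screen homography: projector -> camera -> screen,
    i.e. Hps = Hcs @ Hpc."""
    return Hcs @ Hpc

def apply_homography(H, pt):
    """Map a 2D point through a 3x3 homography (with homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Toy homographies: a translation (projector->camera) and a scale (camera->screen).
Hpc = np.array([[1.0, 0.0, 5.0],
                [0.0, 1.0, 3.0],
                [0.0, 0.0, 1.0]])
Hcs = np.diag([2.0, 2.0, 1.0])

Hps = compose(Hcs, Hpc)
print(apply_homography(Hps, (1.0, 1.0)))  # maps (1, 1) -> (12, 8)
```

Because a homography is linear in homogeneous coordinates, straight lines stay straight after the mapping, which is what lets the screen be registered plane-to-plane.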
Previous work
• Vision based methods (cont'd)
– Active Pursuit Tracking [Gupta et al]
• Calibrate Hpc on an initial plane
• Put at least four color markers onto the projection screen
• Detect these four color markers in the camera image
• Project four virtual markers onto the projection screen according to the detected real markers and the previous Hpc
• Update Hpc using the image points of the virtual markers
• Disadvantages:
– Color markers need to be put on the projection screen
– Virtual markers need to be projected onto the screen constantly
– Makes the projection screen unnecessarily large
– Once tracking is lost, must start from the calibration plane again
13
System overview
14
System overview
15
[System pipeline figure: an offline calibration of the projector-camera pair produces Gp; each image frame, together with the cardboard size and camera parameters, feeds real-time quadrangle detection and tracking, which outputs the relative pose (R, T); the display content is then pre-warped and projected.]
System detail
16
Part 1: New projector-camera pair calibration
• What to calibrate?
17
S · [U, V, 1]^T = K · [R | T] · [Xc, Yc, Zc, 1]^T = Gp · [Xc, Yc, Zc, 1]^T

where K = [[f·du, 0, up], [0, f·dv, vp], [0, 0, 1]] contains the projector intrinsics, and Gp = K · [R | T] is the 3x4 projection matrix from a 3D point (Xc, Yc, Zc) in camera coordinates to a 2D projector image point (U, V).
Part 1: Projector-camera pair calibration
S · [U, V, 1]^T = Gp · [Xc, Yc, Zc, 1]^T
18
1. Create image points in known positions
2. Project onto a cardboard with known size
3. Capture image using camera
4. Calculate 3D positions of the corners
5. Calculate 3D positions of projected points
6. Collect a set of corresponding points
7. Solve Gp
Part 2: Quadrangle detection and tracking
• Line feature extraction
– Line detector based on the Hough transform in OpenCV
• Open source Computer Vision library
– We detect line segments
19
[Figure: Hough transform in OpenCV — input image → edge map → line feature map]
Part 2: Quadrangle detection and tracking
• Quadrangle detection
– Select long line segments
– Check every combination of 4 segments
– Find the formed quadrangle
– Check criteria
• Based on properties of a quadrangle: angles, ratio of side lengths…
20
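The criteria check can be sketched as follows. This is a minimal illustration, with made-up angle and side-ratio thresholds (the talk does not give its exact values): a candidate quadrangle is kept only if its interior angles and side-length ratios are plausible for a projected rectangle.

```python
import numpy as np

def quad_is_plausible(corners, angle_range=(40.0, 140.0), max_side_ratio=4.0):
    """Accept a quadrangle (4 corners in order) only if its side-length
    ratio and interior angles fall within the given bounds."""
    corners = np.asarray(corners, dtype=float)
    sides = [corners[(i + 1) % 4] - corners[i] for i in range(4)]
    lengths = [np.linalg.norm(s) for s in sides]
    if max(lengths) / min(lengths) > max_side_ratio:
        return False
    for i in range(4):
        a, b = -sides[i - 1], sides[i]  # the two edges meeting at corner i
        cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        if not (angle_range[0] <= angle <= angle_range[1]):
            return False
    return True

print(quad_is_plausible([(0, 0), (100, 0), (100, 60), (0, 60)]))  # True
print(quad_is_plausible([(0, 0), (100, 0), (10, 5), (0, 60)]))    # False
```

Filtering candidates this way prunes most of the spurious 4-segment combinations before any pose computation.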
Part 2: Quadrangle detection and tracking
• Quadrangle tracking
– Using a particle filter
• A state estimation algorithm
– Two main components
• State dynamic model
• Observation model
– To avoid degeneracy
• Re-sampling
– Output
• The weighted sum of samples
21
[Figure: particle filter loop — initial sample set → state update → updated samples → evaluation function → weighted samples → re-sampling → new sample set]
Part 2: Quadrangle detection and tracking
• Quadrangle tracking (cont'd)
– Tracking target: relative pose (R, T) of the camera and the cardboard
• Calculated from the detection result
– State at time k:
– State dynamic model
• Random walk [Pupilli] based on a uniform density
– Observation
• Line segment feature map of the current frame
22
qk = (rx, ry, rz, tx, ty, tz)
rx, ry, rz : rotation about the x-, y- and z-axes
tx, ty, tz : translation along the x-, y- and z-axes

p(qk | qk-1) = U(qk-1 − e, qk-1 + e)
p : density function
qk-1 : state at time k−1
qk : state at time k
U : uniform density function
e : uncertainty about movement
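The uniform random-walk dynamics and the re-sampling step can be sketched as below. The uncertainty bounds e are illustrative values; the particle count of 80 matches the experiments, but the weights here are uniform placeholders for the line-feature evaluation.

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative uncertainty e: radians for (rx, ry, rz), mm for (tx, ty, tz).
e = np.array([0.05, 0.05, 0.05, 10.0, 10.0, 10.0])

def propagate(particles):
    """Random-walk dynamic model: q_k ~ U(q_{k-1} - e, q_{k-1} + e)."""
    return particles + rng.uniform(-e, e, size=particles.shape)

def resample(particles, weights):
    """Multinomial re-sampling to avoid degeneracy of the particle set."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

particles = np.zeros((80, 6))             # 80 particles of (rx,ry,rz,tx,ty,tz)
particles = propagate(particles)
weights = np.ones(80) / 80                # placeholder: real weights come from
particles = resample(particles, weights)  # evaluating the line feature map
estimate = np.average(particles, axis=0, weights=weights)  # weighted-sum output
```

The estimate is the weighted sum of the samples, matching the output described on the previous slide.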
Part 2: Quadrangle detection and tracking
• Evaluation method
– Re-projection
23
Part 2: Quadrangle detection and tracking
• Evaluation method
– Closest line matching and criteria checking
24
Part 2: Quadrangle detection and tracking
• Evaluation method
– Particle content replacement
25
Part 2: Quadrangle detection and tracking
• Evaluation method
– Weight update
• According to quadrangle detection criteria
• Re-sampling
• Handling tracking failure
– Determination of tracking failure
• Use the distribution of the particles
• If the variance is greater than a certain threshold
– Tracking is lost!
– Perform detection again
26
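The failure test above can be sketched in a few lines. The variance threshold is an illustrative value, and the two particle clouds are synthetic stand-ins for a converged and a diverged tracker.

```python
import numpy as np

def tracking_lost(particles, threshold=1.0):
    """Declare tracking lost when the particle cloud has spread out:
    the largest per-dimension variance exceeds the threshold."""
    return bool(np.max(np.var(particles, axis=0)) > threshold)

tight = np.random.default_rng(2).normal(0.0, 0.01, size=(80, 6))   # converged
spread = np.random.default_rng(2).normal(0.0, 10.0, size=(80, 6))  # diverged
print(tracking_lost(tight))   # False: particles agree, keep tracking
print(tracking_lost(spread))  # True: lost, fall back to full detection
```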
Part 3: Pre-warping and projection
• From tracking result
– Get 3D corner positions
• From calibration result
– Get projector image points
• Warp display content
• Project onto the plane
27
[Figure: the image to be displayed is warped into the projector image using the tracked corner positions]
Implementation and experimental results
28
System setup
• Projector camera pair– Projector: 1280x1024– Camera: 320x240
• Testing platform– Dual core processor at 2.16GHz– 1GB RAM
• White cardboard– 351mm x 300mm
29
Calibration of projector-camera pair
• Project one point at a time
• 32 images are collected
• Mark the points manually
• Sample images
30
Calibration of projector-camera pair
• Calibration error:
– About 4.6 pixels
31
[Figure: calibration error scatter plot, axes in pixels — O: projection points, +: re-projection points]
Detection and tracking experiments
32
• High tracking precision
• Robust to partial occlusions
• Robust to dense clutter
• Works well when both camera and cardboard are moving
Projection results
• Display content can be projected precisely onto the cardboard
• Matches the cardboard in different poses
33
Processing time
• Processing time for line feature extraction
– ~16 ms
• Processing time of the tracking algorithm
– Depends on the number of line segments used
– Depends on the number of particles
34
Processing time
• Processing time of the tracking algorithm
– About 30 ms/frame for 80 particles and 20 line segments
• About 30 fps
• Processing time for image warping
– Negligible
• Total processing time
– ~46 ms
– ~20 fps
35
Projection latency
• Estimation method:
– Move the cardboard from left to right
– Stop sharply, wait until the projected image matches the edge
– Repeat several times
– Use an external video camera to capture the process
• Count the number of frames between stopping the motion and the projected image matching the edge of the cardboard
• Average about 5 frames, 167 ms
36
View Dependent Projection and Application
37
View dependent projection
• Project according to the view of the user
• Head pose tracking
– Mount a magnetic location tracker on the head of the user
• [Cruz-Neira et al], [Kondo et al]
• Expensive
• Tracked pose is relative to the receiver, not to the cardboard directly
– Mount a fixed camera in front of the user
• Track the eye positions of the user
• Track a point model of the head
• Not very accurate and easily occluded
38
View dependent projection• Our approach
– Use a head mounted camera
– Apply the proposed quadrangle detection and tracking algorithm
– Advantage: accurate, and obtains the relative pose directly
View dependent projection
• Application: Hand-held 3D Model Viewer
– Project different views of a 3D model according to head poses
– User can change the view by moving the display or his/her head
– Projection is distortion compensated
– Give the user an illusion of a real object sitting on the display
• Motivation
– Traditional way of interacting with a virtual model:
• Mouse and keyboard
– Natural way of interacting with a real object:
• Hold the object in the hands and interact with it directly
– To simulate such a natural way of interacting with objects
Design detail
• Create virtual scene
– Implemented using OpenGL
– A virtual object sitting on a virtual cardboard
• Create user view image
– Set a virtual camera
– Apply tracked head pose
– Generate virtual camera image
– Equivalent to desired user view
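Applying the tracked head pose to the virtual camera amounts to building a world-to-camera view matrix. The talk implements the rendering in OpenGL; the sketch below only builds the 4x4 view matrix, with an illustrative head pose (R, T) relative to the cardboard.

```python
import numpy as np

def view_matrix(R, T):
    """World-to-camera transform for a camera at pose (R, T) in the
    cardboard's (world) frame: x_cam = R^T (x_world - T)."""
    V = np.eye(4)
    V[:3, :3] = R.T
    V[:3, 3] = -R.T @ T
    return V

R = np.eye(3)                  # illustrative head orientation
T = np.array([0.0, 0.2, 0.5])  # illustrative head position (metres)
V = view_matrix(R, T)

world_pt = np.array([0.0, 0.0, 0.0, 1.0])  # a point on the virtual cardboard
print(V @ world_pt)  # the same point in the virtual camera's frame
```

Rendering the virtual scene through this camera reproduces the user's view, which is then warped into the projector image as in Part 3.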
Design detail
• Generate projector image
Experimental results
• Use another webcam as the head-mounted camera
• Move the cardboard and the camera to different relative poses
• Capture the view of this webcam• Result video:
Discussion
44
Discussion
• Limitation on projection resolution
– Down-sampled in the image warping process
– Possible solutions:
• Warp physically (needs a 6-DOF optical redirect system)
• Use a projector with higher resolution
• Limitation on depth of field
– About 30 cm in our prototype
– Possible solution:
• Use multiple projectors
45
Discussion
• Handling projected light
– Projected light may affect tracking stability
– Solved using properties of the camera:
• Set the contrast and gain to high values
• Projected light on the cardboard is brighter → saturated
• Hand-held 3D Model Viewer
– Limitation on speed of motion
• Fast motion will break the interaction experience because of projection latency
• Acceptable error:
– ~4 cm in translation, ~5 degrees in rotation
• Limited motion:
– ~24 cm/second, ~30 degrees/second
46
Discussion
• Limitation on viewing angle
Discussion
• Possible extensions
– Use multiple cardboards
– Project onto a cube instead of a cardboard
Conclusion
49
Conclusion
• Proposed a projector-based hand-held display system
• Proposed a novel computer vision approach to guide the projector to project onto a hand-held display in 3D space
– Calibration method to find the projection matrix from the 3D camera coordinate system to the 2D projector image coordinate system
– Robust quadrangle detection and tracking algorithm
– Precise, and runs in real time
• Implementation of the whole system
– Projects onto the cardboard correctly in real time
• An interactive application with view dependent projection is proposed
– "Hand-held 3D Model Viewer"
– Uses a head-mounted camera to obtain the user's view of the display
– Gives 3D perception to the user
50
Question and Answer Session
Thanks
51