EE148 3-D Photography - Class Project, Spring 1998

http://web.archive.org/web/20130514055353/http://www.multires.caltech.edu/teaching/courses/3DP/presentations/Kubicky/desc.htm

System Description

Overview

My 'rig' allows 3-D information to be recovered from a solid object using triangulation. Following a calibration procedure, a line-projecting laser is swept across the target object and a series of images (one per laser position) is stored on disk. Using this sequence of images and the parameters calculated during the calibration phase, 3-D data can be recovered.

The basic idea for this project came from the coursework covered during the term in EE148 and a paper by Jean-Yves Bouguet (hereafter referred to as JY) and Pietro Perona titled 3D Photography Using Shadows in Dual-Space Geometry. Detailed explanations of the math used throughout this discussion can be found in the paper.

Hardware

There are three main components to the system hardware: the host computer, the camera/tripod, and the laser/mechanical stage.

The Rig

Host Computer

Pretty much any modern PC clone will do. I used a laptop from Compaq (Armada 1550DMT, 32MB, Windows 95 - nice machine!), which makes the whole setup very portable.

Camera/Tripod


The camera is a Color Quickcam 2 from Connectix Corp. The Quickcam uses a 640x480 pixel CCD imager with an overlaid color filter. Interpolation is used to provide full-resolution (640x480), full-color (RGB) images. The Quickcam connects directly to the parallel port of the host computer and draws power from the host's keyboard port. (Note to Connectix: The Quickcam is a very nice product, but give me a monochrome 640x480 camera with better optics and I'd really be thrilled.)

It is essential that the camera be fixed in a particular position and orientation from the time the system is calibrated throughout all data acquisition, so at least a simple tripod is necessary. The Quickcam accepts a standard 1/4" camera tripod screw, so any small camera tripod should do. I picked up the one shown in the photos at a camera shop at the local mall for around $20.00.

Laser/Mechanical Stand

Most of my time went into designing and building the mechanical stand to hold the laser. There is obviously a lot of flexibility here in how this is designed, what parts are used, etc., but the main requirement is a stable platform with the ability to repeatably and accurately position the laser.

The general idea behind the stand can be seen in the photos. I built a small wooden structure out of 1"x4" and 1"x6" poplar from the local Home Depot (they can be very helpful in cutting the wood if you know what you want ahead of time). My design doesn't require any accurate cuts. A small stepper motor is mounted to the base, and gearing is used to produce a 10:1 reduction in angular motion between the motor and the shaft. The shaft, which is an 8.5" length of precision-ground 1/4" steel rod, is radially supported by two nylon sleeve bearings which are pressed into slightly undersized holes to hold the shaft snugly. I used a small piece of plastic under the lower bearing to provide a low-friction axial support for the shaft. Lining up the upper and lower bearings is probably the highest-accuracy operation required here, but even this only has to be good enough for the gearing to work well. The calibration procedure removes the need for any precise mechanical positioning - only repeatability and stability are required.

The laser is mounted to the shaft via a small block of wood that is drilled at the desired inclination angle and pressed onto the end of the shaft. A better design would allow adjustment of the angle between the laser and the shaft while still being rigid.

The stepper motor is an Astrosyn type 17PY-Q202-03 hybrid stepper. This is a fairly precise device, and no doubt was a bit pricey when new, but I picked some up for $2.00 each from a local surplus store (All Electronics). The motor has 384 steps per revolution, or a little less than 1 deg per step. I also picked up a surplus 6V, 500mA DC wall transformer from All Electronics to power the stepper-motor controller.

I purchased some small spur gears from PIC Design (800-243-6125, although service was poor and prices were high). I fitted the motor with an 18-tooth, 48-pitch gear, which couples to a 180-tooth, 48-pitch gear on the shaft. Thus I get less than 0.1 deg of angular motion for every step of the motor. The motor is fastened to the stand via long 4-40 screws that push up through the bottom. This arrangement allows the motor position to be adjusted so there is essentially no 'play' between the two gears (thus ensuring repeatability of positioning in both directions).
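To put numbers on that: the motor moves 360/384 = 0.9375 deg per step, and the 180:18 gear pair gives the 10:1 reduction, so each motor step turns the laser shaft by 0.9375/10 = 0.09375 deg - just under the 0.1 deg figure quoted above.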

I also built a small wooden stage to scan the objects on. The stage is used to calibrate the system, and the bottom and back side of it must be as nearly square (90 deg) as possible. It is a good idea to paint the stage white to provide better contrast during the calibration procedure.


Gearing detail

To control the stepper motor, I built a small controller using an AT90S2313 microcontroller from Atmel. This is one of the new 'AVR' family of micros from Atmel, and they are very nice parts indeed. I've worked with quite a few different families of microcontrollers (Microchip, Motorola, Intel), and these have been by far the most pleasant to work with in terms of assembly language and overall device architecture. A postscript file of the design can be found here.

The motor is driven via a Telcom TC4468 quad 1.5A MOSFET driver IC (push-pull type outputs). I have a standard RS-232 interface that allows the controller to connect to any standard PC serial port. The firmware for the controller, which allows stepping in either direction and also allows the user to set and return to a 'home' position, can be found here.
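The same commands that would be typed into a terminal program can also be sent from Matlab over the serial port. The settings and command strings below ('F40', 'H') are placeholders - the actual protocol is defined by the controller firmware linked above - so this is only a sketch of the idea:

    % Hypothetical sketch of commanding the stepper controller from Matlab.
    % The baud rate and the command strings are assumptions, not the real protocol.
    s = serial('COM1', 'BaudRate', 9600, 'Terminator', 'CR');
    fopen(s);
    fprintf(s, 'F40');    % step forward 40 steps (assumed command)
    pause(2);             % allow the move to complete
    fprintf(s, 'H');      % return to the stored 'home' position (assumed command)
    fclose(s);
    delete(s);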

I was very fortunate in finding an inexpensive source of high-quality laser 'line' pointers: World Star Technology. They sell focusable 5mW line modules at a number of fan angles for around US $90 (the next closest price for a similar unit was $300 from Edmund Scientific). Service from World Star Tech was excellent and the module has performed very well. The model I'm using is rated at 650nm, 45 deg fan angle, 3V supply.

Software

Several different programs must be used in careful coordination to operate the system and produce 3-D data. A single, integrated

application would have been terrific, but it just wasn't in the schedule.

Calibration

Image acquisition is done with the Connectix QPICT32 program that comes with the Quickcam. The camera is calibrated by laying a

grid of known dimensions (in this case, a checkerboard of 0.75x0.75" squares) on the horizontal plane. A single image is then

captured and stored at 640x480 resolution (1000's of colors mode seems to work well). A sample calibration image is shown below.


Calibration Image

Once the camera calibration is done, a sequence of laser calibration images must be taken. The bare white stage should be used here with the checkerboard pattern removed. The image capture can be accomplished most easily by setting the QPICT32 program to capture a sequence of images (with an appropriate base filename, such as 'calib') at fixed time intervals (say, 5 seconds). While this is happening, a terminal session (e.g., Hyperterminal) can be used to send commands to the stepper-motor controller to cause the motor to step a fixed number of steps (say, 40) between each image capture. When a sufficient number of images is stored, QPICT32 can be stopped.

The laser calibration and later data capture steps must be done with the room darkened. While capturing laser-illuminated images, the Quickcam's Auto-Brightness and Auto-Hue features should be disabled, and the camera parameters should be adjusted so the laser beam is the only thing the camera 'sees' in its field (with the room lights off). (Actually, some dimly illuminated pixels can be tolerated as they are subsequently filtered out.) Note that all images should be captured at full 640x480 resolution.

Data capture can be done with a piece of black cloth covering the stage. Take care not to disturb the position of the stage, the laser, or the camera/tripod while draping the cloth - the rig is now calibrated, and any movement would require re-calibration! After the cloth is draped, the target object should be placed on the stage and centered in the camera view. With the laser on, the beam may appear very clear against the black cloth. However, as long as the target object is sufficiently light in color, the beam will appear much brighter to the camera where it intersects the target. Note that brightness and contrast may need to be re-adjusted here to produce good results. With a suitable cloth background and properly adjusted camera parameters, a light-colored object can be scanned with essentially NO noise points.

Data capture should be done with a much smaller 'step' increment than the calibration was done with. I've used step values between 3

and 8 for the actual data capture (as opposed to 30-40 for calibration).

Once the data capture is done, all of the actual image processing is done via programs written in Matlab. Images are loaded into Matlab using the imread() function. The red plane is then converted to an array of doubles (green and blue are discarded) and used for all subsequent processing. The camera calibration program, calibration.m, uses the single camera calibration image to calculate the intrinsic camera parameters (focal lengths and distortion) as well as the location of the horizontal plane in 'camera-space'.
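As a minimal sketch of that loading step (the filename pattern here is an assumption, not the naming used by QPICT32):

    % Load the k-th image of a capture sequence and keep only the red plane.
    % The 'scan%03d.bmp' naming is hypothetical.
    k     = 1;
    fname = sprintf('scan%03d.bmp', k);
    rgb   = imread(fname);
    red   = double(rgb(:,:,1));    % red plane as doubles; green and blue are discarded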


Once the camera calibration is done, the laser/stage is calibrated using the sequence of calibration images. These images are merged into a single image (by generatecalimage.m) and then used to extract the vertical plane and omega values of the laser at each calibration position. The user selects four rows - two that contain the vertical-plane intersections and two that contain the horizontal-plane intersections. Getpeaklist.m then locates the peak values along each row at sub-pixel accuracy by first filtering and then fitting a parabola to each peak point and its two nearest neighbors. The recovered lines are intersected, and this set of intersection points is used to determine the vertical/horizontal plane intersection line using FitQuad.m. This intersection line is then used to determine the location of the vertical plane (which is, by design, orthogonal to the horizontal plane). At this point we have enough information to determine the vector (omega) that describes the position of the laser plane for each calibration position. The user is prompted for the number of motor steps between each calibration image, and the program uses this value to linearly interpolate a list of omega values for the intermediate motor step positions. This is all accomplished with the program genomegalist.m, which generates a display similar to this:

Laser Calibration Image

(Not sure why best-fit intersection line is above intersection points -

possibly a bug in my display routine.)
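The three-point parabola fit used for sub-pixel peak location can be written in a few lines. This is a sketch of the idea, not the actual getpeaklist.m, and it assumes the row has already been filtered:

    % Sub-pixel peak location along one image row: fit a parabola through the
    % brightest sample and its two neighbors and take the vertex.
    row = [1 2 5 9 6 3 1];              % toy (filtered) row so the sketch runs stand-alone
    [pk, c] = max(row);                 % integer column of the brightest pixel
    if c > 1 && c < length(row)
        ym = row(c-1);  y0 = row(c);  yp = row(c+1);
        dc  = 0.5 * (ym - yp) / (ym - 2*y0 + yp);   % vertex offset, within +/- 0.5 pixel
        col = c + dc;                   % sub-pixel column of the peak
    else
        col = c;                        % peak on the image edge; no refinement
    end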

Processing (3-D Recovery)

Once the calibration is complete and the data is acquired, processing to recover 3-D data is fairly straightforward. The program proc3d.m reads a sequence of images and extracts 3-D points from each file. The first image should have been taken with the laser at its 'home' position, and the step between subsequent images is specified by the user (images must currently be separated by equal numbers of steps).

Presently, only a single pixel per row is located and triangulated, which pretty much requires using the black cloth to remove background points as described above. For each row, the maximum-intensity pixel is recovered at sub-pixel (column) accuracy (as in laser calibration, above) by getpoints.m. These pixels are then triangulated and added to a buffer. The current buffer format does not lend itself well to mesh generation, but it would be easy enough to change this (given a little more time). It would also be possible to support multiple peaks per line (and thus allow, for example, inclusion of the vertical background plane, which may be desirable in some applications), although some thought would be necessary regarding how close two 'separate' peaks can be.
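The triangulation itself is a ray-plane intersection. Below is a minimal sketch under the dual-space convention of the JY/Perona paper (a plane with coordinate vector omega contains exactly the points X with dot(omega, X) = 1); the numbers and variable names are illustrative, not values from proc3d.m:

    % Triangulate one laser peak: intersect the camera ray through the pixel with
    % the laser plane for the current motor step. All values here are examples.
    fc = [600; 600];          % focal lengths in pixels (from camera calibration)
    cc = [320; 240];          % principal point (image center assumed)
    u  = 410.3;  v = 212;     % sub-pixel column and row of the laser peak
    ray   = [(u - cc(1))/fc(1); (v - cc(2))/fc(2); 1];   % pixel ray in camera coordinates
    omega = [0.02; -0.01; 0.005];                        % laser-plane vector for this step
    X = ray / dot(omega, ray);                           % 3-D point where the ray meets the plane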

Processing Update (Mesh Generation):

I modified proc3d.m to produce a new file proc3dmesh.m that processes the file sequence and generates a mesh at the same time. (Support files are getpointsmesh.m, getdisksq.m, and addtriangle.m.) It's easy to do here because all we need to look at is the 3-D point list for the current laser step and the previous one.

There is one parameter that can be tweaked in addtriangle.m to set the maximum 'aspect ratio' of a triangle to add to the mesh, and this value needs to be set relatively high if the data is taken with a large number of steps between each image. Quadrangles are split into triangles on the basis of the minimum diagonal, and no hole-filling or smoothing is performed - this was put together in a bit of a hurry!
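A sketch of that quad-splitting rule between the point lists of two adjacent laser steps (this shows the idea, not the actual proc3dmesh.m/addtriangle.m; the edge-length-ratio test is my interpretation of the 'aspect ratio' parameter):

    % prev and curr are Nx3 lists of 3-D points, one per image row, from two
    % adjacent laser steps. Each quad is split along its shorter diagonal and a
    % triangle is discarded if its edge lengths are too unequal.
    maxAspect = 10;                         % maximum edge-length ratio (tunable)
    prev = [0 0 0; 0 1 0; 0 2 0];           % toy data so the sketch runs stand-alone
    curr = [1 0 0; 1 1 0.2; 1 2 0];
    tris = [];                              % each row holds the three vertices of one triangle
    for i = 1:size(prev,1)-1
        A = prev(i,:);  B = prev(i+1,:);  C = curr(i,:);  D = curr(i+1,:);
        if norm(A - D) <= norm(B - C)       % split quad A-B-D-C along its shorter diagonal
            cand = {[A; B; D], [A; D; C]};
        else
            cand = {[A; B; C], [B; D; C]};
        end
        for k = 1:2
            T = cand{k};
            e = [norm(T(1,:)-T(2,:)), norm(T(2,:)-T(3,:)), norm(T(3,:)-T(1,:))];
            if max(e) / max(min(e), eps) < maxAspect
                tris = [tris; T(1,:) T(2,:) T(3,:)];   % keep this triangle
            end
        end
    end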

To see how well it all worked, go on to the results.