
POME: A mobile camera system for accurate indoor pose

Paul Montgomery & Andreas Winter

November 2, 2016


ICT – Intelligent Construction Tools

• A 50-50 joint venture between Trimble and Hilti

• Vision: revolutionize the way construction is done

– Many inefficiencies persist on the construction site

• The central technical problem: accurate and robust indoor positioning


Overview

1. ICT and the construction market

2. A brief history of indoor positioning

3. POME concept, theory of operation, system tradeoffs

4. POME error budget

5. System components

6. The usual problems

7. POME accuracy

8. The future


Construction Site


Construction tools today

• A precision instrument (not a tool)

• 5 arc sec sensor

– ~2.5 mm at 100 m (horizontal, vertical)

• $30-60K (expensive)

• Requires experience to set up and use

– careful installation on a stable tripod

– correct referencing

• Single user

• Subject to line of sight occlusion

• Issues in tracking at close range

• Most sites still use traditional tools for most tasks


Requirements

• Accuracy (< 6 mm)

• Robustness

– to occlusion

– to drop

– to dust/dirt

• Cost

– BOM <= $400

• Ease of installation

– No cables

– Fast and reliable installation

• Room size > 30 meters

• Challenges

– rapidly changing environment

– variable lighting conditions

– many reflective surfaces


Existing Solutions


Hawk-Eye, which costs $100,000 and can pinpoint a ball to within 5 millimeters.


Why POME?

• POME = Position & Orientation Measurement Engine

• Multi User

• Indoor (+ possibly outdoor)

• Low cost & fully solid state

• POME inside == GPS outside

– Similar weight & volume

– Similar cost

– Similar update rate

– Similar accuracy

• Like GPS, enables many applications, e.g.:

– Staked layout

– Projection systems

– Robotic systems

– Augmented Reality


POME Applications


Principle of Operation

• Measure angles between known points

• Use redundant measurements

• Use least squares to solve a set of nonlinear equations

• Q: How accurately can you measure angles with a (wide angle) camera?


[Figure: a simplified 2D example showing the angles Θ measured to known points A and B, and the effect of an angle error δΘ.]


2D example: intersection of 2 circles

• Nonlinear problem

• Intersection of 2 circles gives candidate solutions X & Y

• Positions of A,B,C,D must be known

• Uncertainty in angle measurements results in a covariance ellipse


[Figure: the angles Θ_AB and Θ_CD subtended between the known points A, B and C, D each define a circle; the two circles intersect at the candidate solutions X and Y.]
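The least-squares idea can be sketched in a few lines. The following minimal 2D example simulates subtended-angle measurements to pairs of known targets with roughly 50 arcsec (about 0.2 pixel) of noise and recovers the position with a nonlinear least-squares solver. The target coordinates, pairing scheme, initial guess, and noise level are illustrative assumptions, not the POME implementation.

```python
# Minimal 2D sketch of position from subtended angles between known targets.
# Coordinates, pairs, and noise level are illustrative assumptions only.
import numpy as np
from scipy.optimize import least_squares

targets = {                      # known 2D target positions (metres)
    "A": np.array([0.0, 0.0]),
    "B": np.array([10.0, 0.0]),
    "C": np.array([10.0, 10.0]),
    "D": np.array([0.0, 10.0]),
}
pairs = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A")]   # redundant measurements

def subtended_angle(p, t1, t2):
    """Angle at position p between the rays toward targets t1 and t2."""
    v1, v2 = t1 - p, t2 - p
    c = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(c, -1.0, 1.0))

true_pos = np.array([3.0, 4.0])
rng = np.random.default_rng(0)
sigma = np.deg2rad(50.0 / 3600.0)                          # ~50 arcsec of angle noise
meas = np.array([subtended_angle(true_pos, targets[a], targets[b])
                 for a, b in pairs]) + rng.normal(0.0, sigma, len(pairs))

def residuals(p):
    """Predicted minus measured subtended angles at candidate position p."""
    return np.array([subtended_angle(p, targets[a], targets[b])
                     for a, b in pairs]) - meas

sol = least_squares(residuals, x0=np.array([5.0, 5.0]))    # start at the room centre
print("estimated position:", sol.x)                        # close to true_pos
```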


Error Budget

• Require 0.2 pixel (1 sigma) measurement accuracy with multiple cameras

• 1 pixel = 2.2 µm

• Error contributors:

– Achievable calibration accuracy

· Lenses

· mechanics

– Mechanical stability

– Centroid determination with:

· saturated signals

· weak signals

– Number and geometry of targets

– Accuracy of target survey


System Design Considerations

• Number of cameras

• Arrangement of cameras for best F.O.V.

• Type of image sensors

– Number / size of pixels

– Rolling / global shutter

– Color / monochrome

– Dynamic range of pixels

• Type of lens

– projection function

– Image sensor matching

• Type of target LEDs

– Visible / I.R., power, pattern

• Image processing considerations

– Image processing bandwidth

– Power

• Cost !!


Early concept for 3-camera overlapping F.O.V.


Active Targets

• Transmit at 850 nm (near IR)

• Approximately 350 mW

• Modulated intensity


Projection from object space to image space

• Camera: a projection from object space to image space (x,y,z) -> (u,v)

• 1-to-1 mapping of rays (unit vectors) to image-space points (u,v)

• Point of light becomes a “blob”

• A non-trivial mapping

• For pose calculation we need to convert from image space points to rays (angles)

• Mapping function is different for every camera => need to calibrate

[Figure: camera projection geometry: a point (x, y, z) in object space maps through the lens to an image-space point (u, v) on the image sensor; θ is the angle of the incoming ray from the optical axis and φ its azimuth about the axis.]
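To make the mapping concrete, here is a minimal sketch of an ideal f-theta camera model with a forward projection (object-space point to pixel) and its inverse (pixel to unit ray). The pixel scale and principal point are hypothetical placeholders; a real camera additionally needs the per-camera calibration discussed on the later slides.

```python
# Minimal sketch of an ideal f-theta mapping between rays and pixels.
# The pixel scale and principal point below are hypothetical placeholders,
# not the calibrated parameters of the actual POME cameras.
import numpy as np

F_PIX_PER_RAD = 14.0 * 180.0 / np.pi     # ~14 pixels/degree expressed per radian
CX, CY = 1024.0, 1024.0                  # assumed principal point (pixels)

def project(xyz):
    """Object-space point (x, y, z) -> image point (u, v) under R = f * theta."""
    x, y, z = xyz
    theta = np.arctan2(np.hypot(x, y), z)          # angle from the optical axis
    phi = np.arctan2(y, x)                         # azimuth about the axis
    r = F_PIX_PER_RAD * theta                      # f-theta radial distance (pixels)
    return CX + r * np.cos(phi), CY + r * np.sin(phi)

def unproject(u, v):
    """Image point (u, v) -> unit ray in camera coordinates (the inverse mapping)."""
    du, dv = u - CX, v - CY
    theta = np.hypot(du, dv) / F_PIX_PER_RAD
    phi = np.arctan2(dv, du)
    s = np.sin(theta)
    return np.array([s * np.cos(phi), s * np.sin(phi), np.cos(theta)])

u, v = project((1.0, 2.0, 5.0))
ray = unproject(u, v)                    # unit vector along the (1, 2, 5) direction
```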


Fisheye (f-theta) lens projection

• For a large F.O.V., a pinhole camera needs a very large image sensor

• F-theta projection => equal angle increment maps to equal number of pixels

• Camera is an angle measuring sensor


[Figure: pinhole projection vs. f-theta projection. For a ray at angle θ from the optical axis and focal length f, the pinhole model gives an image radius R = f · tan θ, while the f-theta model gives R = f · θ.]
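A quick worked comparison, using the 160° image circle quoted on the next slide (an 80° half-angle), shows why the pinhole model is impractical at this field of view:

```latex
R_{\text{pinhole}} = f \tan 80^{\circ} \approx 5.67\,f
\qquad\text{vs.}\qquad
R_{f\text{-}\theta} = f \cdot \frac{80\pi}{180} \approx 1.40\,f
```

For the same focal length, the pinhole projection would need roughly four times the image radius.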


F-theta lens

• Equal angle is mapped to equal distance on the image sensor

• Camera measures angles

• Our cameras have ~14 pixels/degree

• => 1 pix = 250 arcsec

• => 0.2 pix = 50 arcsec

• ~ 5 mm @ 20 meters


[Figure: the 160-degree image circle relative to the image sensor.]
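The slide's numbers follow directly from the ~14 pixels/degree scale (the slide rounds 257 and 51 arcsec to 250 and 50):

```latex
\frac{3600\ \text{arcsec/deg}}{14\ \text{px/deg}} \approx 257\ \text{arcsec/px},
\qquad
0.2\ \text{px} \approx 51\ \text{arcsec} \approx 2.5\times 10^{-4}\ \text{rad},
\qquad
20\ \text{m} \times 2.5\times 10^{-4}\ \text{rad} \approx 5\ \text{mm}
```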


Example blobs using off-the-shelf lenses

• Non-symmetry of the impulse response across the F.O.V.

– Strongly affects centroid determination accuracy (see the centroid sketch below)

• Non-uniformity of energy distribution across the F.O.V.

– Compounds the near/far problem
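Because several of the following slides (example blob, saturated blob) come down to how well a blob centroid can be determined, here is a minimal, generic sketch of sub-pixel centroiding by intensity-weighted mean. The window handling and the synthetic example are assumptions; the slides do not describe POME's actual centroiding algorithm or its treatment of saturated and weak blobs.

```python
# Minimal generic sketch of sub-pixel blob centroiding by intensity-weighted mean.
# Not the POME algorithm; saturation and weak-signal handling are out of scope here.
import numpy as np

def blob_centroid(window, background=0.0):
    """Return the (u, v) centroid of a small image window around a detected blob."""
    w = np.clip(window.astype(float) - background, 0.0, None)   # remove background
    total = w.sum()
    if total == 0.0:
        raise ValueError("empty blob window")
    vs, us = np.mgrid[0:w.shape[0], 0:w.shape[1]]               # pixel row/col grids
    return (us * w).sum() / total, (vs * w).sum() / total

# Example: a synthetic 7x7 Gaussian blob centred near column 3.4, row 2.8
yy, xx = np.mgrid[0:7, 0:7]
blob = 200.0 * np.exp(-((xx - 3.4) ** 2 + (yy - 2.8) ** 2) / 2.0)
print(blob_centroid(blob))                                      # approx (3.4, 2.8)
```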


Custom Optics

• DSL627 optimized lens


[Figure: lens F.O.V. map showing the +30°, 0°, and −30° elevation circles and the F.O.V. boundary.]


Some “not very interesting” images


Example blob


Saturated blob


Calibration residual

• After removing an f-theta model, we are left with a residual

• Calibrate by fitting a function to the residual

• Blob centroid (u,v) -> inverse residual function -> inverse f-theta -> angles -> pose solution

• Lens and camera mechanics must be stable over time and temperature

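A minimal sketch of the "fit a function to the residual" step, assuming a low-order 2D polynomial correction in (u, v). The functional form and degree are illustrative assumptions, not the calibration model actually used in POME:

```python
# Minimal sketch of calibrating the residual left after removing the f-theta model.
# The polynomial form and degree are assumptions for illustration only.
import numpy as np

def fit_residual_correction(u, v, du, dv, degree=3):
    """Least-squares fit of residuals (du, dv) as polynomials in (u, v);
    returns a function that applies the inverse-residual correction."""
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack([u**i * v**j for i, j in terms])         # design matrix
    coef_u, *_ = np.linalg.lstsq(A, du, rcond=None)
    coef_v, *_ = np.linalg.lstsq(A, dv, rcond=None)

    def correct(uq, vq):
        Aq = np.column_stack([np.asarray(uq)**i * np.asarray(vq)**j for i, j in terms])
        return uq - Aq @ coef_u, vq - Aq @ coef_v                # corrected centroids
    return correct

# Usage: measure calibration-target centroids, subtract the ideal f-theta
# prediction to get (du, dv), fit once per camera, then apply `correct`
# to every blob centroid before the inverse f-theta step.
```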


Shock and Vibration


• Significant testing has been done on the POME head to verify stability

– Below: test setup and results from shock and vibration testing, with positive results


Lens stability testing results


The Usual Problems

• Calibration + mechanical / thermal stability

• Range ratio (near/far problem)

• Registration (target determination)

• Interference rejection (strong signals)

• Multipath rejection (with and without direct ray)

• Initialization

• Data rate (~3000 Mb/s; see the illustrative breakdown below) / image processing / power

• Solution (target modulation, synchronization)
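As an order-of-magnitude illustration only (the slides do not give the sensor resolution, bit depth, or frame rate, so every factor below is an assumption), a multi-camera head reaches the quoted raw data rate quickly:

```latex
3\ \text{cameras} \times 4\times 10^{6}\ \text{px} \times 10\ \text{bit/px} \times 25\ \text{fps}
= 3\times 10^{9}\ \text{bit/s} = 3000\ \text{Mb/s}
```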


Accuracy Testing -- Warehouse


PLT for truth validation

Indoor warehouse location with industrial and natural lighting, ~15 m x 10 m x 8 m


Accuracy Testing


Test results for unit 0x8004


• 48 position stations

• 16 azimuth stations at each position station

• In the table on the following page

– Each row is one position station

– Each column is one azimuth station

• Numbers show the position error relative to the truth system (PLT) in units of mm

• Each number represents a static mode result

Map legend:

– Target locations are shown with black dots and numeric IDs

– Robot trajectory is shown with blue ‘x’ marks

– PLT location is shown with a red dot

– Position Station TP07 location is shown with

– Warehouse dimensions are in units of meters


Why is it difficult to state performance?

• Performance is characterized by error statistics

– To be meaningful, the statistics require many measurements

• We have 6 errors to characterize at each point in space

– 3 components of position error

– 3 components of orientation error

• Errors are worse in some directions than in other directions

• The number of targets and the working-volume geometry vary

• There are different modes of operation (here, we document static and survey modes)

• We plot the worst direction 1 sigma errors in a square working volume
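A minimal sketch of what "worst direction 1 sigma" means operationally: form the covariance of the position-error scatter at a station and report the standard deviation along its largest principal axis. The synthetic scatter below is purely illustrative.

```python
# Minimal sketch of "worst direction 1 sigma": the standard deviation along the
# longest principal axis of the error scatter's covariance (the error ellipsoid).
import numpy as np

def worst_direction_sigma(errors):
    """errors: (N, 3) array of position errors (mm) -> worst-axis 1-sigma (mm)."""
    cov = np.cov(np.asarray(errors), rowvar=False)     # 3x3 sample covariance
    eigvals = np.linalg.eigvalsh(cov)                  # eigenvalues, ascending
    return float(np.sqrt(eigvals[-1]))                 # sigma along the worst axis

# Example with synthetic, elongated scatter (illustrative numbers only)
rng = np.random.default_rng(1)
scatter = rng.normal(0.0, [1.0, 2.0, 5.0], size=(1000, 3))   # mm
print(worst_direction_sigma(scatter))                        # roughly 5 mm
```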


Example of scatter plot ellipsoids

• A scatter plot of results creates a clump of data points distributed around the truth value

• The statistics of the clump can be characterized by an error ellipsoid

• Shown below are example 1 sigma ellipsoids for position and orientation


[Figures: example position error ellipsoid and orientation error ellipsoid.]


Simulation Target Configurations

• Nominal ‘square’ room of size 10x10 meters

• Consider 6 different target arrangements in room

• Targets installed at uniform height and positioned on walls around circumference of room

• Calculate position and orientation accuracy at grid points in the room

• Calculate solutions with:

– 1 azimuth station (static mode)

– 4 azimuth stations (survey mode)

• All simulations use 0.5 pixel 1 sigma, but we expect/hope to achieve 0.2 pixel 1 sigma in practice!!
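To show how such accuracy maps can be produced, the sketch below propagates a fixed 0.5-pixel angular noise (at ~14 pixels/degree) through the bearing-measurement Jacobian at a few points in a 10x10 m room with 8 wall targets. It is a deliberate simplification of the real simulation: only 2D position is solved, orientation is assumed known, and the target layout and evaluation points are illustrative.

```python
# Minimal 2D sketch of predicting worst-direction position accuracy on a grid:
# propagate a fixed angular noise through the bearing-measurement Jacobian.
# Target layout, noise model, and evaluation points are illustrative only.
import numpy as np

targets = np.array([[0, 0], [5, 0], [10, 0], [10, 5], [10, 10],
                    [5, 10], [0, 10], [0, 5]], dtype=float)   # 8 wall targets, 10x10 m
sigma_angle = np.deg2rad(0.5 / 14.0)                          # 0.5 px at ~14 px/deg

def worst_sigma_at(p):
    """Worst-direction 1-sigma position error (metres) at 2D point p."""
    d = targets - p
    r2 = (d ** 2).sum(axis=1)
    # Jacobian of bearing atan2(dy, dx) w.r.t. the observer position: [dy, -dx] / r^2
    J = np.column_stack([d[:, 1], -d[:, 0]]) / r2[:, None]
    cov = sigma_angle ** 2 * np.linalg.inv(J.T @ J)           # least-squares covariance
    return float(np.sqrt(np.linalg.eigvalsh(cov)[-1]))        # largest-axis sigma

for p in [(5.0, 5.0), (2.0, 2.0), (9.0, 1.0)]:
    print(p, round(1000.0 * worst_sigma_at(np.array(p)), 2), "mm")
```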


10x10 m room, 8 targets, 1 azim. station

• 0.5 pix 1 sigma

• 1 azimuth station

• 10x10 meter room

• Worst direction position

• Worst direction orientation

• 1 sigma results


10x10 m room, 8 targets, 4 azim. stations

• 0.5 pix 1 sigma

• 4 azimuth stations

• 10x10 meter room

• Worst direction position

• Worst direction orientation

• 1 sigma results


The future

• In the words of Yogi Berra: “I never make predictions, especially about the future.”

• Image sensors and image processing continue to develop quickly

– reduced cost

– sophisticated image processing of real images

– lenses and stability will remain challenges to accuracy

• Mobile cameras and lightweight infrastructure (scalability)

• Step 1: Reduce the number of required active targets, use natural features

• Step 2: Sensor fusion with inertial sensors, ranging camera, stereo camera, …
