
EUCLIDEAN POSITION ESTIMATION OF FEATURES ON A STATIC OBJECT USING A MOVING CALIBRATED CAMERA

Nitendra Nath, David Braganza‡, and Darren Dawson
Department of Electrical and Computer Engineering, Clemson University, Clemson, SC 29634-0915, E-mail: [email protected]

Abstract

What is Euclidean Position Estimation?

•Three-dimensional (3D) reconstruction of an object, in which the Euclidean coordinates of feature points on a moving or fixed object are recovered from a sequence of two-dimensional (2D) images, is known as Euclidean position estimation; more broadly, it is known as Structure from Motion (SFM) or Simultaneous Localization and Mapping (SLAM).

•These techniques have a significant impact on several applications, such as autonomous vehicle navigation, path planning, and surveillance.

•In this paper, a 3D Euclidean position estimator is developed to asymptotically recover the structure of a static object, using a single moving calibrated camera whose position is assumed to be measurable.

•To estimate the structure, an adaptive least-squares estimation strategy is employed, based on a novel prediction error formulation and a Lyapunov stability analysis.

Geometric Model

A geometric relationship is developed between a moving camera and a stationary object. $n$ feature points located on a static object, denoted by $\mathcal{F}_i$ $\forall i = 1,\dots,n$, are considered.

$\bar{m}_i \triangleq [x_i \;\; y_i \;\; z_i]^T$ : 3D coordinates of the $i$-th feature point w.r.t. the camera frame $\mathcal{C}$

$m_i \triangleq \dfrac{1}{z_i}\bar{m}_i = [x_i/z_i \;\; y_i/z_i \;\; 1]^T$ : normalized Euclidean coordinates

$p_i \triangleq [u_i \;\; v_i]^T$, $u_i(t) \in \mathbb{R}$, $v_i(t) \in \mathbb{R}$ : corresponding projected pixel coordinates

Pin-hole camera model:
$$p_i = A m_i = \frac{1}{z_i} A \bar{m}_i$$

$$A \triangleq \begin{bmatrix} f k_u & f k_u \cot\phi & u_0 \\ 0 & \dfrac{f k_v}{\sin\phi} & v_0 \end{bmatrix}, \qquad A \in \mathbb{R}^{2\times 3}$$
: known constant intrinsic calibration matrix of the camera

The objective of this work is to accurately identify the unknown constant Euclidean coordinates of the feature points $x_{fi}$ relative to the world frame in order to recover the 3D structure of the object.
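As a quick numerical illustration of the pin-hole model above, the following sketch projects a single camera-frame point into pixel coordinates; the calibration values are hypothetical placeholders, not the parameters of the experimental camera.

```python
import numpy as np

# Hypothetical intrinsic parameters (focal length f, pixel scale factors k_u and k_v,
# skew angle phi, principal point (u_0, v_0)); NOT the actual camera calibration.
f, k_u, k_v, phi, u_0, v_0 = 0.008, 80000.0, 80000.0, np.pi / 2, 320.0, 240.0

# Intrinsic calibration matrix A in R^{2x3}, as defined above.
A = np.array([[f * k_u, f * k_u * np.cos(phi) / np.sin(phi), u_0],
              [0.0,     f * k_v / np.sin(phi),               v_0]])

# 3D coordinates of a feature point in the camera frame, m_bar = [x, y, z]^T (meters).
m_bar = np.array([0.10, -0.05, 2.0])

# Pin-hole projection: p = A m = (1/z) A m_bar, giving pixel coordinates [u, v]^T.
p = (A @ m_bar) / m_bar[2]
print(p)  # approximately [352. 224.]
```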

Figure: Geometric relationships between the fixed object, the mechanical system, and the camera.

$R_b(t) \in SO(3)$, $x_b(t) \in \mathbb{R}^3$ : measurable rotation matrix and translation vector from $\mathcal{B}$ to $\mathcal{W}$

$R_c \in SO(3)$, $x_c \in \mathbb{R}^3$ : known constant rotation matrix and translation vector from $\mathcal{C}$ to $\mathcal{B}$

$x_{fi} \in \mathbb{R}^3$, $\bar{m}_i(t) \in \mathbb{R}^3$ : unknown

Euclidean Structure Estimation

From the geometric model, the following expression can be obtained:
$$\bar{m}_i = R_c^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$$

After utilizing the pin-hole camera model, the pixel coordinates of the $i$-th feature point can be written as
$$p_i = \frac{1}{z_i} A R_c^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$$

with the corresponding depth
$$z_i = R_{c3}^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$$

where $R_{c3}^T$ denotes the last row of $R_c^T$. In parameterized form, $p_i(t)$ can be expressed as
$$p_i = \frac{1}{\Pi \Theta_i} W \Theta_i, \qquad W \Theta_i = A R_c^T \left[ R_b^T (x_{fi} - x_b) - x_c \right], \qquad \Pi \Theta_i = z_i = R_{c3}^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$$

where $\Pi(t) \in \mathbb{R}^{1\times 4}$ and $W(t) \in \mathbb{R}^{2\times 4}$ are measurable regression matrices, and $\Theta_i \in \mathbb{R}^4$ is an unknown constant parameter vector.
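To make the linear parameterization concrete, the sketch below builds the per-feature regressors $W$ and $\Pi$ from the measured pose, assuming the parameter vector has the form $\Theta_i = [x_{fi}^T \;\; 1]^T$. This particular choice is an assumption made here for illustration (it is consistent with the stated dimensions $W \in \mathbb{R}^{2\times 4}$, $\Pi \in \mathbb{R}^{1\times 4}$, $\Theta_i \in \mathbb{R}^4$), not a detail taken from the poster.

```python
import numpy as np

def regression_matrices(A, R_b, x_b, R_c, x_c):
    """Build per-feature regressors W (2x4) and Pi (1x4) such that
    W @ theta_i = A R_c^T [R_b^T (x_fi - x_b) - x_c] and Pi @ theta_i = z_i,
    under the assumed parameterization theta_i = [x_fi, 1]."""
    M = R_c.T @ R_b.T                     # maps world coordinates into the camera frame
    offset = R_c.T @ (R_b.T @ x_b + x_c)  # constant (x_fi-independent) part, in camera frame
    W = np.hstack([A @ M, -(A @ offset).reshape(2, 1)])    # 2x4 regressor
    Pi = np.hstack([M[2, :], [-offset[2]]]).reshape(1, 4)  # 1x4 regressor (depth row)
    return W, Pi

# Example usage: the pixel coordinates then follow from p_i = (W theta_i) / (Pi theta_i), e.g.
# theta_i = np.append(x_fi, 1.0)
# W, Pi = regression_matrices(A, R_b, x_b, R_c, x_c)
# p_i = (W @ theta_i) / float(Pi @ theta_i)
```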

Prediction error for the $i$-th feature point:
$$\tilde{p}_i = \frac{1}{\Pi \Theta_i} \left( W - \hat{p}_i \Pi \right) \tilde{\Theta}_i$$

where $\hat{p}_i(t)$ denotes the pixel coordinates predicted from the current parameter estimate $\hat{\Theta}_i(t)$.

Combined prediction error:
$$\tilde{p} = B \bar{W}_p \tilde{\Theta}$$

where $\bar{W}_p(t) \in \mathbb{R}^{2n\times 4n}$ is a measurable signal, $B(t) \in \mathbb{R}^{2n\times 2n}$ is an auxiliary matrix, and $\tilde{\Theta}(t) \in \mathbb{R}^{4n}$ is the combined estimation error.

The adaptive update law is designed as
$$\dot{\hat{\Theta}} \triangleq \mathrm{Proj}\left\{ \alpha \Gamma \bar{W}_p^T \tilde{p} \right\}$$

where $\mathrm{Proj}\{\cdot\}$ ensures positiveness of the term $\Pi(t)\hat{\Theta}_i(t)$, $\alpha(t) \in \mathbb{R}$ is a positive scalar function, and $\Gamma(t) \in \mathbb{R}^{4n\times 4n}$ is the least-squares estimation gain matrix.
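The poster does not spell out the discrete-time implementation, the auxiliary matrix $B(t)$, the gain update for $\Gamma(t)$, or the exact projection operator, so the following is only a simplified per-feature sketch of the update law: it interprets the prediction error as the difference between measured and predicted pixel coordinates, and uses a constant gain, an Euler integration step, and a crude positivity safeguard in place of $\mathrm{Proj}\{\cdot\}$.

```python
import numpy as np

def estimate_feature(frames, Gamma, alpha=1.0, dt=1.0 / 15.0, eps=1e-3):
    """Simplified per-feature sketch of theta_hat_dot = Proj{alpha * Gamma * W^T * p_tilde}.
    'frames' yields (p_meas, W, Pi) at each time step; Gamma is a constant 4x4 gain here,
    whereas the poster uses a time-varying least-squares gain and an auxiliary matrix B(t)."""
    theta_hat = np.array([0.0, 0.0, 0.0, 1.0])            # rough initial guess for [x_fi, 1]
    for p_meas, W, Pi in frames:
        p_hat = (W @ theta_hat) / float(Pi @ theta_hat)   # predicted pixel coordinates
        p_tilde = p_meas - p_hat                          # prediction error
        theta_dot = alpha * (Gamma @ W.T @ p_tilde)
        candidate = theta_hat + dt * theta_dot            # Euler integration step
        # Crude stand-in for Proj{.}: only accept steps that keep the predicted depth positive.
        if float(Pi @ candidate) > eps:
            theta_hat = candidate
    return theta_hat
```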

Simulation Results

Plots: Distance estimation error [cm] versus time [sec] over 0-50 sec for the three simulation cases.

Case 1: No noise added to pixel coordinates
Case 2: Gaussian noise of variance 200 added to pixel coordinates
Case 3: Gaussian noise of variance 400 added to pixel coordinates

Distance Estimation Error

| Case   | Length | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Error (%) |
|--------|--------|----------------------|-------------------------|------------|-----------|
| Case 1 | I      | 50.0                 | 49.94                   | 0.06       | 0.12      |
| Case 1 | II     | 111.8                | 111.25                  | 0.55       | 0.49      |
| Case 1 | III    | 100.0                | 99.86                   | 0.14       | 0.14      |
| Case 2 | I      | 50.0                 | 49.90                   | 0.10       | 0.20      |
| Case 2 | II     | 111.8                | 111.15                  | 0.65       | 0.58      |
| Case 2 | III    | 100.0                | 99.74                   | 0.26       | 0.26      |
| Case 3 | I      | 50.0                 | 49.88                   | 0.12       | 0.24      |
| Case 3 | II     | 111.8                | 111.08                  | 0.72       | 0.64      |
| Case 3 | III    | 100.0                | 99.65                   | 0.35       | 0.35      |
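For reference, Cases 2 and 3 can be reproduced in simulation by corrupting the projected pixel coordinates with zero-mean Gaussian noise of the stated variance. A minimal sketch is given below; note that NumPy's scale argument is the standard deviation, i.e. the square root of the variance.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def add_pixel_noise(p, variance):
    """Corrupt pixel coordinates with zero-mean Gaussian noise of the given variance."""
    return p + rng.normal(loc=0.0, scale=np.sqrt(variance), size=p.shape)

# Case 2: variance = 200; Case 3: variance = 400.
p_noisy = add_pixel_noise(np.array([352.0, 224.0]), variance=200.0)
```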

Experimental Results

Figure: Experimental testbed with camera, robot, and object (PUMA 560 robot, monochrome CCD camera, robot control PC, and vision PC connected by a 15 Hz trigger).

Object I: Checkerboard

Plot: Distance estimation error [cm] versus time [sec] over 0-60 sec.

Distance Estimation Error

| Length     | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Convergence time (sec) |
|------------|----------------------|-------------------------|------------|------------------------|
| Length I   | 11.24                | 11.41                   | 0.17       | 37.3                   |
| Length II  | 2.81                 | 2.76                    | 0.05       | 33.1                   |
| Length III | 11.24                | 11.56                   | 0.32       | 33.3                   |
| Length IV  | 5.62                 | 5.72                    | 0.10       | 32.2                   |
| Length V   | 16.86                | 17.31                   | 0.45       | 37.6                   |
| Length VI  | 5.62                 | 5.44                    | 0.18       | 35.4                   |

Object II: Dollhouse

Plot: Distance estimation error [cm] versus time [sec] over 0-60 sec.

Distance Estimation Error

| Length     | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Convergence time (sec) |
|------------|----------------------|-------------------------|------------|------------------------|
| Length I   | 40.0                 | 41.3                    | 1.3        | 32.2                   |
| Length II  | 12.2                 | 12.7                    | 0.5        | 33.4                   |
| Length III | 12.2                 | 11.6                    | 0.6        | 30.1                   |
| Length IV  | 13.0                 | 13.4                    | 0.4        | 32.2                   |
| Length V   | 15.0                 | 14.3                    | 0.7        | 34.7                   |
| Length VI  | 26.5                 | 27.4                    | 0.9        | 33.5                   |

Object III: Toolboxes

Plot: Distance estimation error [cm] versus time [sec] over 0-60 sec.

Distance Estimation Error

| Length     | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Convergence time (sec) |
|------------|----------------------|-------------------------|------------|------------------------|
| Length I   | 14.7                 | 14.15                   | 0.55       | 37.6                   |
| Length II  | 4.2                  | 4.10                    | 0.10       | 39.9                   |
| Length III | 5.0                  | 4.96                    | 0.04       | 36.7                   |
| Length IV  | 9.0                  | 8.78                    | 0.22       | 39.8                   |
| Length V   | 9.6                  | 9.44                    | 0.16       | 39.8                   |
| Length VI  | 3.8                  | 3.64                    | 0.16       | 38.9                   |

The estimator accurately identifies the Euclidean distances between the features without any prior information about the object's geometry.

‡ D. Braganza is with OFS, 50 Hall Road, Sturbridge, MA 01566.

The KLT feature-tracking algorithm was used to track feature points from one frame to the next.
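The poster does not give the tracker configuration; a generic OpenCV-based KLT (pyramidal Lucas-Kanade) sketch of this step, with assumed window size and termination criteria, might look as follows.

```python
import cv2
import numpy as np

def track_features(prev_gray, gray, prev_pts):
    """Track feature points from one grayscale frame to the next with pyramidal
    Lucas-Kanade (KLT) optical flow; returns only the successfully tracked pairs."""
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, prev_pts, None,
        winSize=(21, 21), maxLevel=3,
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    good = status.ravel() == 1
    return prev_pts[good], next_pts[good]

# Initial detection of feature points (Shi-Tomasi corners) on the first frame:
# prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
#                                    qualityLevel=0.01, minDistance=10)
```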