
Joint Affinity Propagation for Multiple View Segmentation

Jianxiong XIAO, Jingdong WANG, Ping TAN, Long QUAN

Department of Computer Science & Engineering

The Hong Kong University of Science & Technology

ICCV 2007: Eleventh IEEE International Conference on Computer Vision

Rio de Janeiro, Brazil, October 14-20, 2007


Outline

Part 1: Introduction

Part 2: Our Approach

– Formulation

– Optimization:

• Hierarchical Sparse Affinity Propagation

• Semi-supervised Contraction

Part 3: Experiment Results

Part 4: Conclusion



Image-based modeling

Two-step method:

• Get 3D points and camera positions from 2D images (geometry computation)

• Get 3D objects from unstructured 3D points (object reconstruction)

[Figure: input images → recovered 3D points → recovered object models]


Structure from motion


Data segmentation

• Both pure 2D segmentation and pure 3D clustering are hard!

– J. Shi and J. Malik. Normalized Cuts and Image Segmentation

– etc.

• Multiple view joint segmentation

– Simultaneously segment 3D points and 2D images

– Jointly utilize both 2D and 3D information



Our work

• Explore multiple view joint segmentation, simultaneously utilizing 2D and 3D data.

• The availability of both 2D and 3D data can bring complementary information for segmentation.

• Propose two practical algorithms for joint segmentation:

– Hierarchical Sparse Affinity Propagation

– Semi-supervised Contraction



Problem formulation

• I = {I_i}: the set of images

• (u_k, P_k): a region, a 2D point u_k with its patch P_k in image I_k

• x = (x, y, z, u_1, P_1, ..., u_n, P_n): a joint point, a 3D point together with its 2D projections

• L = {l_k}: a set of labels

• V = {v_j}: the set of visibilities

• X = {x_j}: the set of joint points

We now want to infer L, given X, V and I.


Graph-based segmentation

Graph G = { V, E }:

• V: 3D points recovered from SFM

• E: each point is connected to its K nearest neighbors, and both endpoints of an edge must be visible together in at least one view
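The graph construction above can be sketched in a few lines. This is a minimal illustration rather than the authors' implementation: `visible_views` is a hypothetical per-point set of view indices standing in for the visibility sets V, and brute-force distances stand in for a proper spatial index.

```python
import numpy as np

def build_knn_graph(points, visible_views, k=3):
    """Connect each 3D point to its k nearest neighbors, keeping an edge
    only when both endpoints are visible together in at least one view."""
    n = len(points)
    # pairwise squared Euclidean distances (brute force, for clarity)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    edges = set()
    for i in range(n):
        order = np.argsort(d2[i])
        neighbors = [j for j in order if j != i][:k]
        for j in neighbors:
            if visible_views[i] & visible_views[j]:  # a shared view exists
                edges.add((min(i, j), max(i, j)))
    return edges
```

Edges are stored as unordered pairs, so an edge proposed from either endpoint is counted once.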


Joint similarity

The joint similarity combines several cues:

• 3D coordinates

• 3D normal

• Color

• Contour

• Patch

s(i, j) = s_3(i, j) · s_c(i, j) · s_t(i, j)

where s_3 is the 3D similarity, s_c the 2D color/contour similarity, and s_t the patch (texture) similarity.


3D similarity

s_3(i, j) = s_d(i, j) · s_n(i, j)

s_d(i, j) = exp( -||p_i - p_j||² / (2σ_d²) )

s_n(i, j) = exp( -||n_i - n_j||² / (2σ_n²) )

where p_i, p_j are the 3D positions and n_i, n_j the normals of the two points.
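The 3D term, reconstructed as two Gaussian kernels (one on point distance, one on normal difference) multiplied together, can be sketched as follows; the bandwidths `sigma_d` and `sigma_n` are illustrative placeholders, not values from the paper.

```python
import numpy as np

def s3(p_i, p_j, n_i, n_j, sigma_d=1.0, sigma_n=0.5):
    """3D similarity as a product of a position kernel and a normal
    kernel (sigma values are illustrative placeholders)."""
    s_d = np.exp(-np.sum((p_i - p_j) ** 2) / (2 * sigma_d ** 2))
    s_n = np.exp(-np.sum((n_i - n_j) ** 2) / (2 * sigma_n ** 2))
    return s_d * s_n
```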


2D color similarity

s_c(i, j) = exp( -||E_i - E_j||² / (2σ_c²) )

where E_i is the mean color of point i over its 2D projections.

Contour cue: for the projections p and q of the two points in the i-th image, the contour term takes the maximum gradient response g_i of the image along the 2D segment between p and q (of length d_2d(p, q)); the per-view terms are combined by a median over the views, so that edges crossing strong image contours are penalized.
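A minimal sketch of the color term, assuming a Gaussian kernel on the difference of the two points' mean colors E_i and E_j; `sigma_c` is an illustrative bandwidth.

```python
import numpy as np

def s_color(E_i, E_j, sigma_c=10.0):
    """Color similarity: Gaussian kernel on mean-color difference
    (sigma_c is an illustrative placeholder)."""
    diff = np.asarray(E_i, float) - np.asarray(E_j, float)
    return np.exp(-np.sum(diff ** 2) / (2 * sigma_c ** 2))
```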


Utilizing the texture information

• Hyper Graph?

• Higher Order Prior Smoothness?

• …


Competitive region growing

• Associate patches with each 3D point.


Patch filtering

• A small error around the object boundary may result in a large color difference.


Patch histogram similarity

For each joint point:

• Collect all its patches P_n

• Build an average color histogram h_0

• Down-sample the patches t-1 times

• This yields a vector of histograms (h_0, h_1, ..., h_{t-1})

s_t(i, j) = 1 - (1/t) Σ_{k=0}^{t-1} d(h_k^i, h_k^j)

where d(·, ·) is a dissimilarity measure for histograms.
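The patch similarity can be sketched as follows; the slide leaves the histogram dissimilarity d(·, ·) open, so half the L1 distance between normalized histograms is used here purely as an example.

```python
import numpy as np

def patch_similarity(hists_i, hists_j):
    """s_t(i, j) = 1 - (1/t) * sum_k d(h_k^i, h_k^j), with d chosen here
    as half the L1 distance between normalized histograms (in [0, 1])."""
    t = len(hists_i)
    d = sum(0.5 * np.abs(hi - hj).sum() for hi, hj in zip(hists_i, hists_j)) / t
    return 1.0 - d
```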


Learning

• The notion of a correct segmentation is inherently subjective.

• Hence, a small amount of user-provided assistance can greatly improve the segmentation.


Handle the ambiguity

• To improve robustness and handle the ambiguity of the projections near the boundary



Affinity propagation [Frey & Dueck 2007]

• Find several exemplars such that the sum of the similarities between the data points and their corresponding exemplars is maximized.

• I.e., search over valid configurations of the labels c = (c_1, ..., c_N), with c_i the exemplar of point i, so as to minimize the energy

E(c) = -Σ_{i=1}^{N} s(i, c_i)

• I.e., maximize the net similarity

S(c) = -E(c) - Σ_{k=1}^{N} δ_k(c)

where δ_k(c) is an infinite penalty whenever some point chooses k as its exemplar but k does not choose itself.


Responsibility

• The responsibility r(i, k), sent from data point i to candidate exemplar point k, reflects the accumulated evidence for how well-suited point k is to serve as the exemplar for point i, taking into account other potential exemplars for point i.


Availability

• The availability a(i, k), sent from candidate exemplar point k to point i, reflects the accumulated evidence for how appropriate it would be for point i to choose point k as its exemplar, taking into account the support from other points that point k should be an exemplar.


Responsibility & Availability

r(i, k) ← s(i, k) - max_{k' ≠ k} { a(i, k') + s(i, k') }

a(i, k) ← min{ 0, r(k, k) + Σ_{i' ∉ {i, k}} max(0, r(i', k)) }, for i ≠ k

a(k, k) ← Σ_{i' ≠ k} max(0, r(i', k))
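The two message updates can be turned into a compact dense implementation. This is a generic sketch of standard affinity propagation with damping (which Frey & Dueck use to avoid oscillations), not the paper's sparse or hierarchical variant.

```python
import numpy as np

def affinity_propagation(S, iters=200, damping=0.5):
    """Dense affinity propagation (Frey & Dueck 2007). S is an n-by-n
    similarity matrix whose diagonal holds the preferences s(k, k).
    Returns, for each point, the index of its chosen exemplar."""
    n = S.shape[0]
    R = np.zeros((n, n))  # responsibilities r(i, k)
    A = np.zeros((n, n))  # availabilities  a(i, k)
    for _ in range(iters):
        # r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        idx = AS.argmax(axis=1)
        first = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = damping * R + (1 - damping) * Rnew
        # a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())    # keep r(k, k) itself
        Anew = Rp.sum(axis=0)[None, :] - Rp   # column sums minus own term
        diag = Anew.diagonal().copy()         # a(k, k) is not clamped
        Anew = np.minimum(Anew, 0)
        np.fill_diagonal(Anew, diag)
        A = damping * A + (1 - damping) * Anew
    return (A + R).argmax(axis=1)
```

On a sparse graph, the same updates are computed only along edges, which is what gives the O(T|E|) running time of sparse affinity propagation.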



Sparse affinity propagation

• Affinity propagation on a sparse graph, called sparse affinity propagation, is more efficient, as pointed out in [Frey & Dueck 2007].

• Sparse affinity propagation runs in O(T|E|) time, with T the number of iterations and |E| the number of edges.

• Here, the time complexity is O(Tn), since |E| = O(n) for a K-nearest-neighbor graph.


Original sparse AP

• The number of data points that have the same exemplar i is at most degree(i), where degree(i) is the number of nodes connected to i.

• This results in unexpectedly many fragments.


Hierarchical sparse AP

G' = G(V, E);
while (true)
{
    [Exemplars, Label] = SparseAffinityPropagation(G');
    G' = (V' = Exemplars, E');
    if (SatisfyStoppingCondition) break;
}

where the contracted edge set is

E' = { (c_i, c_j) | c_i = Exemplar(p), c_j = Exemplar(q), (p, q) ∈ E, c_i ≠ c_j }
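One contraction step, building G' from the exemplars and the E' rule, can be sketched as plain set manipulation; `exemplar` here is a hypothetical mapping from each point to its exemplar, as a sparse AP round would return.

```python
def contract_graph(edges, exemplar):
    """Contract a graph: vertices become the exemplars, and two exemplars
    are connected when some original edge crosses between their clusters."""
    new_edges = set()
    for p, q in edges:
        cp, cq = exemplar[p], exemplar[q]
        if cp != cq:
            new_edges.add((min(cp, cq), max(cp, cq)))
    return set(exemplar.values()), new_edges
```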


Hierarchical sparse AP

[Figure: segmentation at hierarchy levels L = 1, 2, 5, 8, 11, 14, 17]



Semi-supervised contraction

For user-specified seed points p and q, set the preferences s(p, p) = s(q, q) = 0 (the maximum similarity), so that they are strongly encouraged to act as exemplars.


Semi-supervised contraction

• Finally, when the algorithm has converged, availabilities and responsibilities are combined to identify exemplars.

• For point i, its label is obtained as

c_i* = argmax_{k ∈ {p, q}} { a(i, k) + r(i, k) }



Results



Conclusion


Thank you!

Questions?

Contact: Jianxiong XIAO csxjx@cse.ust.hk


2D color similarity

• Contour-based similarity


Time complexity

• Compared with the spectral clustering approach in [Quan 2007], hierarchical sparse affinity propagation is both more efficient, running in O(TLn) time with T the number of iterations and L the number of hierarchy levels, and more effective.


Segmentation process pipeline

[Figure: automatic segmentation followed by assisted segmentation]