
Slide 1: On the combination of ICA and CPA

Maarten De Vos, Dimitri Nion, Sabine Van Huffel, Lieven De Lathauwer

SISTA – SCD – BIOMED, K.U.Leuven

Slide 2: Roadmap

• What is ICA? What is CPA?
• Why combine ICA and CPA?
• Our algorithm
• Results
• Conclusion

Slide 3: Roadmap

• What is ICA? What is CPA?
• Why combine ICA and CPA?
• Our algorithm
• Results
• Conclusion

Slide 4: Independent Component Analysis (ICA)

[Figure: three measured EEG channels, EEG1, EEG2 and EEG3, decomposed into contributing sources.]

ICA decomposes a measurement (EEG) into contributing sources: it estimates statistically independent sources s1, s2 and s3, and the mixing coefficients m_ij (a numerical sketch follows below):

EEG1 = m11 s1 + m12 s2 + m13 s3
EEG2 = m21 s1 + m22 s2 + m23 s3
EEG3 = m31 s1 + m32 s2 + m33 s3
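As a rough illustration of this mixing model (our addition, not from the original slides), the sketch below generates three hypothetical independent sources, mixes them with a random 3x3 matrix playing the role of the m_ij, and recovers them with scikit-learn's FastICA. The signal shapes and sizes are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Three statistically independent sources (hypothetical stand-ins for s1, s2, s3)
s1 = np.sin(2 * t)                     # sinusoid
s2 = np.sign(np.sin(3 * t))            # square wave
s3 = rng.laplace(size=t.size)          # super-Gaussian noise
S = np.c_[s1, s2, s3]                  # shape (samples, 3)

M = rng.normal(size=(3, 3))            # unknown mixing matrix (entries m_ij)
X = S @ M.T                            # row t of X: EEG_i(t) = sum_j m_ij * s_j(t)

ica = FastICA(n_components=3, random_state=0)
S_hat = ica.fit_transform(X)           # estimated sources, up to order and scale
M_hat = ica.mixing_                    # estimated mixing matrix
```

ICA can only recover the sources up to permutation and scaling, so S_hat has to be matched to S column by column.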

Slide 5: Decomposition of a measured signal

$Y = m_1 s_1 + \dots + m_R s_R + E$   (a sum of R rank-1 terms plus a residual E)

PCA estimates orthogonal sources (a basis). But matrix decompositions such as PCA are often not unique: $Y = MS = (MQ)(Q^*S)$ for any unitary Q, so the factors are only determined up to a rotation.

ICA instead estimates statistically independent sources in $Y = MS$; imposing statistical independence on the sources resolves this ambiguity (a numerical check follows below).
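A minimal numerical check of this rotational ambiguity (our illustration, assuming nothing beyond NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(3, 3))                   # mixing matrix
S = rng.normal(size=(3, 500))                 # sources as rows
Y = M @ S

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal Q (Q @ Q.T = I)
Y_rot = (M @ Q) @ (Q.T @ S)                   # a different factorization of the same Y

print(np.allclose(Y, Y_rot))                  # True: (MQ, Q*S) fits exactly as well as (M, S)
```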

Slide 6: Computation of ICA

• There are different implementations of 'independence'.
• JADE: Joint Approximate Diagonalization of Eigenmatrices
  – All higher-order cross-cumulants are zero.
  – The fourth-order cumulant tensor is diagonal.
  – The mixing matrix is the matrix that approximately diagonalizes the eigenmatrices of the cumulant.

Slide 7: Computation of ICA (continued)

[Figure: approximate joint diagonalization of a set of matrices.]

• SOBI: Second-Order Blind Identification
  – Assumes that the sources are autocorrelated.
  – The mixing matrix also diagonalizes a set of matrices.
  – Here the matrices are correlation matrices at different time lags (see the sketch below).
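The set of matrices that SOBI diagonalizes can be formed as follows; a minimal sketch assuming real-valued, zero-mean data (the function name and shapes are ours, not the deck's):

```python
import numpy as np

def lagged_correlations(X, lags):
    """Correlation matrices R_tau = E[x(t) x(t + tau)^T] at the given time lags.

    X has shape (channels, samples). SOBI estimates the mixing matrix by
    jointly (approximately) diagonalizing these R_tau.
    """
    X = X - X.mean(axis=1, keepdims=True)
    n = X.shape[1]
    Rs = []
    for tau in lags:
        R = X[:, :n - tau] @ X[:, tau:].T / (n - tau)
        Rs.append(0.5 * (R + R.T))    # symmetrize for numerical stability
    return Rs
```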

Slide 8: Decomposition of a measured signal

Matrix case (PCA): $Y = m_1 s_1 + \dots + m_R s_R + E$

If a signal is multi-dimensional (a higher-order tensor), multilinear algebra tools can be used that better exploit the multi-dimensional nature of the data.

Tucker / HOSVD estimates a subspace:

$\mathcal{Y} = \mathcal{S} \times_1 A \times_2 B \times_3 C$

As with PCA, this decomposition is only determined up to rotations: factors $QQ^*$, $PP^*$ and $OO^*$ can be inserted in the three modes without changing the fit.

Slide 9: Decomposition of a measured signal

Matrix case (PCA): $Y = m_1 s_1 + \dots + m_R s_R + E$

CPA (Canonical / Parallel Factor Analysis):

$\mathcal{Y} = a_1 \circ s_1 \circ b_1 + \dots + a_R \circ s_R \circ b_R + \mathcal{E}$

If a signal is multi-dimensional (a higher-order tensor), multilinear algebra tools can be used that better exploit the multi-dimensional nature of the data (a small construction sketch follows below).
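To make the trilinear structure concrete, here is a small sketch (our own illustration; the dimensions are arbitrary) that builds a rank-R tensor from factor matrices A, S and B:

```python
import numpy as np

rng = np.random.default_rng(2)
I, J, K, R = 5, 6, 7, 3
A = rng.normal(size=(I, R))   # mode-1 factors a_r
S = rng.normal(size=(J, R))   # mode-2 factors s_r
B = rng.normal(size=(K, R))   # mode-3 factors b_r

# CPA model: y_ijk = sum_r a_ir * s_jr * b_kr, a sum of R rank-1 terms
Y = np.einsum('ir,jr,kr->ijk', A, S, B)
```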

Slide 10: Something about CPA

$\mathcal{Y} = a_1 \circ s_1 \circ b_1 + \dots + a_R \circ s_R \circ b_R + \mathcal{E}$

• CPA components are not orthogonal.
• The best rank-R approximation may not exist.
• The R components are not ordered.
• But the decomposition is unique, and no rotation is possible without changing the model part …

Slide 11: Computation of CPA

$\mathcal{Y} = a_1 \circ s_1 \circ b_1 + \dots + a_R \circ s_R \circ b_R + \mathcal{E}$

• CPA is often computed by Alternating Least Squares (ALS): minimization of the (Frobenius) norm of the residuals,

$\min_{A,S,B} \; \| X_{I \times JK} - A (S \odot B)^T \|$

1) Initialize A, S, B.
2) Update A, given S and B: $\min_A \| X_{I \times JK} - A (S \odot B)^T \|$
3) Update S, given A and B: $\min_S \| X_{J \times IK} - S (B \odot A)^T \|$
4) Update B, given A and S: $\min_B \| X_{K \times JI} - B (A \odot S)^T \|$
5) Iterate (2-3-4) until convergence (a sketch of this loop follows below).
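A compact NumPy sketch of this ALS loop (our illustration, with assumed factor shapes A: IxR, S: JxR, B: KxR; production code would add convergence checks and column normalization):

```python
import numpy as np

def khatri_rao(U, V):
    """Columnwise Kronecker product: row (u, v) of the result is U[u] * V[v]."""
    R = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, R)

def cp_als(Y, R, n_iter=200, seed=0):
    """ALS for the third-order CPA model y_ijk = sum_r a_ir * s_jr * b_kr.

    Y: (I, J, K) array. Returns A (I, R), S (J, R), B (K, R).
    """
    rng = np.random.default_rng(seed)
    I, J, K = Y.shape
    A = rng.normal(size=(I, R))
    S = rng.normal(size=(J, R))
    B = rng.normal(size=(K, R))
    for _ in range(n_iter):
        # Each update is a linear least-squares problem on one unfolding of Y.
        A = np.linalg.lstsq(khatri_rao(S, B), Y.reshape(I, -1).T, rcond=None)[0].T
        S = np.linalg.lstsq(khatri_rao(A, B), np.moveaxis(Y, 1, 0).reshape(J, -1).T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, S), np.moveaxis(Y, 2, 0).reshape(K, -1).T, rcond=None)[0].T
    return A, S, B
```

Applied to the tensor Y built in the earlier sketch, cp_als recovers the factor matrices up to permutation and scaling of the columns.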

Slide 12: Computation of CPA (2)

• ALS sometimes exhibits long swamps, meaning that the cost function converges very slowly.

Slide 13: Improvement of ALS: line search

• In order to reduce swamps, interpolate A, B and S from the estimates of the 2 previous iterations and use the interpolated matrices at the current iteration.

1. Line search (search directions):

$A^{(\text{new})} = A^{(k-2)} + \rho \, (A^{(k-1)} - A^{(k-2)})$
$S^{(\text{new})} = S^{(k-2)} + \rho \, (S^{(k-1)} - S^{(k-2)})$
$B^{(\text{new})} = B^{(k-2)} + \rho \, (B^{(k-1)} - B^{(k-2)})$

2. Then ALS update:

$A^{(k)} = X_{I \times JK} \, \big[ (S^{(\text{new})} \odot B^{(\text{new})})^T \big]^{\dagger}$
$S^{(k)} = X_{J \times IK} \, \big[ (B^{(\text{new})} \odot A^{(k)})^T \big]^{\dagger}$
$B^{(k)} = X_{K \times IJ} \, \big[ (A^{(k)} \odot S^{(k)})^T \big]^{\dagger}$

The choice of $\rho$ is crucial: $\rho = 1$ annihilates the LS step (i.e., we get standard ALS). A sketch follows below.
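A sketch of the interpolation step (our illustration; the iterate-history handling is an assumption):

```python
import numpy as np

def line_search_extrapolate(F_prev2, F_prev1, rho):
    """Interpolate a factor matrix from its two previous ALS iterates.

    rho = 1 returns F_prev1, i.e. the line-search step degenerates to
    standard ALS; larger rho extrapolates along the recent search direction.
    """
    return F_prev2 + rho * (F_prev1 - F_prev2)

# One accelerated iteration (outline):
#   A_new = line_search_extrapolate(A_hist[-2], A_hist[-1], rho)
#   S_new = line_search_extrapolate(S_hist[-2], S_hist[-1], rho)
#   B_new = line_search_extrapolate(B_hist[-2], B_hist[-1], rho)
#   ...then perform one ALS sweep (as in cp_als above) from (A_new, S_new, B_new).
```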

Slide 14: Improvement of ALS: line search

• [Harshman, 1970] « LSH »: choose ρ = 1.25.
• [Bro, 1997] « LSB »: choose ρ = k^(1/3) and validate the LS step if it decreases the fit.
• [Rajih, Comon, 2005] « Enhanced Line Search (ELS) »: for REAL tensors, the cost function $\phi(\rho; A^{(\text{new})}, S^{(\text{new})}, B^{(\text{new})})$ is a 6th-order polynomial in ρ; the optimal ρ is the root that minimizes this polynomial.
• [Nion, De Lathauwer, 2006] « Enhanced Line Search with Complex Step (ELSCS) »: for COMPLEX tensors, look for an optimal $\rho = m e^{i\theta}$, alternating updates of m and θ:
  – update of m for fixed θ: 6th-order polynomial in m;
  – update of θ for fixed m: 5th-order polynomial in $t = \tan(\theta/2)$.

Slide 15

Slide 16: Overview

• What is ICA? What is CPA?
• Why combine ICA and CPA?
• Our algorithm
• Simulation results
• Conclusion

Slide 17

These activations have different ratios in different subjects, which gives rise to a trilinear CPA structure. [Beckmann et al., 2005]

Slide 18: Tensor pICA

• Beckmann et al. (2005): a combination of ICA and CPA, "tensor pICA".
• Tensor pICA outperforms CPA due to the low signal-to-noise ratio.
• Algorithm from the paper:
  – One iteration step to optimize the ICA cost function.
  – One iteration step to optimize the trilinear structure.
  – Optimize 'until convergence'.
• Algorithm implemented in the paper:
  – Compute ICA on the matricized tensor.
  – Afterwards, decompose the mixing vector to obtain the trilinear decomposition.

Slide 19

• Does it make sense to add constraints?
  – (-) uniqueness
  – (+) robustness
  – (+) more identifiable if the constraints make sense
  – (+) see the results

Slide 20: Overview

• What is ICA? What is CPA?
• Why combine ICA and CPA?
• Our algorithm
• Results
• Conclusion

Slide 21

• We developed a new algorithm that simultaneously imposes the independence and the trilinear constraints.

$\mathcal{Y} = a_1 \circ s_1 \circ b_1 + \dots + a_R \circ s_R \circ b_R$

Slide 22: ICA - CPA

• Compute the fourth-order cumulant tensor (a sketch of this computation follows below).
• Compute the 'eigenmatrices' of this tensor -> a 3rd-order tensor.
• Add a slice with the covariance matrix to this tensor.
• This tensor has a 3rd-order CPA structure.

$\text{Data} = (A \odot B)\, S$
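For concreteness, a sketch of the fourth-order cumulant tensor for real, zero-mean data (our illustration using the standard cumulant formula; not the deck's own code):

```python
import numpy as np

def fourth_order_cumulant(X):
    """Fourth-order cumulant tensor of real zero-mean data X (channels, samples):

    cum_ijkl = E[x_i x_j x_k x_l] - E[x_i x_j] E[x_k x_l]
               - E[x_i x_k] E[x_j x_l] - E[x_i x_l] E[x_j x_k]
    """
    X = X - X.mean(axis=1, keepdims=True)
    n = X.shape[1]
    M4 = np.einsum('it,jt,kt,lt->ijkl', X, X, X, X) / n   # fourth-order moments
    C = X @ X.T / n                                       # covariance
    return (M4 - np.einsum('ij,kl->ijkl', C, C)
               - np.einsum('ik,jl->ijkl', C, C)
               - np.einsum('il,jk->ijkl', C, C))
```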

Slide 23: ICA - CPA

• Compute the fourth-order cumulant tensor.
• Compute the 'eigenmatrices' of this tensor -> a 3rd-order tensor.
• This tensor has a 3rd-order CPA structure.
• When the mixing matrix has a bilinear structure (the mixing vector has a Khatri-Rao structure), $\text{Data} = (A \odot B)\, S$, this tensor can be rewritten as a 5th-order tensor with CPA structure:

$t_{ijklm} = \sum_{r=1}^{R} a_{ir}\, b_{jr}\, a_{kr}\, b_{lr}\, d_{mr}$

Slide 24

• How to compute the 5th-order CPA?
• ALS breaks the symmetry; simulations showed bad performance.
• Instead, take the partial symmetry into account, which is naturally preserved in a line-search scheme:
  – Search directions: between the current estimate and the ALS update.
  – Step size: rooting a real polynomial of degree 10.

Slide 25: Overview

• What is ICA? What is CPA?
• Why combine ICA and CPA?
• Our algorithm
• Results
• Conclusion

Slide 26

• Application in fMRI?
• [Stegeman, 2007]: CPA on fMRI is comparable to tensor pICA if the correct number of components is chosen.
• [Daubechies, 2009]: ICA (Infomax) on fMRI works because of sparsity rather than because of independence.

Slide 27: Application in telecommunications

• We consider narrow-band sources received by a uniform circular array (UCA) of I identical sensors of radius P. We assume free-space propagation.
• The entries of A represent the gain between a transmitter and an antenna.
• We generated BPSK user signals: all source distributions are binary (+1 or -1), with equal probability for both values.
• B contains the chips of the spreading codes for the different users (a simulation sketch follows below).
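A sketch of such a synthetic telecom data set (our illustration; the array size, code length and symbol count are invented, and the antenna gains are drawn at random rather than from an actual UCA geometry):

```python
import numpy as np

rng = np.random.default_rng(3)
I, J, K, R = 5, 8, 1000, 3     # antennas, chips per code, symbols, users

A = rng.normal(size=(I, R)) + 1j * rng.normal(size=(I, R))  # antenna gains
B = rng.choice([-1.0, 1.0], size=(J, R))   # spreading-code chips per user
S = rng.choice([-1.0, 1.0], size=(K, R))   # BPSK symbols, +1/-1 equiprobable

# Received tensor: y_ijk = sum_r a_ir * b_jr * s_kr, the trilinear CPA structure
Y = np.einsum('ir,jr,kr->ijk', A, B, S)
```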

Slide 28

[Figure: simulation results for a well-conditioned mixture of dimensions (5, 2, 1000); additional panels: rank overestimated, colored noise.]

Slide 29

[Figure: the proposed method outperforms the orthogonality constraint.]

Slide 30: Conclusion

• We developed a new algorithm, ICA-CPA, that imposes both the independence and the trilinear constraints simultaneously.
• We showed that the method outperforms both standard ICA and CPA in certain situations.
• It should only be used when the assumptions are validated …

Slide 31