Independent Component Analysis – PhD Seminar, Jörgen Ungh


Page 1: Independent Component Analysis – Örebro University (aass.oru.se)

Independent Component Analysis

PhD Seminar – Jörgen Ungh

Page 2:

Agenda

• Background – a motivator
• Independence
• ICA vs. PCA
• Gaussian data
• ICA theory
• Examples

Page 3:

Background & motivation

• The cocktail party problem


Page 4:


Page 5:

Background & motivation

• The cocktail party problem

[Illustration: three speakers s1, s2, s3 recorded by three microphones x1, x2, x3]

Page 6:

Cocktail party problem

• Let s1(t), s2(t) and s3(t) be the original spoken signals
• Let x1(t), x2(t) and x3(t) be the recorded signals
• The connection between s and x can be written:

x1(t) = a11·s1(t) + a12·s2(t) + a13·s3(t)
x2(t) = a21·s1(t) + a22·s2(t) + a23·s3(t)
x3(t) = a31·s1(t) + a32·s2(t) + a33·s3(t)

Goal: Estimate s1, s2 and s3 from x1, x2 and x3.
Problem: We do not know anything about the right-hand side…
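In matrix form this is x = A·s. A minimal numeric sketch of the mixing model (the three source signals and the mixing matrix A below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000                                   # number of samples
t = np.linspace(0, 8, n)

# Three made-up source signals s1(t), s2(t), s3(t)
S = np.vstack([np.sign(np.sin(3 * t)),     # square-like wave
               np.sin(5 * t),              # sinusoid
               rng.uniform(-1, 1, n)])     # uniform noise

# A square mixing matrix A (unknown to the separation algorithm)
A = np.array([[1.0, 0.5, 0.3],
              [0.4, 1.0, 0.6],
              [0.3, 0.7, 1.0]])

# Recorded signals: x_i(t) = a_i1*s1(t) + a_i2*s2(t) + a_i3*s3(t)
X = A @ S
```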

Page 8:

”Today we celebrate our independence day”

- US President THOMAS J. WHITMORE (Bill Pullman) in Independence Day (1996)

Page 9:

Independence – what is it?

Independence = uncorrelatedness?

Page 10:

Definitions

Covariance: C_xy = E{(x − m_x)(y − m_y)^T}

Correlation: R_xy = E{x y^T}

If m_x = m_y = 0, then C_xy = R_xy

Page 11:

Uncorrelated

• Two vectors are uncorrelated if:

C_xy = E{(x − m_x)(y − m_y)^T} = 0

For the correlation this means:

R_xy = E{x y^T} = E{x} E{y}^T = m_x m_y^T

If m_x = m_y = 0, then C_xy = R_xy = 0

…from now on we assume zero-mean variables
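These definitions are straightforward to evaluate on sample data; a small sketch (the two random vector signals are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=(2, n))           # two-dimensional signal, n samples
y = rng.normal(size=(2, n))

mx = x.mean(axis=1, keepdims=True)    # sample mean vectors m_x, m_y
my = y.mean(axis=1, keepdims=True)

C_xy = (x - mx) @ (y - my).T / n      # covariance  E{(x - m_x)(y - m_y)^T}
R_xy = x @ y.T / n                    # correlation E{x y^T}

# With (near) zero-mean data the two matrices coincide
diff = np.abs(C_xy - R_xy).max()
```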

Page 12:

Independent

• Vectors x, y are independent if:

p_x,y(x, y) = p_x(x) · p_y(y)

• Which also gives:

E{g_x(x) g_y(y)} = E{g_x(x)} E{g_y(y)}

• where g_x and g_y are arbitrary functions of x and y

Page 13:

Independent

Independence is stronger than uncorrelatedness!

Uncorrelated: E{x y^T} = E{x} E{y}^T

Independent: E{g_x(x) g_y(y)} = E{g_x(x)} E{g_y(y)}

The two conditions coincide only for linear functions of x and y

Page 14:

Independent ≠ Uncorrelated

[Two scatter plots of y against x]

Are x and y uncorrelated?

Page 15:

Independent ≠ Uncorrelated

[The same two scatter plots]

YES / YES

Are x and y uncorrelated?

Page 16:

Independent ≠ Uncorrelated

[The same two scatter plots]

Are x and y independent?

Page 17:

Independent ≠ Uncorrelated

[The same two scatter plots]

NO / YES

Are x and y independent?

Page 18:

Relations

Independent ⇒ Uncorrelated

BUT

Uncorrelated ⇏ Independent
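A classic counterexample makes the missing direction concrete: take x uniform on [−1, 1] and y = x². Then cov(x, y) = E{x³} = 0, so the pair is uncorrelated, yet y is completely determined by x. A small numeric sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2                          # y is a deterministic function of x

# Uncorrelated: cov(x, y) = E{x y} - E{x}E{y} = E{x^3} = 0 by symmetry
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)

# Not independent: E{g(x) h(y)} != E{g(x)} E{h(y)} for g(x) = x^2, h(y) = y
lhs = np.mean(x**2 * y)             # E{x^4} = 1/5
rhs = np.mean(x**2) * np.mean(y)    # E{x^2}^2 = 1/9
```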

Page 19:

ICA vs. PCA

Independent components vs. principal components

Page 20:

PCA

• Goal: ”Project data onto an orthonormal basis with maximum variance”

• Data explained by principal components

[Plot: data cloud with principal axes e1, e2]

Page 21:

PCA

• Uses information up to the second moment, i.e. the mean and variance/covariance

• Reduces the dimension of the data

• Orthonormal basis of uncorrelated vectors

Page 22:

ICA

• Goal: ”Find the independent sources”

• Data explained by independent components

[Plot: data cloud with independent basis vectors e1, e2]

Page 23:

ICA

• Uses information beyond the second moment, i.e. higher-order statistics like kurtosis and skewness

• Does not reduce the dimension of the data

• A basis of independent vectors

Page 24:

ICA vs. PCA

• Independent is stronger

• In case of Gaussian data, ICA = PCA

Page 25:

Gaussian data

Page 26:

Gaussian distribution

• Definition:

f(x) = 1 / ( (2π)^(N/2) |C|^(1/2) ) · exp( −(1/2) (x − µ)^T C^(−1) (x − µ) )

C = covariance matrix, µ = mean vector

Explained completely by first- and second-order statistics, i.e. mean and (co)variances

Page 27:

Gaussian data

• A rotation of the basis cannot be identified, due to the symmetry of the distribution

Page 28:

Gaussian distribution

• Completely defined by its first and second moments

• Uncorrelated Gaussian data ⇒ Independence

• Why assume Gaussian data?

Page 29:

Central limit theorem• Definition:

”A sum of independent random variables will tendto be Gaussian”

• That is the argument behind many assumptions of Gaussian distributions

Page 30:

Central limit theorem

• Definition:

”A sum of independent random variables willtend to be Gaussian”

What if we put it in another way…?

Page 31:

Central limit theorem

• Second definition:

”A mixture of two or more independent random variables is more Gaussian than the random variables themselves”
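This version of the theorem can be observed directly; the sketch below measures Gaussianity with the excess kurtosis (zero for Gaussian data), using eight uniform variables as an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

def excess_kurtosis(y):
    """E{z^4} - 3 for standardized z; 0 for Gaussian data."""
    z = (y - y.mean()) / y.std()
    return np.mean(z**4) - 3.0

u = rng.uniform(-1, 1, (8, n))       # eight independent uniform variables
single = u[0]                        # one variable on its own
mixture = u.sum(axis=0)              # their sum (a simple "mixture")

k_single = excess_kurtosis(single)   # about -1.2 for a uniform variable
k_mixture = excess_kurtosis(mixture) # much closer to 0
```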

Page 32:

A single uniformly distributed (u.d.) random variable

Page 33:

A mixture of 2 u.d. variables

Page 34:

Idea!

• The observed mixtures should be more Gaussian than the original components

• The original components should be less Gaussian than the mixtures

• If we try to maximize the non-Gaussianity of the data, we should get closer to the original components…

Page 35:

ICA theory

• Problem definition
• Solution
• Preprocessing
• Different methods
• Examples

Page 36:

ICA: Definition of the problem

• Let s1(t), s2(t) and s3(t) be the original signals
• Let x1(t), x2(t) and x3(t) be the collected signals
• The connection between s and x can be written:

x1(t) = a11·s1(t) + a12·s2(t) + a13·s3(t)
x2(t) = a21·s1(t) + a22·s2(t) + a23·s3(t)
x3(t) = a31·s1(t) + a32·s2(t) + a33·s3(t)

Goal: Estimate s1, s2 and s3 from x1, x2 and x3.
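In practice this estimation is usually done with an off-the-shelf algorithm such as FastICA; a sketch using scikit-learn's implementation (the sources and the mixing matrix are invented, and recovery is only up to scale, sign and order):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
t = np.linspace(0, 8, 2000)

# Three made-up non-Gaussian sources, shape (n_samples, n_sources)
S = np.c_[np.sign(np.sin(3 * t)),
          np.sin(5 * t),
          rng.uniform(-1, 1, t.size)]

A = np.array([[1.0, 0.5, 0.3],       # the "unknown" square mixing matrix
              [0.4, 1.0, 0.6],
              [0.3, 0.7, 1.0]])
X = S @ A.T                          # observed mixtures x = A s

ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X)         # estimated sources
```

Each column of S_est should match one column of S up to scale, sign and permutation; which column matches which source cannot be known in advance.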

Page 37:

ICA: Assumption

• Independence

• Non-Gaussian sources

• Square mixing matrix

Page 38:

ICA: Idea

• Maximize the non-Gaussianity of the data!

• We need a measure of ”Gaussianity” or ”non-Gaussianity”

Page 39:

Measures of Gaussianity

1. Kurtosis

Assuming zero mean variables

kurt(y) = E{y^4} − 3 (E{y^2})^2

Page 40:

Measures of Gaussianity

1. Kurtosis

Assuming zero mean and unit variance

kurt(y) = E{y^4} − 3

Page 41:

Measures of Gaussianity

1. Kurtosis

kurt(y) = E{y^4} − 3 (E{y^2})^2

For Gaussian data we have:

E{y^4} = 3 (E{y^2})^2

which gives kurt(y) = 0 for Gaussian data.

For most other distributions, kurt(y) ≠ 0 (positive or negative).
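A quick numeric check of these claims (the sample distributions are arbitrary examples of sub- and super-Gaussian data):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000

def kurt(y):
    """kurt(y) = E{y^4} - 3 for zero-mean, unit-variance y."""
    z = (y - y.mean()) / y.std()
    return np.mean(z**4) - 3.0

k_gauss = kurt(rng.normal(size=n))        # ~ 0:    Gaussian
k_uniform = kurt(rng.uniform(-1, 1, n))   # ~ -1.2: sub-Gaussian (negative)
k_laplace = kurt(rng.laplace(size=n))     # ~ +3:   super-Gaussian (positive)
```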

Page 42:

Measures of Gaussianity

1. Kurtosis

Maximize |kurt(y)|

Page 43:

Measures of Gaussianity

1. Kurtosis

Maximize |kurt(y)|

Advantages:
- Easy to compute

Drawbacks:
- Sensitive to outliers

Page 44:

Measures of Gaussianity

2. Negentropy

J(y) = H(y_gauss) − H(y)

where H is the (differential) entropy:

H(y) = −∫ p_y(η) log p_y(η) dη

Page 45:

Measures of Gaussianity

2. Negentropy

J(y) = H(y_gauss) − H(y)

Gaussian data has the largest entropy (for a given variance), meaning that it is the ”most random” distribution.

Page 46:

Measures of Gaussianity

2. Negentropy

J(y) = H(y_gauss) − H(y)

Gaussian data has the largest entropy (for a given variance), meaning that it is the ”most random” distribution.

J(y) ≥ 0, and equals zero if and only if y is Gaussian

Page 47:

Measures of Gaussianity

2. Negentropy

Maximize J(y)

Advantages:
- Robust

Drawbacks:
- Computationally hard

Page 48:

ICA: Solutions

• Kurtosis
• Negentropy
• Maximum likelihood
• Infomax
• Mutual information
• …

Page 49:

ICA: Solutions

• Kurtosis
• Negentropy
• Maximum likelihood
• Infomax
• Mutual information
• …

Based on independence and/or non-Gaussianity

Page 50:

ICA: Restrictions

• Non-gaussian data*

• Scaling, sign and order of components

• Need to know the No. of components

Page 51:

ICA: Restrictions

• Non-gaussian data*

• Scaling, sign and order of components

• Need to know the No. of components

* In case of some Gaussian data, the non-Gaussian independent components will still be found, but the Gaussian ones will be mixed.

Page 52:

ICA: Preprocessing

• No reduction of dimension in ICA

• Need to know the number of components

• But we already have a method for dimension reduction and for estimating the probable number of components

Page 53:

ICA: Preprocessing

• No reduction of dimension in ICA

• Need to know the number of components

• But we already have a method for dimension reduction and for estimating the probable number of components

Use PCA as a preprocessing step!
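A sketch of that preprocessing step: center the data, eigendecompose the covariance matrix, keep the k strongest directions, and rescale so the result is white (k = 3 and the synthetic 5-channel data are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1000

# Synthetic 5-channel data with only 3 strong directions
latent = rng.normal(size=(5, n)) * np.array([[3.0], [2.0], [1.0], [0.05], [0.01]])
X = rng.normal(size=(5, 5)) @ latent

Xc = X - X.mean(axis=1, keepdims=True)   # 1. center
C = Xc @ Xc.T / n                        # 2. covariance matrix
d, E = np.linalg.eigh(C)                 # 3. eigen-decomposition (d ascending)

k = 3                                    # 4. keep the k strongest components
d_k, E_k = d[-k:], E[:, -k:]
Z = np.diag(d_k ** -0.5) @ E_k.T @ Xc    # 5. project and whiten

# Z now has identity covariance and reduced dimension, ready for ICA
```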

Page 54:

ICA: Preprocessing

• Low-pass filtering
+ Reduces noise
− Reduces independence

• High-pass filtering
+ Increases independence
− Increases noise

Page 55:

ICA: Overlearning

• Occurs with many more mixtures than independent components

• Symptom: spiky character of the estimated components

Page 56:

Examples

• Cocktail party
• Music separation
• Image analysis
• Separation of recorded signals of brain activity
• Process data
• Noise/Signal separation
• Process monitoring

Page 57:

Cocktail party problem


Page 58:

Music separation

Source 1 – Mix 1 – Est 1
Source 2 – Mix 2 – Est 2
Source 3 – Mix 3 – Est 3
Source 4 – Mix 4 – Est 4

http://www.cis.hut.fi/projects/ica/cocktail/cocktail_en.cgi

Page 59:


Page 60:

Image analysis - NLPCA

Page 61:

Page 62:

Page 63:

Page 64:

Brain activity

[Diagram: brain with source locations S1, S2, S3, S4]

Page 65:

Process data

[Plot: Mixed signals – four channels, samples 0–1200]

Page 66:

Process data

[Plot: Whitened signals – four channels, samples 0–1200]

Page 67:

Process data

[Plot: Independent components – four channels, samples 0–1200]

Page 68:

Process data

[Plot: four signals, samples 0–1200]

Page 69:

Noise removal

• Different noise sources:
– Laplacian
– Gaussian
– Uniform
– Exponential

[Plot: example signal, samples 0–1000]

Page 70:

Noise removal - Laplacian

[Plot: Mixed signals – two channels, samples 0–1200]

Page 71:

Noise removal - Laplacian

[Plot: Whitened signals – two channels, samples 0–1200]

Page 72:

Noise removal - Laplacian

[Plot: Independent components – two channels, samples 0–1200]

Page 73:

Noise removal - Gaussian

[Plot: Mixed signals – two channels, samples 0–1200]

Page 74:

Noise removal - Gaussian

[Plot: Whitened signals – two channels, samples 0–1200]

Page 75:

Noise removal - Gaussian

[Plot: Independent components – two channels, samples 0–1200]

Page 76:

Noise removal - Uniform

[Plot: Mixed signals – two channels, samples 0–1200]

Page 77:

Noise removal - Uniform

[Plot: Whitened signals – two channels, samples 0–1200]

Page 78:

Noise removal - Uniform

[Plot: Independent components – two channels, samples 0–1200]

Page 79:

Noise removal - Exponential

[Plot: Mixed signals – two channels, samples 0–1200]

Page 80:

Noise removal - Exponential

[Plot: Whitened signals – two channels, samples 0–1200]

Page 81:

Noise removal - Exponential

[Plot: Independent components – two channels, samples 0–1200]

Page 82:

Process monitoring

• Often done by PCA

• Example: F1, F2

• One step further, use ICA!

Page 83:

Practical considerations

• Noise reduction (filtering)

• Dimension reduction (PCA?)

• Overlearning

• Algorithm

Page 84:

What about time signals?

• So far, no information about time has been used
• In the original ICA model, x is a random variable
• What if x is a time signal x(t)?

[Plot: x(t), samples 0–1000]

Page 85:

Time signal x(t)

• Extra information – the sample order is not random:
– Autocorrelation
– Cross-correlation

• More information ⇒ relaxed assumptions ⇒ Gaussian data OK
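One second-order approach that exploits time structure is to whiten the mixtures and then diagonalize a time-lagged covariance matrix (in the spirit of the AMUSE algorithm; the two sinusoidal sources and the lag of 50 samples below are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(5000)

# Two sources distinguished by their time structure, not their distribution
s1 = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)
s2 = np.sin(0.013 * t + 1.0) + 0.1 * rng.normal(size=t.size)
S = np.vstack([s1, s2])
X = np.array([[1.0, 0.6],
              [0.4, 1.0]]) @ S             # observed mixtures

# Whiten the mixtures
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
Z = np.diag(d ** -0.5) @ E.T @ Xc

# Diagonalize a symmetrized time-lagged covariance of the whitened data
tau = 50
C_tau = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
C_tau = (C_tau + C_tau.T) / 2
_, V = np.linalg.eigh(C_tau)
S_est = V.T @ Z                            # sources up to scale, sign and order
```

Because this uses only second-order (lagged) statistics, it works even when the sources have Gaussian amplitude distributions, as long as their autocorrelations differ.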

Page 86:

Extensions…

• Non-linear ICA

• Independent subspace analysis

Page 87:

Further information:

Book:
Independent Component Analysis – A. Hyvärinen, J. Karhunen, E. Oja
Covers everything from novice to expert

Homepage:
http://www.cis.hut.fi/projects/ica/
Tutorials, material, contacts, Matlab code, …

Journal of Machine Learning Research:
http://jmlr.csail.mit.edu/papers/special/ica03.html
Papers and publications

Toolboxes, code:
http://mole.imm.dtu.dk/toolbox/ica/index.html
http://www.bsp.brain.riken.jp/ICALAB/
http://www.cis.hut.fi/projects/ica/book/links.html