
Page 1

Pattern Recognition: Statistical and Neural

Lonnie C. Ludeman

Lecture 30

Nov 11, 2005

Nanjing University of Science & Technology

Page 2

Lecture 30 Topics

1. General Comments about the Clustering Problem
2. Present my small programs that can be used for performing clustering
3. Demonstrate the programs
4. Closing Comments

Page 3

Clustering is the art of grouping together pattern vectors that in some sense belong together because they have similar characteristics and are different from other pattern vectors.

In the most general problem, the number of clusters or subgroups is unknown, as are the properties that make them similar.

Review

Page 4

Question:

How do we start the process of finding clusters and identifying similarities?

Answer:

First, realize that clustering is an art: there is no single correct answer, only feasible alternatives.

Second, explore the structure of the data, the similarity measures, and the limitations of the various clustering procedures.

Review

Page 5

Problems in performing meaningful clustering

Scaling (see the sketch below)

The nonuniqueness of results

Programs always give clusters even when there are no clusters

Review
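To make the scaling problem concrete, here is a minimal sketch (Python; the data values and function name are hypothetical, not part of the lecture programs) showing how a feature with a large numeric range dominates the Euclidean distance unless the patterns are standardized first.

# Why scaling matters: the large-range feature dominates the raw Euclidean
# distance, and standardizing the columns removes that dominance.
import numpy as np

def standardize(X):
    """Z-score each column: subtract its mean and divide by its standard deviation."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

# Hypothetical patterns: column 0 ranges over 0-1, column 1 over 0-1000.
X = np.array([[0.1, 950.0],
              [0.2, 980.0],
              [0.9,  20.0],
              [0.8,  50.0]])

d_raw = np.linalg.norm(X[0] - X[2])                            # dominated by column 1
d_std = np.linalg.norm(standardize(X)[0] - standardize(X)[2])  # both columns contribute
print(d_raw, d_std)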

Page 6

There are no correct answers; the clusters provide different interpretations of the data, in which the closeness of patterns is measured with different definitions of similarity.

The results may produce ways of looking at the data that we have not considered or noticed. These structural insights may prove useful in the pattern recognition process.

Review

Page 7

Methods for Clustering Quantitative Data

1. K-Means Clustering Algorithm

2. Hierarchical Clustering Algorithm

3. ISODATA Clustering Algorithm

4. Fuzzy Clustering Algorithm

Review

Page 8

K-Means Clustering Algorithm

Randomly select K cluster centers from the pattern space.

Distribute the set of patterns to the cluster centers using the minimum-distance rule.

Compute a new cluster center for each cluster.

Continue this process until the cluster centers no longer change (a minimal sketch follows below).

Review
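A minimal sketch of these four steps in Python, assuming Euclidean distance and initial centers drawn from the pattern set itself; the function name and defaults are my own, and this is not the LCLKmean.exe program.

import numpy as np

def k_means(X, K, rng=np.random.default_rng(0), max_iter=100):
    """X: (N, d) array of pattern vectors. Returns (centers, labels)."""
    # Step 1: randomly select K cluster centers from the pattern space
    # (here, K distinct pattern vectors chosen at random).
    centers = X[rng.choice(len(X), size=K, replace=False)].astype(float)
    for _ in range(max_iter):
        # Step 2: distribute the patterns to the nearest center (minimum distance).
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step 3: compute a new center for each cluster (mean of its patterns).
        new_centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                                else centers[k] for k in range(K)])
        # Step 4: continue until the cluster centers no longer change.
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels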

Page 9

Agglomerative Hierarchical Clustering

Consider a set S of patterns to be clustered:

S = { x1, x2, ... , xk, ... , xN }

Define Level N by

S1(N) = { x1 },  S2(N) = { x2 },  ... ,  SN(N) = { xN }

The clusters at Level N are the individual pattern vectors.

Review

Page 10

Define Level N-1 to be the N-1 clusters formed by merging two of the Level N clusters by the following process.

Compute the distances between all the clusters at level N and merge the two with the smallest distance (resolve ties randomly) to give the Level N-1 clusters as

S1(N-1),  S2(N-1),  ... ,  SN-1(N-1)

The clusters at Level N-1 result from this merging.

Review

Page 11

The process of merging two clusters at each step is performed sequentially until Level 1 is reached. Level 1 is a single cluster containing all of the samples:

S1(1) = { x1, x2, ... , xk, ... , xN}

Thus hierarchical clustering provides cluster assignments for every number of clusters, from N down to 1.

Review
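A minimal sketch of this level-by-level merging in Python, assuming single linkage (smallest pairwise pattern distance) as the between-cluster distance and resolving ties by first occurrence rather than randomly; the names are illustrative, and this is not the LCLHier.exe program.

import numpy as np

def agglomerative_levels(X):
    """Return a dict mapping level -> list of clusters (each a list of pattern indices)."""
    clusters = [[i] for i in range(len(X))]             # Level N: one singleton cluster per pattern
    levels = {len(X): [c[:] for c in clusters]}
    while len(clusters) > 1:
        # Between-cluster distance: smallest distance between member patterns (single linkage).
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(np.linalg.norm(X[i] - X[j])
                        for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] = clusters[a] + clusters[b]          # merge the two closest clusters
        del clusters[b]
        levels[len(clusters)] = [c[:] for c in clusters]
    return levels                                        # cluster assignments for levels N, N-1, ..., 1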

Page 12

Fuzzy C-Means Clustering Preliminary

Given a set S composed of pattern vectors which we wish to cluster:

S = { x1, x2, ... , xN }

Define C cluster membership functions

$\mu_1(x_k), \ \mu_2(x_k), \ \ldots, \ \mu_C(x_k), \qquad k = 1, 2, \ldots, N$

where $\mu_i(x_k)$ gives the degree of membership of pattern $x_k$ in fuzzy cluster $Cl_i$, with $0 \le \mu_i(x_k) \le 1$ and $\sum_{i=1}^{C} \mu_i(x_k) = 1$ for each $k$.

Review

Page 13

Define C cluster centroids as follows:

Let Vi be the cluster centroid for fuzzy cluster Cli, i = 1, 2, ..., C.

Define a performance objective Jm as

$J_m = \sum_{k=1}^{N_s} \sum_{i=1}^{C} \left[ \mu_i(x_k) \right]^m \, \| x_k - V_i \|_A^2$

where

$\| x_k - V_i \|_A^2 = (x_k - V_i)^{\mathsf{T}} A \, (x_k - V_i)$

Review

Page 14

The Fuzzy C-Means algorithm minimizes Jm by selecting the Vi and the membership functions $\mu_i(\cdot)$, i = 1, 2, ..., C, through an alternating iterative procedure, as described in the algorithm's details.

m = fuzziness index (m > 1); higher values give fuzzier clusters

A is a symmetric positive definite matrix

Ns is the total number of pattern vectors

Definitions

Review

Page 15

Fuzzy C-Means Clustering Algorithm: (a) Flow Diagram

[Flow diagram: the centroid and membership updates alternate until a convergence test is passed (No: iterate again; Yes: stop).]

Review
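A minimal sketch of this alternating procedure in Python, taking A = I (ordinary Euclidean distance) and a simple change-in-membership convergence test; the names and defaults are illustrative, and this is not the LCLFuzz.exe program.

import numpy as np

def fuzzy_c_means(X, C, m=2.0, max_iter=100, tol=1e-5, rng=np.random.default_rng(0)):
    """X: (Ns, d) patterns. Returns centroids V (C, d) and memberships U (C, Ns)."""
    Ns = len(X)
    U = rng.random((C, Ns))
    U /= U.sum(axis=0, keepdims=True)                  # memberships of each pattern sum to 1
    for _ in range(max_iter):
        # Centroid update: weighted mean of the patterns with weights mu_i(x_k)^m.
        W = U ** m
        V = (W @ X) / W.sum(axis=1, keepdims=True)
        # Membership update from the distances to the new centroids.
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        U_new = 1.0 / (D ** (2.0 / (m - 1.0)))
        U_new /= U_new.sum(axis=0, keepdims=True)
        if np.abs(U_new - U).max() < tol:              # convergence test (the "Yes" branch)
            U = U_new
            break
        U = U_new
    return V, U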

Page 16

General Programs for Performing Clustering

1. Available commercial packages: SPSS, SAS, GPSS

2. Small programs for classroom use: LCLKmean.exe, LCLHier.exe, LCLFuzz.exe

Page 17

2. Small programs for classroom use

LCLKmean.exe   Uses the K-Means algorithm to cluster small data sets
LCLHier.exe    Performs hierarchical clustering of small data sets
LCLFuzz.exe    Performs fuzzy and crisp clustering of small data sets

Page 18

Data File Format for the LCL Programs

NS = Number of data samples

VS = Data vector size

DATA in row vectors with space between components

File layout:

NS
VS
DATA (one pattern vector per row, components separated by spaces)

Example text file (NS = 5, VS = 3):

5
3
1 6 3
2 0 5
7 1 4
6 6 8
2 2 3
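As an illustration, here is a small Python reader for a file in this layout; it follows only the description on this slide and is not taken from the LCL programs.

import numpy as np

def read_lcl_file(path):
    """Read NS, then VS, then NS row vectors of VS whitespace-separated components."""
    with open(path) as f:
        tokens = f.read().split()
    ns, vs = int(tokens[0]), int(tokens[1])
    data = np.array(tokens[2:2 + ns * vs], dtype=float).reshape(ns, vs)
    return data   # for the example file above: a (5, 3) array of pattern vectors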

Page 19

All of the clustering techniques presented so far use a measure of distance or similarity. Many of these give equal-distance contours that are hyperspheres or hyperellipsoids.

If these techniques are applied directly to patterns that are not describable by those types of regions, we can expect poor results.

Food for Thought
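For example, the equal-distance contour about a center $v$ at distance $r$ under the weighted norm used in the fuzzy C-means definitions is $\{\, x : (x - v)^{\mathsf{T}} A \,(x - v) = r^2 \,\}$: a hypersphere when $A = I$, and a hyperellipsoid (with axes along the eigenvectors of $A$) for a general symmetric positive definite $A$.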

Page 20

In some cases each cluster occupies a limited region (a subspace of the total pattern space) described by a nonlinear functional relation between its components. An example appears below.

[Figure: the existing pattern vectors lie along a nonlinear curve in the pattern space.]

Applying standard K-Means, hierarchical, or fuzzy cluster analysis directly to such data will produce unsatisfactory results.

Page 21

For this type of problem the patterns should first be preprocessed before a clustering procedure is performed.

Two almost contradictory approaches can be used for this processing.

1. Extend the pattern space by techniques comparable to functional link nets so that the clusters can be separated by spherical and elliptical regions (a minimal sketch appears below).

2. Reduce the dimension of the space by a nonlinear form of processing, involving principal-component-like processing, before clustering.
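A minimal sketch of the first approach in Python: each pattern is extended with nonlinear terms (here its squares and pairwise products, a functional-link-style expansion chosen only for illustration), after which any of the standard clustering routines can be applied in the extended space.

import numpy as np
from itertools import combinations

def expand_pattern_space(X):
    """Append the squares and pairwise products of the components to each pattern vector."""
    extra = [X ** 2]
    extra += [(X[:, i] * X[:, j])[:, None] for i, j in combinations(range(X.shape[1]), 2)]
    return np.hstack([X] + extra)

# The expanded patterns can then be clustered with K-Means, hierarchical,
# or fuzzy C-means exactly as before.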

Page 22

Both methods require additional information about the structure of the data. This information may already be known to us, or it may need to be determined.

The process of finding structure within data has been put in the large category of “Data Mining”.

So get a shovel and start looking. Good luck in your search for gold in the mounds of practical data.

Page 23

Several very important topics in Pattern Recognition were not covered in this course because of time limitations. The following topics deserve your special attention to make your educational experience complete:

1. Feature Selection and Extraction

2. Hopfield and feedback neural nets

3. Syntactical Pattern Recognition

4. Statistical Learning Theory

Page 24

I would like to thank Nanjing University of Science & Technology and Lu Jian Feng, Yang Jing-yu, and Wang Han for inviting me to present this course on Statistical and Neural Pattern Recognition.

Page 25

A very special thanks to my new friends Lu Jian Feng, Wang Qiong, and Wang Huan for looking after me. Their kindness and gentle assistance have made my stay in Nanjing a very enjoyable and unforgettable experience.

Page 26

Last but not least, I would like to thank all of you students for your kind attention throughout this course. Without your interest and cheerful faces it would have been difficult for me to teach.

My apologies for teaching in English, which, I am sure, made your work a lot harder.

Best of Luck to all of you in your studies and life.

Page 27

“As you travel through life may all your trails be downhill and the wind always be at your back.”

Bye for now, and I hope our paths cross again in the future. I will have pleasant thoughts about NUST students and faculty, Nanjing, and China as I head back to New Mexico!

Page 28

New Mexico

Land of Enchantment

Page 29

End of Lecture 30