
International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems
Vol. 21, No. 5 (2013) 755−776
© World Scientific Publishing Company
DOI: 10.1142/S0218488513500360

A ROBUST FUZZY CLASSIFICATION MAXIMUM LIKELIHOOD CLUSTERING FRAMEWORK

MIIN-SHEN YANG*,‡, CHIH-YING LIN* and YI-CHENG TIAN*,†

*Department of Applied Mathematics, Chung Yuan Christian University, Chung-Li 32023, Taiwan
†Department of Applied Mathematics, Center for General Education, Hsin Sheng College of Medical Care and Management, Chung-Li, Taiwan
‡[email protected]

*Corresponding author.

Received 16 October 2012
Revised 26 April 2013

In 1993, Yang first extended the classification maximum likelihood (CML) to a so-called fuzzy CML, by combining fuzzy c-partitions with the CML function. Fuzzy c-partitions are generally an extension of hard c-partitions, and the fuzzy CML was claimed to be more robust. However, the fuzzy CML still lacks some robustness as a clustering algorithm, such as its inability to detect clusters of different volumes, its heavy dependence on parameter initialization and the necessity to provide an a priori cluster number. In this paper, we construct a robust fuzzy CML clustering framework. The eigenvalue decomposition of a covariance matrix is first considered in the fuzzy CML model. The Bayesian information criterion (BIC) is then used for model selection, in order to choose the best model with the optimal number of clusters. The proposed robust fuzzy CML clustering framework therefore exhibits clustering characteristics that are robust to parameter initialization, robust to the cluster number, and capable of detecting clusters of different volumes. Numerical examples and real data applications with comparisons are provided, which demonstrate the effectiveness and superiority of the proposed method.

Keywords: Clustering; model-based clustering; k-means; fuzzy clustering; fuzzy c-means; classification maximum likelihood; robustness.

1. Introduction

Data analysis is the science of analyzing real-world data, and cluster analysis is a useful tool for data analysis. It identifies clusters as groups of data points in a data set with the greatest similarity within the same cluster and the largest dissimilarity between different clusters. Cluster analysis is a branch of statistical multivariate analysis.1,2 On the other hand, learning and recognition mostly begin with clustering, so cluster analysis also represents an unsupervised learning approach in pattern recognition.3,4 From the statistical point of view,



clustering methods can be generally categorized as following either a (probability) model-based approach or a nonparametric approach. A model-based clustering approach assumes that the data set obeys a mixture model of probability distributions, so a mixture likelihood approach to clustering is used.5,6 In a nonparametric approach, clustering methods are generally based on an objective function of similarity or dissimilarity measures, and partitional methods are popularly used.2,4 Most partitional methods suppose that the data set can be represented by finite cluster prototypes with their own objective functions. Therefore, the definition of the dissimilarity (or distance) between points and cluster prototypes is essential for partitional methods. The most popular prototype-based partitional methods are k-means,7,8 trimmed k-means,9,10 fuzzy c-means (FCM),11−13 mean shift14−16 and possibilistic c-means (PCM).17−19

There are two categories of model-based clustering algorithms. One is based on the expectation and maximization (EM) algorithm,20,21 and the other is based on the classification maximum likelihood (CML).22−24 The EM algorithm is the most popular, but it is sensitive to initial values and cluster shapes, and an a priori number of components (clusters) must be given.21 Scott and Symons23 first proposed the CML procedure. Yang25 then extended the original CML to the so-called fuzzy CML (FCML), based on fuzzy c-partitions. Banfield and Raftery26 extended the original CML by using the eigenvalue decomposition of a covariance matrix to allow the specification of which features (orientation, size and shape) are common to all clusters and which differ between clusters. Fraley and Raftery27,28 continued to develop a more general model-based clustering procedure.

In the literature, there are some applications of the fuzzy CML.29−33 However, the FCML is not sufficiently robust in clustering: it cannot detect different volumes (sizes and shapes) of clusters, and it also requires an a priori number of clusters together with a parameter initialization. In this paper, a robust framework is constructed for FCML clustering. The eigenvalue decomposition of a covariance matrix is first considered in the fuzzy CML model. The Bayesian information criterion (BIC) is then used to choose the best model with the optimal number of clusters. The remainder of this paper is organized as follows. Section 2 provides a review of the FCML clustering method proposed by Yang.25 In Sec. 3, the robust FCML clustering framework is constructed. Section 4 contains some numerical comparisons of the proposed method with existing methods, and the method is then applied to real data sets. Finally, conclusions are stated in Sec. 5.

2. Fuzzy Classification Maximum Likelihood Clustering

The classification maximum likelihood (CML)22−24 is a classification approach that approximates the maximum likelihood estimates. Let $X=\{x_1,\ldots,x_n\}$ be a random sample of size $n$ from a population that consists of $c$ different subpopulations, where $f(x;\theta_k)$ is the density function of the kth subpopulation with an unknown vector of parameters $\theta_k$, and let $P=(P_1,\ldots,P_c)$ be a partition of $X$. Then, the joint density of $X$ is


$\prod_{k=1}^{c}\prod_{x_i\in P_k} f(x_i;\theta_k)$. In the CML method, $P=(P_1,\ldots,P_c)$ and $\theta=(\theta_1,\ldots,\theta_c)$ are estimated by maximizing the log-likelihood function $\sum_{k=1}^{c}\sum_{x_i\in P_k}\ln f(x_i;\theta_k)$, so the CML objective function $L_{CML}$ for the grouped data $X=\{x_1,\ldots,x_n\}$ is

$$L_{CML}(P,\theta;X)=\sum_{k=1}^{c}\sum_{x_i\in P_k}\ln f(x_i;\theta_k).$$

If a d-variate Gaussian mixture model with mean vectors $a_k$ and covariance matrices $\Sigma_k$ is considered, then the CML objective function $L_{CML}$ becomes

$$L_{CML}(P,a,\Sigma;X)=-\frac{nd}{2}\ln(2\pi)-\frac{1}{2}\sum_{k=1}^{c}n_k\ln|\Sigma_k|-\frac{1}{2}\sum_{k=1}^{c}\sum_{x_i\in P_k}(x_i-a_k)'\Sigma_k^{-1}(x_i-a_k)$$

$$=-\frac{nd}{2}\ln(2\pi)-\frac{1}{2}\sum_{k=1}^{c}n_k\ln|\Sigma_k|-\frac{1}{2}\sum_{k=1}^{c}\mathrm{tr}\Big(\Sigma_k^{-1}\sum_{x_i\in P_k}(x_i-a_k)(x_i-a_k)'\Big),$$

where $n_k=|P_k|$ is the cardinality of $P_k$. If the sample cross-product matrix $W_k$ for the kth cluster is $W_k=\sum_{x_i\in P_k}(x_i-\bar{x}_k)(x_i-\bar{x}_k)'$, then

$$\hat{P}=\arg\max_P L_{CML}(P,\hat{a},\hat{\Sigma};X)=\arg\min_P\sum_{k=1}^{c}n_k\ln\Big|\frac{W_k}{n_k}\Big|.$$

This criterion is the CML originally proposed by Scott and Symons.23
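To make this criterion concrete, the following is a minimal NumPy sketch (ours, not from the paper; the function name and the toy two-blob data are illustrative) that evaluates $\sum_{k=1}^{c}n_k\ln|W_k/n_k|$ for a given hard partition:

```python
import numpy as np

def scott_symons_criterion(X, labels):
    """Evaluate sum_k n_k * ln|W_k / n_k| for a hard partition.

    X      : (n, d) data matrix.
    labels : (n,) integer cluster labels.
    """
    total = 0.0
    for k in np.unique(labels):
        Xk = X[labels == k]                      # points assigned to cluster k
        nk = len(Xk)
        centered = Xk - Xk.mean(axis=0)
        Wk = centered.T @ centered               # sample cross-product matrix W_k
        _, logdet = np.linalg.slogdet(Wk / nk)   # ln|W_k / n_k|, numerically stable
        total += nk * logdet
    return total

# Two well-separated Gaussian blobs; lower criterion values indicate better partitions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(6, 1, (50, 2))])
print(scott_symons_criterion(X, np.repeat([0, 1], 50)))
```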

Fuzzy set theory, as proposed by Zadeh,34 has been widely used in cluster analysis since Ruspini35 introduced the notion of fuzzy c-partitions (see also Refs. 11−13). Yang25 made the fuzzy extension of the original CML to the fuzzy CML (FCML), based on fuzzy c-partitions. This section first introduces the concept of hard and fuzzy c-partitions and then the FCML method.

The CML objective function is $L_{CML}(P,\theta;X)=\sum_{k=1}^{c}\sum_{x_i\in P_k}\ln f(x_i;\theta_k)$. The parameter $\alpha_k>0$ is the proportion of the kth subpopulation in the whole population, with the constraint $\sum_{k=1}^{c}\alpha_k=1$. The indicator functions $z_1,\ldots,z_c$ are used, with $z_j(x)=1$ if $x\in P_j$ and $z_j(x)=0$ if $x\notin P_j$, for all $x$ in $X$ and for all $j=1,\ldots,c$. This is generally known as clustering $X$ into $c$ clusters using $z=(z_1,\ldots,z_c)$ and is called the hard c-partition of $X$. Therefore, the CML objective function becomes

$$L_{CML}(P,\alpha,\theta;X)=\sum_{j=1}^{c}\sum_{x_i\in P_j}\ln\big(\alpha_j f(x_i;\theta_j)\big)=\sum_{j=1}^{c}\sum_{i=1}^{n}z_j(x_i)\ln\big(\alpha_j f(x_i;\theta_j)\big).$$


For the extension that allows the indicator functions $z_j(x_i)$ to be membership functions, the values are assumed to be in the interval [0, 1], such that $\sum_{j=1}^{c}z_j(x_i)=1$ for all $x_i\in X$. In this case, the data set $X$ is said to have a fuzzy c-partition $z=(z_1,\ldots,z_c)$, according to the basic idea of fuzzy sets, as proposed by Zadeh.34

Based on this fuzzy extension, Yang25 proposed the FCML objective function as follows:

$$J_{FCML}(z,\alpha,\theta;X)=\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln f(x_i;\theta_j)+r\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln\alpha_j$$

subject to $\sum_{j=1}^{c}\alpha_j=1$, $\alpha_j>0$ and $\sum_{j=1}^{c}z_{ij}=1$ for $i=1,\ldots,n$, with all $z_{ij}\in[0,1]$. The weighting exponent $m>1$ represents the degree of fuzziness and $r\ge 0$ is a scale parameter for the penalty term $\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln\alpha_j$. Therefore, the FCML procedure25 is created by choosing a fuzzy c-partition $z$, a proportion $\alpha$ and an estimate $\theta$ that maximize $J_{FCML}(z,\alpha,\theta;X)$.
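As an illustration of this objective, here is a small sketch (ours, not the paper's implementation) that evaluates $J_{FCML}$ for given memberships, proportions and Gaussian parameters, using SciPy for the log-densities:

```python
import numpy as np
from scipy.stats import multivariate_normal

def j_fcml(X, z, alpha, means, covs, m=2.0, r=0.5):
    """J_FCML = sum_ij z_ij^m ln f(x_i; theta_j) + r * sum_ij z_ij^m ln alpha_j.

    z     : (n, c) fuzzy memberships, each row summing to 1.
    alpha : (c,) mixing proportions summing to 1.
    """
    c = z.shape[1]
    # (n, c) matrix of ln f(x_i; theta_j) under multivariate normal components.
    logf = np.column_stack([multivariate_normal.logpdf(X, mean=means[j], cov=covs[j])
                            for j in range(c)])
    return np.sum(z**m * logf) + r * np.sum(z**m * np.log(alpha))
```

The FCML procedure then searches for the $(z,\alpha,\theta)$ maximizing this value.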

Yang25 used a multivariate normal distribution with an identity covariance matrix and then derived a penalized fuzzy c-means (PFCM) clustering algorithm, where the PFCM objective function is exactly the FCM objective function with the addition of the penalty term $\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln\alpha_j$. There are several applications of the FCML,29−33 but the FCML algorithm still lacks some robustness, in terms of its inability to detect different sizes and shapes of clusters, its heavy dependence on parameter initialization and the necessity to provide an a priori number of clusters. In the next section, a robust framework for FCML clustering is constructed.

3. The Proposed Robust FCML Clustering Framework

Although the CML is a probability model-based cluster analysis that can be used to understand some existing clustering methods, such as agglomerative hierarchical clustering,36 k-means7,8 and the criterion of Friedman and Rubin,37 the CML does not allow the specification of which features (orientation, size and shape) are common to all clusters and which may differ between clusters. In order to address these drawbacks of the CML, Banfield and Raftery26 first created so-called model-based clustering by extending the CML with the eigenvalue decomposition of the covariance matrix. Banfield and Raftery's26 eigenvalue decomposition approach is first described.

The CML originally proposed by Scott and Symons23 assumes that all the components (clusters) are different under a Gaussian mixture model, with $\hat{P}=\arg\max_P L_{CML}(P,\hat{a},\hat{\Sigma};X)=\arg\min_P\sum_{k=1}^{c}n_k\ln|W_k/n_k|$. In order to allow the specification of which features are common to all clusters and which may differ between clusters, Banfield and Raftery26 re-parameterized the covariance matrix $\Sigma_k$ in terms of its eigenvalue decomposition, with

$$\Sigma_k = D_k \Lambda_k D_k' = \lambda_k D_k A_k D_k',$$

where $\lambda_k=\max\,\mathrm{diag}(\Lambda_k)$ determines the size of the kth cluster, $D_k$ is an orthogonal matrix representing its orientation, and $A_k=\mathrm{diag}(1,a_{k2},\ldots,a_{kd})$ represents its shape. If different constraints on the eigenvalue decomposition are imposed, the criterion for the partition $P$ in the CML becomes the following clustering criteria (a short computational sketch of this re-parameterization is given after the list):

1. When $\lambda_k=\sigma^2$, $A_k=I$ and $D_k=I$ for all $k=1,\ldots,c$, the clustering is spherical and of the same size, with $\hat{P}=\arg\min_P\sum_{k=1}^{c}\mathrm{tr}(W_k)=\arg\min_P\sum_{k=1}^{c}\sum_{x_i\in P_k}\|x_i-\hat{a}_k\|^2$. This criterion was originally proposed by Ward.36

2. When $\lambda_k=\sigma^2$, $A_k=A$ and $D_k=D$ for all $k=1,\ldots,c$, the clustering is ellipsoidal and of the same size, shape and orientation, with $\hat{P}=\arg\min_P\big|\sum_{k=1}^{c}W_k\big|$. This criterion was originally proposed by Friedman and Rubin.37

3. When there is no constraint on $\Sigma_k$ for all $k=1,\ldots,c$, the clustering is ellipsoidal and different in size, shape and orientation, with $\hat{P}=\arg\min_P\sum_{k=1}^{c}n_k\ln|W_k/n_k|$. This criterion is the original one proposed by Scott and Symons.23

4. When $\lambda_k=\sigma_k^2$, $A_k=I$ and $D_k=I$ for all $k=1,\ldots,c$, the clustering is spherical and different in size, with $\hat{P}=\arg\min_P\sum_{k=1}^{c}n_k\ln\big(\mathrm{tr}(W_k)/n_k\big)$. This criterion was proposed by Banfield and Raftery.26

5. When there is only the constraint $A_k=A$, where $A$ is known, for all $k=1,\ldots,c$, each cluster is different in size and orientation but has the same known shape, so $\hat{P}=\arg\min_P\sum_{k=1}^{c}n_k\ln\big(\mathrm{tr}(A^{-1}\Omega_k)/n_k\big)$, where $\Omega_k$ is the diagonal matrix of eigenvalues of $W_k$. This criterion was also proposed by Banfield and Raftery.26
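The re-parameterization behind these criteria is easy to compute. Below is a minimal sketch (ours, for illustration) that recovers the size $\lambda_k$, orientation $D_k$ and shape $A_k$ from a covariance matrix via a symmetric eigendecomposition:

```python
import numpy as np

def volume_shape_orientation(Sigma):
    """Decompose Sigma = lambda * D @ A @ D.T with A = diag(1, a_2, ..., a_d)."""
    eigvals, D = np.linalg.eigh(Sigma)       # ascending eigenvalues, orthonormal D
    eigvals, D = eigvals[::-1], D[:, ::-1]   # descending order, so A starts at 1
    lam = eigvals[0]                         # largest eigenvalue: the cluster size
    A = np.diag(eigvals / lam)               # normalized shape matrix, A[0, 0] = 1
    return lam, D, A

Sigma = np.array([[2.0, 0.5], [0.5, 0.5]])
lam, D, A = volume_shape_orientation(Sigma)
print(np.allclose(lam * D @ A @ D.T, Sigma))  # True: exact reconstruction
```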

For $X=\{x_1,\ldots,x_n\}$, a random sample of size $n$ from a mixture of probability density functions $f(x;\alpha,\theta)$, and a fuzzy c-partition $z=(z_1,\ldots,z_c)$ with values in the interval [0, 1] such that $\sum_{j=1}^{c}z_j(x_i)=1$ for all $x_i\in X$, Yang25 proposed the FCML objective function as follows:

$$J_{FCML}(z,\alpha,\theta;X)=\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln f(x_i;\theta_j)+r\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln\alpha_j$$


subject to $\sum_{j=1}^{c}\alpha_j=1$, $\alpha_j>0$ and $\sum_{j=1}^{c}z_{ij}=1$ for $i=1,\ldots,n$, with all $z_{ij}\in[0,1]$. The eigenvalue decomposition concept is used to advance the FCML model by re-parameterizing the covariance matrix $\Sigma_j$ in terms of its eigenvalue decomposition, with

$$\Sigma_j=D_j\Lambda_j D_j'=\lambda_j D_j A_j D_j'.$$

The distribution $f(x_i;\theta_j)$ is a multivariate normal distribution. The decomposition $\Sigma_j=\lambda_j D_j A_j D_j'$ produces the following five cases.

Case 1: If the parameters of $\Sigma_j$ are allowed to vary, then

$$J_{FCML}(z,\alpha,u,\Sigma;X)=\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln\left(\frac{1}{(2\pi)^{d/2}|\Sigma_j|^{1/2}}\exp\left(-\frac{(x_i-u_j)^t\Sigma_j^{-1}(x_i-u_j)}{2}\right)\right)+r\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln\alpha_j.$$

If $d_{ij}=-\frac{1}{2}\big(\ln|\Sigma_j|+(x_i-u_j)^t\Sigma_j^{-1}(x_i-u_j)+d\ln(2\pi)\big)+r\ln\alpha_j$, then the Lagrangian $\tilde{J}_{FCML}$ of $J_{FCML}$ is

$$\tilde{J}_{FCML}(z,\alpha,u,\Sigma,\lambda_1,\lambda_2)=\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m d_{ij}-\lambda_1\Big(\sum_{j=1}^{c}\alpha_j-1\Big)-\lambda_2\Big(\sum_{j=1}^{c}z_{ij}-1\Big).$$

The first derivatives of the Lagrangian $\tilde{J}_{FCML}$ with respect to the parameters $z_{ij}$, $\alpha_j$, $u_j$ and $\Sigma_j$ are set to 0. Therefore, the necessary conditions for the maximizer $(z,\alpha,u,\Sigma)$ of $J_{FCML}(z,\alpha,u,\Sigma)$ can be derived as follows:

$$\alpha_j=\sum_{i=1}^{n}z_{ij}^m\Big/\sum_{l=1}^{c}\sum_{i=1}^{n}z_{il}^m,\quad j=1,\ldots,c \qquad (1)$$

$$u_j=\sum_{i=1}^{n}z_{ij}^m x_i\Big/\sum_{i=1}^{n}z_{ij}^m,\quad j=1,\ldots,c \qquad (2)$$

$$z_{ij}=d_{ij}^{1/(m-1)}\Big/\sum_{l=1}^{c}d_{il}^{1/(m-1)},\quad j=1,\ldots,c,\; i=1,\ldots,n \qquad (3)$$

$$\Sigma_j=\sum_{i=1}^{n}z_{ij}^m(x_i-u_j)(x_i-u_j)^t\Big/\sum_{i=1}^{n}z_{ij}^m,\quad j=1,\ldots,c \qquad (4)$$
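For concreteness, a minimal sketch (ours; the variable names are illustrative) of one pass of the necessary conditions (1), (2) and (4) for the unconstrained case is:

```python
import numpy as np

def update_vvv(X, z, m=2.0):
    """One pass of Eqs. (1), (2) and (4): proportions, means and covariances
    from the current fuzzy memberships z of shape (n, c)."""
    zm = z**m                                   # z_ij^m
    alpha = zm.sum(axis=0) / zm.sum()           # Eq. (1)
    u = (zm.T @ X) / zm.sum(axis=0)[:, None]    # Eq. (2); row j is u_j
    covs = []
    for j in range(z.shape[1]):
        diff = X - u[j]
        covs.append((zm[:, j, None] * diff).T @ diff / zm[:, j].sum())  # Eq. (4)
    return alpha, u, covs
```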

The factor $r$ of the penalty term $r\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\ln\alpha_j$ in the FCML objective function $J_{FCML}$ is now considered. Since the penalty term is a function of the mixing proportions $\alpha_j$, the factor $r$ must be proportional to the impact of the mixing proportions $\alpha_j$ on $J_{FCML}$. In fact, when $r=0$, the penalty term has no impact on $J_{FCML}$, but when $r=1$, the mixing proportions $\alpha_j\in[0,1]$ with $\sum_{j=1}^{c}\alpha_j=1$ have a full impact on $J_{FCML}$. In this sense, a constraint for $r$ can be set between 0 and 1, i.e. $r\in[0,1]$. A way to estimate the factor $r$ is then proposed. Since the true values of the mixing proportions $\alpha_j$ depend on the cluster structure of the data set, the value of $r$ can be estimated according to its data structure. Because $r\in[0,1]$, the correlation coefficients between the data attributes can be used to estimate $r$, using the following process. Let $X=\{x_1,\ldots,x_n\}$ be a data set with $n$ observations and $d$ attributes. We consider $y_s=(x_{1s},\ldots,x_{ns})$, $s=1,\ldots,d$, as the data vector of the sth attribute of the data set. The correlation coefficients between these attribute vectors,

$$\rho_{y_s,y_t}=\sum_{i=1}^{n}(y_{is}-\bar{y}_s)(y_{it}-\bar{y}_t)\Big/\sqrt{\sum_{i=1}^{n}(y_{is}-\bar{y}_s)^2\sum_{i=1}^{n}(y_{it}-\bar{y}_t)^2},\quad s\neq t,$$

are then calculated. The initial value of $r$ is 1/2, and the value is increased for positive correlation coefficients and decreased for negative correlation coefficients. Therefore, the following estimate is proposed for $r$:

$$r=\frac{1+\sum_{s\neq t}\rho_{y_s,y_t}}{2} \qquad (5)$$

Since the value of $r$ may exceed 1, if $r\ge 1$, we set $r=1$. Similarly, the value of $r$ may become less than 0, so if $r\le 0$, we set $r=0$. In this case, the factor $r$ is an aggregate of the degrees of correlation between the data attributes of the given data set. If the data set has a larger aggregated degree of correlation between attribute vectors, then the mixing proportions $\alpha_j$ have a larger impact on the FCML objective function $J_{FCML}$.
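A small sketch (ours) of this estimate, assuming the reconstruction of Eq. (5) above and applying the clipping rule, is:

```python
import numpy as np

def estimate_r(X):
    """Estimate the penalty factor r from the attribute correlations, Eq. (5):
    start from 1/2, shift by the summed off-diagonal correlations, clip to [0, 1]."""
    rho = np.corrcoef(X, rowvar=False)       # d x d correlation matrix of attributes
    off_diag_sum = rho.sum() - np.trace(rho) # sum of rho_{y_s, y_t} over s != t
    r = (1.0 + off_diag_sum) / 2.0
    return float(np.clip(r, 0.0, 1.0))
```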

Case 2: If the covariance matrix is constrained to $\Sigma_j=\lambda I$, then all clusters are the same size. In this case, the objective function becomes the penalized FCM. The update Eq. (4) is replaced with the following equation:

$$\lambda=\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\,\mathrm{tr}\big((x_i-u_j)(x_i-u_j)^t\big)\Big/\Big(d\sum_{l=1}^{c}\sum_{i=1}^{n}z_{il}^m\Big) \qquad (6)$$

Case 3: If the cluster sizes are allowed to differ, with $\Sigma_j=\lambda_j I$, the update Eq. (4) is replaced with the following equation:

$$\lambda_j=\sum_{i=1}^{n}z_{ij}^m\,\mathrm{tr}\big((x_i-u_j)(x_i-u_j)^t\big)\Big/\Big(d\sum_{i=1}^{n}z_{ij}^m\Big) \qquad (7)$$
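Both size updates are weighted averages of squared distances, since $\mathrm{tr}\big((x_i-u_j)(x_i-u_j)^t\big)=\|x_i-u_j\|^2$; a minimal sketch (ours) of Eqs. (6) and (7) is:

```python
import numpy as np

def update_lambda_ei(X, z, u, m=2.0):
    """Eq. (6): common size lambda for Sigma_j = lambda * I (model ei)."""
    zm, d = z**m, X.shape[1]
    num = sum(np.sum(zm[:, j] * np.sum((X - u[j])**2, axis=1))
              for j in range(z.shape[1]))
    return num / (d * zm.sum())

def update_lambda_vi(X, z, u, m=2.0):
    """Eq. (7): per-cluster sizes lambda_j for Sigma_j = lambda_j * I (model vi)."""
    zm, d = z**m, X.shape[1]
    return np.array([np.sum(zm[:, j] * np.sum((X - u[j])**2, axis=1))
                     / (d * zm[:, j].sum()) for j in range(z.shape[1])])
```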


Case 4: All clusters are the same size and shape, but have different orientations, so the covariance matrix takes the form $\Sigma_j=\lambda D_j A D_j^t$. Let $W_j$ be the fuzzy sample cross-product matrix for the jth cluster, written as $W_j=\sum_{i=1}^{n}z_{ij}^m(x_i-u_j)(x_i-u_j)^t$. Note that $W_j/n_j$, with $n_j=\sum_{i=1}^{n}z_{ij}^m$, is the MLE of $\Sigma_j$. The eigenvalue decomposition of $W_j$ is $W_j=L_j\Omega_j L_j^t$, where $\Omega_j=\mathrm{diag}(w_{1j},\ldots,w_{dj})$ and $w_{mj}$ is the mth eigenvalue of $W_j$. In deriving the update equations, $L_j$ is treated as the estimate of $D_j$, so the update Eq. (4) is replaced with the following equations:

$$\lambda=\sum_{j=1}^{c}\mathrm{tr}(A^{-1}\Omega_j)\Big/\Big(d\sum_{j=1}^{c}\sum_{i=1}^{n}z_{ij}^m\Big) \qquad (8)$$

$$\Sigma_j=\lambda L_j A L_j^t,\quad j=1,\ldots,c \qquad (9)$$

Case 5: When the parameters of the covariance matrix are the same as in Case 4, but the size is allowed to change, more general elliptical shapes are produced. The covariance matrix becomes $\Sigma_j=\lambda_j D_j A D_j^t$. In this case, the update Eq. (4) is replaced with the following equations:

$$\lambda_j=\mathrm{tr}(A^{-1}\Omega_j)\Big/\Big(d\sum_{i=1}^{n}z_{ij}^m\Big) \qquad (10)$$

$$\Sigma_j=\lambda_j L_j A L_j^t,\quad j=1,\ldots,c \qquad (11)$$
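A sketch (ours) of the Case 4 and Case 5 updates, rebuilding $\Sigma_j$ from the eigenvectors of the fuzzy cross-product matrices $W_j$ via Eqs. (8)−(11), is:

```python
import numpy as np

def update_eve_vve(X, z, u, A, m=2.0, common_size=True):
    """Eqs. (8)-(11): Sigma_j = lambda(_j) * L_j A L_j^t (models eve / vve)."""
    zm = z**m
    d, c = X.shape[1], z.shape[1]
    Ainv = np.linalg.inv(A)
    Ls, Omegas = [], []
    for j in range(c):
        diff = X - u[j]
        Wj = (zm[:, j, None] * diff).T @ diff   # fuzzy cross-product matrix W_j
        w, L = np.linalg.eigh(Wj)
        Ls.append(L[:, ::-1])                   # eigenvectors, descending order
        Omegas.append(np.diag(w[::-1]))         # Omega_j = diag(w_1j, ..., w_dj)
    if common_size:                             # eve: Eq. (8) then Eq. (9)
        lam = sum(np.trace(Ainv @ Om) for Om in Omegas) / (d * zm.sum())
        return [lam * L @ A @ L.T for L in Ls]
    lams = [np.trace(Ainv @ Omegas[j]) / (d * zm[:, j].sum())  # vve: Eq. (10)
            for j in range(c)]
    return [lams[j] * Ls[j] @ A @ Ls[j].T for j in range(c)]   # Eq. (11)
```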

Thus, we propose the robust FCML clustering algorithm as follows:

Robust FCML clustering algorithm
Step 1: Fix $m\in(1,\infty)$, fix $2\le c\le n$ and fix any $\varepsilon>0$. Give an initial partition $z^{(0)}$ and let $s=1$. Estimate the factor $r$ of the penalty term using Eq. (5).
Step 2: Compute $\alpha^{(s)}$ and $u^{(s)}$ with $z^{(s-1)}$ by Eqs. (1) and (2).
Step 3: Compute $\Sigma^{(s)}$ with $z^{(s-1)}$ and $u^{(s)}$ by Eq. (4), or (6), or (7), or (8) and (9), or (10) and (11).
Step 4: Update to $z^{(s)}$ with $\alpha^{(s)}$, $u^{(s)}$ and $\Sigma^{(s)}$ by Eq. (3).
Step 5: Compare $u^{(s)}$ to $u^{(s-1)}$ in a convenient matrix norm. If $\|u^{(s)}-u^{(s-1)}\|<\varepsilon$, stop; else let $s=s+1$ and return to Step 2.
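To show how the steps fit together, the following is a compact sketch (ours) of the loop for the vi model ($\Sigma_j=\lambda_j I$), under the reconstructions above; $d_{ij}$ is negated so that the fractional power in Eq. (3) acts on positive values, with a small guard against non-positive entries:

```python
import numpy as np

def robust_fcml_vi(X, c, m=2.0, r=0.5, eps=1e-5, max_iter=200, seed=0):
    """Sketch of the robust FCML loop for the vi model (Sigma_j = lambda_j * I)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    z = rng.dirichlet(np.ones(c), size=n)       # Step 1: random fuzzy partition
    u_prev = np.full((c, d), np.inf)
    for _ in range(max_iter):
        zm = z**m
        alpha = zm.sum(axis=0) / zm.sum()                    # Step 2, Eq. (1)
        u = (zm.T @ X) / zm.sum(axis=0)[:, None]             # Step 2, Eq. (2)
        sq = np.stack([((X - u[j])**2).sum(axis=1) for j in range(c)], axis=1)
        lam = (zm * sq).sum(axis=0) / (d * zm.sum(axis=0))   # Step 3, Eq. (7)
        # Negated d_ij for Sigma_j = lambda_j * I (Gaussian log-density plus penalty).
        dij = 0.5 * (d * np.log(lam) + sq / lam + d * np.log(2 * np.pi)) \
              - r * np.log(alpha)
        dij = np.maximum(dij, 1e-12)            # guard for the fractional power
        z = dij ** (1.0 / (1.0 - m))                         # Step 4, Eq. (3)
        z /= z.sum(axis=1, keepdims=True)
        if np.abs(u - u_prev).max() < eps:                   # Step 5: stop on u
            break
        u_prev = u
    return z, alpha, u, lam
```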

The use of these five cases in the robust FCML clustering algorithm is important. The following notation is used to denote the different cases: vvv for Case 1, ei for Case 2, vi for Case 3, eve for Case 4 and vve for Case 5.


After running the robust FCML clustering algorithm with the different models vvv, ei, vi, eve and vve and with different cluster numbers, the best model with the optimal number of clusters must be selected. The Bayesian information criterion (BIC) is a very suitable tool for model selection, so it is used as the model selection criterion.

The BIC is a model selection method based on the Bayesian approach, using the Bayes factor and posterior model probabilities. The main concept is that if there are models $M_1,\ldots,M_K$ with prior probabilities $p(M_k)$, $k=1,\ldots,K$, then, by Bayes' theorem, $p(M_k|X)\propto p(X|M_k)p(M_k)$, where $p(X|M_k)=\int p(X|\theta_k,M_k)p(\theta_k|M_k)\,d\theta_k$, $p(\theta_k|M_k)$ is the prior distribution of $\theta_k$, and $p(X|M_k)$ is known as the integrated likelihood of the model $M_k$. The Bayesian approach to model selection is to choose the model that is most likely a posteriori. To compare two models $M_1$ and $M_2$, the Bayes factor is defined as the ratio of the two integrated likelihoods, $B_{12}=p(X|M_1)/p(X|M_2)$. However, the main difficulty in using the Bayes factor is the evaluation of the integral that defines the integrated likelihood. For regular models, the integrated likelihood can be approximated by the BIC, using

$$2\ln p(X|M_k)\approx 2\ln p(X|\hat{\theta}_k,M_k)-v_k\ln(n)=BIC_k,$$

where $v_k$ is the number of independent parameters to be estimated in $M_k$. A large BIC score indicates strong evidence for the corresponding model, so the BIC score can be used to compare models with different covariance matrix parameterizations and different numbers of clusters for the robust FCML clustering method. All BIC values are calculated from cluster number 2 up to the given maximum number of clusters for the five different models. Using the plot of the BIC values, the best model with the optimal number of clusters is easily chosen, so a robust FCML clustering framework is constructed, as shown in Fig. 1.
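A minimal sketch (ours) of the BIC-based selection is given below; the fitted log-likelihoods and parameter counts in the example are hypothetical placeholders, not results from the paper:

```python
import numpy as np

def bic_score(loglik, v_k, n):
    """BIC_k = 2 ln p(X | theta_hat_k, M_k) - v_k ln(n); larger is better here."""
    return 2.0 * loglik - v_k * np.log(n)

def select_model(candidates, n):
    """candidates maps (model_name, c) -> (loglik, v_k); returns the best pair."""
    scores = {key: bic_score(ll, vk, n) for key, (ll, vk) in candidates.items()}
    return max(scores, key=scores.get), scores

# Hypothetical values for illustration only.
cands = {("vvv", 2): (-410.0, 11), ("vvv", 3): (-395.0, 17), ("ei", 3): (-420.0, 9)}
best, scores = select_model(cands, n=300)
print(best, round(scores[best], 2))
```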

Fig. 1. Robust FCML clustering framework.


4. Numerical Examples and Real Data Applications

In this section, we present several examples that use simulated data. The proposed method is applied to these simulated data sets of different sizes, shapes and orientations. One of the five models, vvv, ei, vi, eve and vve, is chosen as the best one, with the optimal number of clusters, using the proposed clustering framework. Yang25 proposed the FCML, from which the penalized fuzzy c-means (PFCM) is derived, so that the well-known fuzzy c-means (FCM) becomes a special case of the PFCM. However, the FCML with the PFCM needs an a priori number of clusters to be assigned and a parameter initialization for the factor $r$ of the penalty term. In 1978, Gustafson and Kessel (GK) studied the effect of different cluster shapes, with the exclusion of the spherical shape, by replacing the Euclidean distance $d(x_i,u_j)=\|x_i-u_j\|$ in the FCM objective function with the Mahalanobis distance $d_{A_j}^2(x_i,u_j)=\|x_i-u_j\|_{A_j}^2=(x_i-u_j)^t A_j(x_i-u_j)$, where $A_j$ is a positive definite $d\times d$ matrix whose determinant $\det(A_j)=\rho_j$ is a fixed constant. This is termed the GK algorithm. Some comparisons of the proposed method with the FCM, PFCM and GK algorithms are given, using the artificial data sets.
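For reference, a small sketch (ours) of the GK squared distance is shown below; it uses the usual GK construction $A_j=(\rho_j\det F_j)^{1/d}F_j^{-1}$, where $F_j$ is the cluster covariance estimate, so that $\det(A_j)=\rho_j$ is fixed:

```python
import numpy as np

def gk_distance_sq(x, u, F, rho=1.0):
    """Squared GK distance (x - u)^t A (x - u) with det(A) = rho held fixed."""
    d = len(u)
    A = (rho * np.linalg.det(F)) ** (1.0 / d) * np.linalg.inv(F)
    diff = np.asarray(x) - np.asarray(u)
    return float(diff @ A @ diff)
```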

For real data sets, the method uses the iris, diabetes and Wisconsin breast cancer data sets.

Example 1. In the first example, a three-cluster data set with the same shape, but different sample sizes, is considered. The data set has 300 data points: 200 data points in one cluster and 50 data points in each of two other clusters, as shown in Fig. 2. The proposed method is used for the data set and it is found that the BIC values select the model ei as the best one, with an optimal cluster number of 3, as shown in Fig. 2. The clustering results for the five models vvv, ei, vi, eve and vve are also shown in Fig. 2. The FCM and PFCM are also used in this example. The FCM, PFCM and GK algorithms require a cluster number to be assigned. In fact, these algorithms were implemented with different cluster numbers, but there were always too many misclassifications, except for the cluster number 3, so only the clustering results for the FCM, PFCM and GK with the cluster number 3 were used for the comparison, as shown in Fig. 3. It is seen that the results for the PFCM have fewer misclassifications than those for the FCM, but the proposed method is the best of them. Since the factor $r$ of the penalty term is effective in the PFCM, the PFCM was also implemented for several values of $r$. The results are shown in Table 1.

Fig. 2. Data set and different clustering results and BIC values for Example 1.


Fig. 3. Clustering results of our method, FCM and PFCM for Example 1.

Table 1. Error rates of our method, PFCM and FCM for Example 1.

Methods                   Error rate
GK                        4.3%
FCM (PFCM with r = 0)     2%
PFCM with r = 0.2         1.9%
PFCM with r = 0.4         1.82%
PFCM with r = 0.6         1.74%
PFCM with r = 0.8         1.67%
PFCM with r = 1           1.67%
Our method                1%

The next examples use data with variation in shape. The proposed method produces good clustering results because the suggested estimation of the factor $r$ is proper and the BIC criterion is used to choose a suitable model.

Example 2. This example uses two data sets with line-shaped clusters. In the first data set, the three line-shaped clusters overlap in the middle, as shown in the first graph (original) on the left side of Fig. 4. The proposed method is used first for this data set and it is found that the models ei and vi perform badly, but that the other models produce good clustering results with an optimal cluster number of 3, as shown on the left side of Fig. 4. Using the BIC values shown on the right side of Fig. 4, the model eve is best, with an optimal cluster number of 3. Figure 5 shows the clustering results for the FCM, PFCM and GK, and the proposed method is obviously better than the others. In the second data set, the three line-shaped clusters constitute an A-like character, as shown in Fig. 6. For this data set, the models ei and vi perform badly, but the other models produce correct cluster numbers and good clustering results. The model vve is selected as the best one, with an optimal cluster number of 3. The proposed method is also better than the FCM and PFCM, but GK performs well. The error rates and clustering results are shown in Fig. 7.

Fig. 4. Data set and different clustering results and BIC values for the first data set.

Fig. 5. Error rates and clustering results of our method, FCM and PFCM for the first data set.


Fig. 6. Data set and different clustering results and BIC values for the second data set.

Fig. 7. Error rates and clustering results of our method, FCM and PFCM for the second data set.

Example 3. In this example, a data set is generated from a mixture of three bivariate normal distributions, using an unconstrained covariance matrix. The data set has 200 data points in a spherically-shaped cluster and 50 data points in each of two other clusters, as shown in Fig. 8. Using the BIC values from the proposed method, as shown in Fig. 8, the model vvv is clearly selected as the best one, with an optimal cluster number of 3. For this data set, it is found that the model vvv performs very well, but some other models do not. The error rates and the clustering results for the proposed method, FCM, PFCM and GK are shown in Fig. 9. It is seen that the proposed method still performs well for this data set.


Fig. 8. Data set and different clustering results and BIC values for Example 3.

Fig. 9. Error rates and clustering results of our method, FCM, and PFCM for Example 3.

These examples show that the proposed method works well for these simulated data sets, with different sizes, shapes and orientations. The next examples use real data sets: the iris, diabetes and Wisconsin breast cancer data sets.

Example 4. (Iris data38) The Iris data is a typical test data set for many classification techniques in pattern recognition. The data consists of 50 samples from each of three species of iris flowers (iris setosa, iris virginica and iris versicolor). The attributes of the iris data are the length and the width of the sepal and the petal. The scatterplot for the Iris data is shown in Fig. 10 (see Wikipedia). Figure 10 shows that there are two clusters in each plot, with the larger cluster containing two overlapping clusters. The proposed method was implemented for the Iris data set. Using random initial memberships $z^{(0)}$, it is found that there are two groups of BIC values for the Iris data set, as shown in Fig. 11. Both groups of BIC values always choose the model vve as the best one. One group produces an optimal cluster number of 2, as shown on the left of Fig. 11, while the other produces an optimal cluster number of 3, as shown on the right of Fig. 11. The proposed method was run 200 times with random initial memberships $z^{(0)}$; approximately 70% of the runs produced a group of BIC values with an optimal cluster number of 2, and approximately 30% produced a group of BIC values with an optimal cluster number of 3. For the case with an optimal cluster number of 3, the best model, vve, is more than 85% accurate in correctly detecting the three clusters. In comparison, the second-best model, eve, is more than 82% accurate in correctly detecting the three clusters.

Fig. 10. Iris data set.

Fig. 11. Two groups of BIC values for Iris data set.


Example 5. (Diabetes data set39) This example uses the diabetes data set, which has 145 observations and 3 attributes (Reaven and Miller39). The meaning of the attributes is as follows: Glucose is the plasma glucose response to oral glucose, Insulin is the plasma insulin response to oral glucose, and Sspg is the degree of insulin resistance. This data set is clinically clustered into 3 groups (chemical diabetes, overt diabetes and normal). Figure 12 is a 3D plot of the diabetes data. It is seen that these clusters overlap and are clearly not spherical in shape. The triangular symbols in Fig. 12 denote the cluster of normal patients; the structure of this cluster has a "fat middle". The circular and square symbols in Fig. 12 denote the clusters for chemical and overt diabetes, respectively; the shapes of these two clusters resemble two wings. The varying structure of this data means that many clustering methods are unsuitable. The data set was used with the proposed method and the results were compared with those for the FCM and PFCM, for a given cluster number of 3. First, the BIC values from the proposed method are shown in Fig. 13. It is seen that the best model is vvv, with an optimal cluster number of 3; this is the cluster number used for the FCM and PFCM. The error rates and clustering results for the proposed method, the FCM and the PFCM are shown in Fig. 14. It is seen that the proposed method performs better than the FCM and the PFCM for the diabetes data set.

Fig. 12. The 3D plot of the diabetes data set.

Fig. 13. BIC values for the diabetes data set obtained with our method.


Fig. 14. Error rates and clustering results from our method, FCM and PFCM.

Example 6. (Wisconsin breast cancer data set38,40) This example uses the proposed method to analyze the real Wisconsin breast cancer data set, which is taken from the UCI Machine Learning Repository38 (see also Wolberg and Mangasarian40). There are 569 instances and 32 attributes in this two-cluster data set. The first attribute of this data is the ID number and the second attribute is the diagnosis, which is treated as the true classification (M = malignant, B = benign). The other attributes are real-valued features computed for each cell nucleus: radius, texture, perimeter, area, etc. The first two indicator attributes were initially ignored and the description of the results for this data from the UCI website was used. The attributes worst area, worst smoothness and mean texture were retained, and the resulting three-dimensional data was used with the proposed method. The proper cluster model, vvv, has an optimal cluster number of 2, as seen in Fig. 15. Using the cluster number 2, the FCM and PFCM were also implemented for this data. The accuracy of the proposed method is 94.5%, which is better than that of the other clustering methods, as shown in Table 2.

Table 2. Average accuracies for different clustering methods.

Clustering method    Our method    PFCM with r = 1    FCM (PFCM with r = 0)
Optimal accuracy     94.5%         92.1%              91.9%

Fig. 15. BIC values of our method for Wisconsin breast cancer dataset.


Example 7. This example uses a three-component bivariate Gaussian mixture distribution with the parameters (see Chatzis and Varvarigou41)

$$\alpha_1=\alpha_2=\alpha_3=\frac{1}{3};\qquad u_1=(0,3)^T,\; u_2=(3,0)^T,\; u_3=(-3,0)^T;$$

$$\Sigma_1=\begin{pmatrix}2 & 0.5\\ 0.5 & 0.5\end{pmatrix},\qquad \Sigma_2=\begin{pmatrix}1 & 0\\ 0 & 0.1\end{pmatrix},\qquad \Sigma_3=\begin{pmatrix}2 & -0.5\\ -0.5 & 0.5\end{pmatrix}.$$

A data set is generated from this Gaussian mixture model with a sample size of n = 2400, as shown in Fig. 16. The proposed method is first implemented on this data set, and the BIC values select the model vvv as the best one, with an optimal cluster number of 3, as shown in Fig. 17. In order to determine the effect of noise on the proposed method, 600 noisy points drawn from a uniform distribution over the range [−20, 20] are added, as shown in Fig. 18. The proposed method is then implemented on this noisy data set and the BIC values again select the model vvv as the best one, but with an optimal cluster number of 4, as shown in Fig. 19. For this noisy data set, the clustering is not good; that is, the proposed method is heavily affected by the noisy points. On adding 600 noisy points drawn from a uniform distribution over the range [−10, 10] instead, as shown in Fig. 20, the proposed method was again implemented and the BIC values also select the model vvv as the best one, with an optimal cluster number of 4, as shown in Fig. 21. However, for this noisy data set, the clustering is still good and is not affected too much by the noisy points. Overall, a comparison with Chatzis and Varvarigou41 shows that the proposed method is not significantly tolerant of noise or outliers; it may treat the noisy points as another cluster, as shown in Fig. 21.
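For reproducibility, a sketch (ours) that draws a comparable data set from the stated mixture, plus the uniform noise of the first noisy variant, is given below; sampling exactly 800 points per component is a simplification of the equal mixing proportions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Mixture parameters from Example 7 (alpha_1 = alpha_2 = alpha_3 = 1/3).
means = [np.array([0.0, 3.0]), np.array([3.0, 0.0]), np.array([-3.0, 0.0])]
covs = [np.array([[2.0, 0.5], [0.5, 0.5]]),
        np.array([[1.0, 0.0], [0.0, 0.1]]),
        np.array([[2.0, -0.5], [-0.5, 0.5]])]

# n = 2400 points (800 per component), as in the clean data set of Fig. 16.
X = np.vstack([rng.multivariate_normal(mu, S, size=800)
               for mu, S in zip(means, covs)])

# 600 uniform noise points over [-20, 20]^2, as in the first noisy variant.
X_noisy = np.vstack([X, rng.uniform(-20, 20, size=(600, 2))])
print(X.shape, X_noisy.shape)  # (2400, 2) (3000, 2)
```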

It is worth noting that the proposed robust fuzzy CML clustering framework exhibits clustering characteristics that are robust to parameter initialization, the cluster number and different volumes of clusters, but not to noise and outliers. In the literature, there are some studies of clustering with robustness to noise and outliers that use multivariate t-distributions, such as those of Chatzis and Varvarigou,41,42 Peel and McLachlan43 and Greselin and Ingrassia.44 Further research will extend this robust fuzzy CML clustering framework to multivariate t-distributions.


Fig. 16. Three-component bivariate Gaussian mixture data set.


[Figure: clustering results for the Example 7 data set, shown in panels labeled Original, VVV, EI and VI over roughly [−5, 5] × [−2, 4]; the individual plotted cluster labels are not recoverable from the text extraction.]

13

22

3

1

2

13

222 2

11

2

2

3333

2

3333 1 1

2

1

2

33

33

2

1

2

3

2 22 2

3 1

22

1

21

2

13

2

3

2

11

2

33

2

3 1 111 1

22

1

2

1

22

3

2

2

113

33

3 11 111

2

11 1

2 2

33113

2

13

22

33

2

3

22

113

22

2

31111

3 1

22

31

1

2

3

2

3

2

3

2

12

33

2

13 1

2

22

3

2

3 13

22

1

2

32 2

11

2

133 1

2

3

22 22

2

1

2

1 1 1

2

1

22

3 113 13

22

111

2

3

2

1

22

332

1

2 2

3

2

133

13 13 11

2

33

2

13 13

113 11

2

3 3 1

3

2

33

13

2

111

2

133 11

2

3

22

3

2

1

22

1

222

2

1 1

22

2

33 1 1

2

33

13

2

133

2

13

333

2

1 13

222

3

2

1

222

2

3 13

13

2

13 3

2

22 2

1

23

2

12

1

22

1

2

3

22

1113

2

1

2

1

322

3

2

33 1

33

2

3

222

2

1

2

22

1

3

111

3

2 2

3

2

2

3 1

3

33

2

3333 11

1

2

313

31 1

2

333

3

1113 1

2

1 13

2

3 1 1 11 113 11

2 2

3

1

1

2

1

2 2

13 11 1

22

1

2

1 13 1 133

2

3 11

3

22

11

2 2

33

2

1

2

3 3

2 22

111 13 13

22

3

2

1 1

22

3333

13

1

222

311

2

1

22

13 113 33 3

2

3

2

33

2

31

3 1

22

22

3

22

2

13

13

2

11

22

111

3

2

33

22

111

3 13 3 1

2

13

22

222

1

22

3 3

2

3 1

22

3 1

2

11

2

1

2

33

2

13

22

13

2

33

2

3

2

2

3 1

2

31

113

2313 1

2

33

1 1

22

13 1

2

11

22

3 1 113 133

313 3

2

1

2

11

2

113 1

2

13

2

1

2

111

2

2

1

2

11 1 13

11

2

11

2

3

2

3 13 13

2

113

3

1

2 2

1 1 1

2

1

2

2

133

3

3

1 11

222

1333

1 1133

3

2

3 113 1

1 13

22

133 13 1

2

1

2

3 13 131 1 1

2

33

2

3

3

2

3 1311

3

311

13

33 1 1

2

1

2

113

2

3 13 1

2

1

2

1 111

222

2 22

1113

2

1

2

3

22

11

33 133

2

3

113

33

31

22

3

2

133

31 1 1

333

2

2

32

1

2

3

3 1111

3

3

2

31

2

3 1

22

3 1133

2

3 3

2

313 3 1 1313

2

22

33 1

22

3

12

13 113 13

3

1

222

113

3 3 1

2222

1 13

2

1

32

3

2

11

22

3

22

133 133 1

23

1 1

2 2

133 133

2

1

2 2

1113

2

3

2 2

2

13

2

1

2

3

2

1

3

22

22222

31

2 2

1

32

3 3 22 2

33

1

22

11

2

1

2

1

2

3 1113

2

1

22

2

3

111

23

11

1

2 2

1

22

13 1

2

13

1

2

1

2

2

1

2

1

222

33 3

3

22

1

2

13 313

3

2

3 13 1 113 1

2

3 3 1

2

3

2

13

33

1333

2

2

3 33 3 1

2

33 1

222

2

3 11

222

333 1

11131

2

11 11

3 3

2

3

EVE

-5 0 5-2

0

2

4

3

2 2

333

1

2

13

31

222

11

2

331

2

33

3 11

2

33333

2

1

2

31 1

2

1

2

1 1

2

133

1

222

11

2

3 1

2

1

2

113331

23

3

2

13

2

3 3

31

2

1

2

11

22

2

1

2

3 11

2

1

2

22

33 1

22

33

2

13

1

2

13 13

22

32

3

2

3 13 1 13

2

1 1

2

1

22

2

3 13

113 1

2

13 113 1 1 1

313

222

13

2

13

2

3

2

133

3

2

1

2 2

2

1

2

1 1

22

3 1 1

2 2

3

2

11

3 3

3

1

2

13

3 113 1

2 2

3

33

22

33 1

2

313

3

3

2

3

2

3 33

2

3

2

33

333

2

1

2

1 133

2

1

22

333

2

1111

2 2

31 1

3

2 22

2

1 133

3 1133

2

3

2

311

2

13

2233

2

3 2

1

22 2

22

1

2 2

31

2

1

233

13 1 11

2

3 11

2

1 13

2

22

3 3

2

3

23

1113 1

2

31

31

2

133

2

33

2

3

2 2

3

22

1

2

3

1

1

2

2

333 1

22

1

23

2

3

3

3 33 3

2 2

11

2

31

2

13

1 113 13

222

3 1

2

13

3 1113

1

2

11

2

3 1

2

3

33 11

2

3113 3

2

3 11

2

133

2

3 13 3

13 1

2

33

2

33

1 13

2

11

2

11

2

33

3 31 1

13

222

3 1 13

3 13

113

33

33

22

1

22

1

22

22

23

2

2

1

22

1

2

33

2

1

2

3

222

1

2

1 13

2

3

22

3133

2

1131

2

3 1

2 2

33

222

2

3 1

2

11

2

3

22

1

2

3 1

22

3

2

113

1

2 2

13 131

2

11

3 3

2

3 133

2

1

2

311

11

2

33

22

1 11

22

23

2

3 13 13

111

2

11 11333

313 1

22

13 133

2

33

2

3 3 13

2

3

2

133

2333 1 1

2

33

13

3

222

33

1

2

13 11

2

13

1

322

3

2

1

2

1

22

3 3311

22

3 31 1

22

13 13 11

22

313

2

1 1

222

13

223

1113

22

2

3

22

22

3

2

13

2

33 1

22

32

33 1

2 2

13

2

3 13

2

33 3

31

2

1

2

3

33

2 22

3 1

22

32

1 1 13

333 11

2

13

2

1

2

31

3 3

33

2

2

13 1

23

222

2

3

22

11

2

13

2

1

2

1 1

2

1

2

13

2

3

2

1 11

22

33 1

2

133

2

1

22

1

2

11

222

13 11

2

1111

22

131 1

33

31

3

2

3 13

22

22

2

322

3 1

2

2

313

2

3

2

3

22

33 1 1

2

1

2

3

22

3

2

3

2

3

22

1

2

1

2

31

2

1

2

3 3

2

33 113

2

11

222

33 3

2

11

2

13

2

33 1

222

1

22

331

2

3

2

1

22

1

2

13 3 1 1 1

222

3

222

113 11

2

1

2

222

1

2

22

1 11

2

3

2

2

2

1

22

13

2

3

22

3 1

22

3

3 11

2

11133 3

22

13

33

1

311

2

1 13

3

2

3 1

2

3

2

13

22

1 13

2

2 2

1

23

3 1

22

1

2

113 1

2

113

2

11

22

31 13

11 1

2

3

22

11

22

2

13 1

2 2

1

2

1333

2

3 113 1

2

3 1

2

133

131 133 3 113

222 2

2

133

3

2

1

2

1131

22

1 133

3

22

31 1

3

22

13 13113 133

13 1

3

2 22 2

1311 11

1 133

3

3

2

22

32

3 1

2

11

23

11

22 2

33

2

3

22

1 1

2

2

13

3

2

13

22

3

2

31

31

13

1311

2

11

2

13

23

3

2

13 113 11

22

1

2 2

13 13

22

33

2

33 3

31

3

1

2

3

223

3

2

3

2 22

11 11

2

3 13

2

3

2

3 111

22

1

2

3

222

3 13 1

2 2

33 3

2

1133 33

13

22

3

1

2

13

222 2

11

2

2

3333

2

3333 1 1

2

1

2

33

33

2

1

2

3

2 22 2

3 1

22

1

21

2

13

2

3

2

11

2

33

2

3 1 111 1

22

1

2

1

22

3

2

2

113

33

3 11 111

2

11 1

2 2

33113

2

13

22

33

2

3

22

113

22

2

31111

3 1

22

31

1

2

3

2

3

2

3

2

12

33

2

13 1

2

22

3

2

3 13

22

1

2

32 2

11

2

133 1

2

3

22 22

2

1

2

1 1 1

2

1

22

3 113 13

22

111

2

3

2

1

22

332

1

2 2

3

2

133

13 13 11

2

33

2

13 13

113 11

2

3 3 1

3

2

33

13

2

111

2

133 11

2

3

22

3

2

1

22

1

222

2

1 1

22

2

33 1 1

2

33

13

2

133

2

13

333

2

1 13

222

3

2

1

222

2

3 13

13

2

13 3

2

22 2

1

23

2

12

1

22

1

2

3

22

1113

2

1

2

1

322

3

2

33 1

33

2

3

222

2

1

2

22

1

3

111

3

2 2

3

2

2

3 1

3

33

2

333311

1

2

313

31 1

2

333

3

1113 1

2

1 13

2

3 1 1 11 113 11

2 2

3

3

1

2

1

2 2

13 11 1

22

1

2

1 13 1 133

2

3 13

3

22

11

2 2

33

2

3

2

3 3

2 22

111 13 13

22

3

2

1 1

22

3333

13

1

222

311

2

1

22

33 113 33 3

2

3

2

33

2

31

3 1

22

22

3

22

2

13

13

2

11

22

111

3

2

33

22

111

3 13 3 1

2

13

22

222

1

22

3 3

2

3 1

22

3 3

2

11

2

1

2

33

2

13

22

13

2

33

2

3

2

2

3 1

2

31

113

2313 1

2

33

1 1

22

13 1

2

11

22

3 1 113 133

313 3

2

1

2

11

2

113 1

2

13

2

1

2

113

2

2

1

2

11 1 13

11

2

11

2

3

2

3 13 13

2

113

3

1

2 2

1 1 1

2

1

2

2

133

3

3

1 11

222

1333

1 1133

3

2

3 113 1

1 13

22

133 13 1

2

1

2

3 13 131 1 1

2

33

2

3

3

2

3 1311

3

311

13

33 1 1

2

1

2

113

2

3 13 1

2

1

2

1 111

222

222

11

13

2

1

2

3

22

11

33 133

2

3

113

33

31

22

3

2

133

31 1 1

333

2

2

32

1

2

3

3 1111

3

3

2

31

2

3 1

22

3 1133

2

3 3

2

313 3 1 1313

2

22

33 1

22

3

12

13 113 13

3

1

222

113

3 3 1

2222

1 13

2

1

32

3

2

11

22

3

22

133 133 1

23

1 1

2 2

333 133

2

1

2 2

1113

2

3

2 2

2

13

2

1

2

3

2

1

3

22

22222

31

2 2

1

32

3 3 22 2

33

1

22

11

2

1

2

1

2

3 1113

2

1

22

2

3

311

23

11

1

2 2

1

22

13 1

2

13

3

2

1

2

2

1

2

1

222

33 3

3

22

1

2

13 313

3

2

3 13 1 113 1

2

3 3 1

2

3

2

13

33

1333

2

2

3 33 3 1

2

33 1

222

2

3 11

222

333 1

11131

2

11 11

3 3

2

3

VVE

2 2.5 3 3.5 4 4.5 5 5.5 6 6.5 7-2.15

-2.1

-2.05

-2

-1.95

-1.9

-1.85

-1.8

-1.75x 10

4

cluster numbersB

IC v

alu

es

VVV

EI

VI

EVE

VVE

Fig. 17. Data set without noise, the different clustering results, and the BIC values for Example 7.
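The BIC-based selection summarized in Fig. 17 can be illustrated with the following minimal sketch; it is not the paper's implementation. It scans cluster numbers c = 2, ..., 7 and several covariance structures with scikit-learn's GaussianMixture, keeping the model that minimizes BIC. Note that scikit-learn's covariance_type options ('full', 'tied', 'diag', 'spherical') are only loose analogues of the VVV, EI, VI, EVE and VVE parameterizations of the eigenvalue-decomposed covariance family used here.

from sklearn.mixture import GaussianMixture

def bic_scan(X, c_range=range(2, 8),
             cov_types=("full", "tied", "diag", "spherical")):
    # Fit a Gaussian mixture for every (covariance structure, c) pair and
    # record its BIC; scikit-learn defines bic() so that lower is better.
    table = {}
    for cov in cov_types:
        for c in c_range:
            gm = GaussianMixture(n_components=c, covariance_type=cov,
                                 n_init=5, random_state=0).fit(X)
            table[(cov, c)] = gm.bic(X)
    best = min(table, key=table.get)  # (covariance structure, cluster number)
    return best, table

Under this convention the selected pair plays the role of the best model with the optimal number of clusters in the figure; the paper's own procedure is the robust fuzzy CML framework rather than a plain EM mixture fit.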


Fig. 18. Gaussian mixture data set with 600 noisy points from a uniform distribution on [−20,20].
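A data set of this kind can be generated along the following lines, a minimal sketch with hypothetical mixture parameters (the exact means and covariances of the example are not assumed here): a two-dimensional Gaussian mixture sample is contaminated with 600 points drawn uniformly on [-20, 20] in each coordinate.

import numpy as np

rng = np.random.default_rng(0)
# Hypothetical mixture parameters, for illustration only.
means = [(-12.0, 0.0), (0.0, 0.0), (12.0, 0.0)]
cov = [[4.0, 0.0], [0.0, 2.0]]
clusters = [rng.multivariate_normal(m, cov, size=200) for m in means]
# 600 noisy points from a uniform distribution on [-20, 20] per coordinate.
noise = rng.uniform(-20.0, 20.0, size=(600, 2))
X = np.vstack(clusters + [noise])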

[Figure: the original noisy data set and scatter-plot panels labeled VVV, EI, VI, EVE and VVE showing clustering results, followed by a plot of BIC values against cluster numbers (2 to 7) with a legend for the five models.]

Fig. 19. Data set with 600 noisy points, the different clustering results, and the BIC values for Example 7.



Fig. 20. Gaussian mixture data set with 600 noisy points from a uniform distribution on [−10,10].

[Figure: the original noisy data set and scatter-plot panels labeled VVV, EI, VI and EVE showing clustering results.]

3

2

6

3

46

22

1

11 2

6 6

2

63

1

32

6

4

1

1 4

1

6 66

1

6

531

2

61

2

12

6

1

66

6

12

14

6

3

22

22

1

66

1

222

2

6

3

22

6

5

6

511

1

5

2 2

661

2

66

1 2

6

3

643 5

4

1

22

5

2 2

6

16

6

364

34

22

16

11

6

5

6 6

1

1 1 2

6

22

6

211

1

4

2

4

6

511

52

2

626

33

6

1

6

3

2

2

1

2

16

5

4

36

31

6

12

6

2

2

1 666

2

6

1

11

64

1

6

21

1

6

1

2

3

2

1

1

2

6

2

6

2

6

2

66

2

1 3

1

3

66

1

22

6 6

2

2

6

2

21

1

2

1 4

2

6

5

22

6

1

66

22

3

1

43

6

2

1

111

16 6

1

12

11

16

1

41

6

2

23

66

2

6

1

1

2

1

21

6

66

1

6

641 3

2

6

21

3

1 61

2

22 2

6

22

46

1

11

6

1 2

1

6

1

22

2

166

1

11

22

11

11

2

64

2

6

2

3

11 22

2

6

11

14 4

64

6

2

2

23

6

1

1

1

61 6

2

66

3

66

22

6

2

66

1

11

1

3

2

1

11

1

1

6

5

2 2

6

2

54

2

1

22

2

6

1

2

4

1 3

22 2

1

65

1

1

2

6

233 63

22

1

6

22

3

66

4

2

6

2

22 22

41

1

2

64 6

6

2

1 6

64

1 3

6

2

4

6

222

3

2 21

4466 6

2

2

6

2

61 6

3

6

2

34

6

6

2

6

2

36

1

1

31

2

3

2

46

1

1

31

1

6

6

2

1

6

VVE

2 2.5 3 3.5 4 4.5 5 5.5 6 6.5 7-3.4

-3.3

-3.2

-3.1

-3

-2.9

-2.8x 10

4

cluster numbers

BIC

va

lue

s

VVV

EI

VI

EVE

VVE

Fig. 21. The data set with 600 noisy points, the different clustering results, and the BIC values for Example 7.

5. Conclusions

Using the fuzzy classification maximum likelihood (FCML) with five eigenvalue decomposition models and BIC model selection, we propose a robust FCML clustering framework. The proposed clustering method achieves robust clustering results: it is robust to different cluster volumes and to the number of clusters, and it is free of the parameter initialization problem. Several simulated and real data sets are used to demonstrate the effectiveness and usefulness of the proposed model. A comparison with the FCM, PFCM, and GK algorithms shows that the proposed method produces better results for both the simulated and the real data sets. In fact, the FCM algorithm can be derived through the FCML, so the proposed model can be seen as a probability model for the FCM. Further study of other relationships and properties, such as stochastic convergence and limiting theorems for the FCM algorithm, is worthwhile under the proposed model. New fuzzy clustering algorithms may also be developed using the proposed robust FCML clustering framework.
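
As a concrete illustration of the BIC-based model selection step, the sketch below searches over candidate covariance structures and cluster numbers and keeps the fit with the best BIC. It is a minimal sketch, not the FCML procedure itself: scikit-learn's GaussianMixture (an EM-fitted Gaussian mixture) stands in for the FCML iterations, its four covariance types only roughly parallel the five eigenvalue-decomposition models, and the toy data set is invented for illustration.

import numpy as np
from sklearn.mixture import GaussianMixture

# Toy data: three well-separated two-dimensional Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc, 1.0, size=(200, 2))
               for loc in ([-5.0, 0.0], [0.0, 0.0], [5.0, 0.0])])

# Search over covariance structures and candidate cluster numbers (2 to 7,
# as in Fig. 21). scikit-learn's bic() returns -2 log L + m log n, so the
# smallest value identifies the selected model.
best = None
for cov_type in ("spherical", "diag", "tied", "full"):
    for c in range(2, 8):
        gmm = GaussianMixture(n_components=c, covariance_type=cov_type,
                              n_init=5, random_state=0).fit(X)
        bic = gmm.bic(X)
        if best is None or bic < best[0]:
            best = (bic, cov_type, c)

print("selected: covariance =", best[1], "; clusters =", best[2])

In the proposed framework, the inner fit would instead run the FCML iterations for each of the five eigenvalue-decomposition models, but the outer BIC selection loop has the same structure.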


Acknowledgements

The authors are grateful to the anonymous referees for their constructive comments and

suggestions to improve the presentation of the paper. This work was supported in part by

the National Science Council of Taiwan, under Grant NSC101-2118-M-033-002-MY2.

References

1. J. A. Hartigan, Clustering Algorithms (Wiley, New York, 1975).

2. L. Kaufman and P. J. Rousseeuw, Finding Groups in Data: An Introduction to Cluster

Analysis (Wiley, New York, 1990).

3. R. Duda and P. Hart, Pattern Classification and Scene Analysis (Wiley, New York, 1973).

4. A. K. Jain and R. C. Dubes, Algorithms for Clustering Data (Prentice-Hall, New Jersey,

1988).

5. G. J. McLachlan and K. E. Basford, Mixture Models: Inference and Applications to

Clustering (Marcel Dekker, New York, 1988).

6. S. Zhong and J. Ghosh, A unified framework for model-based clustering, Journal of Machine

Learning Research 4 (2003) 1001−1037.

7. J. MacQueen, Some methods for classification and analysis of multivariate observations,

Proc. 5th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, University

of California Press, Berkeley, 1967, pp. 281−297.

8. C. Fraley and A. E. Raftery, How many clusters? Which clustering method? Answers via

model-based cluster analysis, Computer Journal 41 (1998) 586−588.

9. L. A. Garcia-Escudero and A. Gordaliza, Robustness properties of k means and trimmed k

means, Journal of the American Statistical Association 94 (1999) 956−969.

10. J. A. Cuesta-Albertos, A. Gordaliza and C. Matran, Trimmed k-means: an attempt to

robustify quantizers, The Annals of Statistics 25 (1997) 553−576.

11. J. C. Bezdek, Pattern Recognition with Fuzzy Objective Function Algorithms (Plenum Press,

New York, 1981).

12. F. Hoppner, F. Klawonn, R. Kruse and T. Runkler, Fuzzy Cluster Analysis: Methods for

Classification, Data Analysis and Image Recognition (Wiley, New York, 1999).

13. M. S. Yang, A survey of fuzzy clustering, Mathematical and Computer Modelling 18 (1993)

1−16.

14. Y. Cheng, Mean shift, mode seeking, and clustering, IEEE Trans. Pattern Analysis and

Machine Intelligence 17 (1995) 790−799.

15. K. Fukunaga and L.D. Hostetler, The estimation of the gradient of a density function with

applications in pattern recognition, IEEE Trans. Information Theory 21 (1975) 32−40.

16. K. L. Wu and M. S. Yang, Mean shift-based clustering, Pattern Recognition 40 (2007)

3035−3052.

17. R. Krishnapuram and J. M. Keller, A possibilistic approach to clustering, IEEE Trans. Fuzzy

Systems 1 (1993) 98−110.

18. N. R. Pal, K. Pal, J. M. Keller and J. C. Bezdek, A possibilistic fuzzy c-means clustering

algorithm, IEEE Trans. Fuzzy Systems 13 (2005) 517−530.

19. M. S. Yang and K. L. Wu, Unsupervised possibilistic clustering, Pattern Recognition 39

(2006) 5−21.

20. A. P. Dempster, N. M. Laird and D. B. Rubin, Maximum likelihood from incomplete data via

the EM algorithm (with discussion), J. Roy. Stat. Soc., Ser. B 39 (1977) 1−38.

21. G. J. McLachlan and T. Krishnan, The EM Algorithm and Extensions (Wiley, New York,

1997).

22. P. Bryant and J. A. Williamson, Asymptotic behaviour of classification maximum likelihood

estimates, Biometrika 65 (1978) 273−281.


23. A. J. Scott and M. J. Symons, Clustering methods based on likelihood ratio criteria,

Biometrics 27 (1971) 387−397.

24. M. J. Symons, Clustering criteria and multivariate normal mixtures, Biometrics 37 (1981)

35−43.

25. M. S. Yang, On a class of fuzzy classification maximum likelihood procedures, Fuzzy Sets

and Systems 57 (1993) 365−375.

26. J. D. Banfield and A. E. Raftery, Model-based Gaussian and non-Gaussian clustering,

Biometrics 49 (1993) 803−821.

27. C. Fraley and A. E. Raftery, How many clusters? Which clustering method? Answers via

model-based cluster analysis, Computer Journal 41 (1998) 586−588.

28. C. Fraley and A. E. Raftery, Model-based clustering, discriminant analysis, and density

estimation, Journal of the American Statistical Association 97 (2002) 611−631.

29. W. L. Hung, Y. C. Chang and S. C. Chuang, Fuzzy classification maximum likelihood

algorithms for mixed-Weibull distributions, Soft Computing 12 (2008) 1013−1018.

30. C. Y. Lin and C. B. Chen, An invisible hybrid color image system using spread vector

quantization neural networks with penalized FCM, Pattern Recognition 40 (2007)

1685−1694.

31. J. S. Lin, K. S. Cheng and C. W. Mao, Segmentation of multispectral magnetic resonance

image using penalized fuzzy competitive learning network, Computers and Biomedical

Research 29 (1996) 314−326.

32. S. H. Liu and J. S. Lin, Vector quantization in DCT domain using fuzzy possibilistic c-means

based on penalized and compensated constraints, Pattern Recognition 35 (2002) 2201−2211.

33. M. S. Yang and N. Y. Yu, Estimation of parameters in latent class models using fuzzy

clustering algorithms, European Journal of Operational Research 160 (2005) 515−531.

34. L. A. Zadeh, Fuzzy sets, Information and Control 8 (1965) 338−356.

35. E. Ruspini, A new approach to clustering, Information and Control 15 (1969) 22−32.

36. J. H. Ward, Hierarchical grouping to optimize an objective function, Journal of the

American Statistical Association 58 (1963) 236−244.

37. H. P. Friedman and J. Rubin, On some invariant criteria for grouping data, Journal of the

American Statistical Association 62 (1967) 1159−1178.

38. C. L. Blake and C. J. Merz, UCI repository of machine learning databases, a huge collection

of artificial and real-world data sets, 1998. Website http://archive.ics.uci.edu/ml/datasets.html.

39. G. M. Reaven and R. G. Miller, An attempt to define the nature of chemical diabetes using a

multidimensional analysis, Diabetologia 16 (1979) 17−24.

40. W. H. Wolberg and O. L. Mangasarian, Multisurface method of pattern separation for

medical diagnosis applied to breast cytology, Proc. National Academy of Sciences 87 (1990)

9193−9196.

41. S. Chatzis and T. Varvarigou, Robust fuzzy clustering using mixtures of Student’s-t

distributions, Pattern Recognition Letters 29 (2008) 1901−1905.

42. S. Chatzis and T. Varvarigou, Factor analysis latent subspace modeling and robust fuzzy

clustering using t-distributions, IEEE Transactions on Fuzzy Systems 17 (2009) 505−517.

43. D. Peel and G. J. McLachlan, Robust mixture modelling using the t distribution, Statistics

and Computing 10 (2000) 339−348.

44. F. Greselin and S. Ingrassia, Constrained monotone EM algorithms for mixtures of

multivariate t distributions, Statistics and Computing 20 (2010) 9−22.