Particle Swarm Optimization (PSO)


Page 1: Particle Swarm Optimization (PSO)

Information Fusion
By

Dr. B. K. Panigrahi

Asst. Professor

Department of Electrical Engineering

IIT Delhi, New Delhi-110016

Page 2: Particle Swarm Optimization (PSO)


OUTLINE

• Introduction
• Classification
• K-fold cross Validation
• Feature selection by GA
• Multiple Classifiers System
• Information Fusion Methods


Page 3: Particle Swarm Optimization (PSO)


Introduction

• Information Fusion refers to the field of study of techniques attempting to merge information from disparate sources despite differing conceptual, contextual and typographical representations.

• Pattern recognition aims to classify data patterns based either on a priori knowledge or on statistical information extracted from the patterns.

• The patterns to be classified are usually groups of measurements or observations, defining points in an appropriate multidimensional space.


Page 4: Particle Swarm Optimization (PSO)


OUTLINE

• Introduction
• Classification
• K-fold cross Validation
• Feature selection by GA
• Multiple Classifiers System
• Information Fusion Methods


Page 5: Particle Swarm Optimization (PSO)


Classification

(Diagram: Data Modeling → Feature Extraction → Classification → Visualization & Interpretation)

• Classification is regarded as one of the most important fields of machine intelligence
• The aim in this process is to build a model that can recognize the samples
• The model must be able to classify a given pattern into one or more classes
• Methods used as classifiers:
  – Neural Networks – LVQ, PNN, MLP, etc.
  – Decision trees
  – Support Vector Machines


Page 6: Particle Swarm Optimization (PSO)


Classification: Definitions

• A “classifier” is any mapping from the space of features (measurements) to a space of class labels (names, tags, distances, probabilities)

• A classifier is a hypothesis about the real relation between features and class labels

• A “learning algorithm” is a method to construct hypotheses

• A learning algorithm applied to a set of samples (training set) outputs a classifier


Page 7: Particle Swarm Optimization (PSO)


Classification

• Error on the training data is not a good indicator of performance on future data

Q: Why?
A: Because new data will probably not be exactly the same as the training data!

• Overfitting – fitting the training data too precisely – usually leads to poor results on new data


Page 8: Particle Swarm Optimization (PSO)


Classification

The natural performance measure for classification problems is the error rate:

• Success: the instance’s class is predicted correctly
• Error: the instance’s class is predicted incorrectly
• Error rate: proportion of errors made over the whole set of instances

The training-set error rate is far too optimistic: you can find patterns even in random data.


Page 9: Particle Swarm Optimization (PSO)


Classification – Step 1

Split the data into train and test sets.

(Diagram: labelled data divided into a training set and a testing set)


Page 10: Particle Swarm Optimization (PSO)


Classification – Step 2

Build a model on the training set.

(Diagram: the training set is fed to a model builder, which produces the model; the testing set is held back)


Page 11: Particle Swarm Optimization (PSO)


Classification – Step 3

Evaluate on the test set.

(Diagram: the model built on the training set is applied to the testing set to obtain predictions)
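The three steps can be sketched in a few lines. This is a minimal illustration rather than the authors' code: scikit-learn, the toy data set and the decision-tree classifier are placeholder choices.

```python
# Minimal sketch of Steps 1-3 (scikit-learn and the decision tree are illustrative choices).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)                  # placeholder data set

# Step 1: split data into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Step 2: build a model on the training set
model = DecisionTreeClassifier().fit(X_train, y_train)

# Step 3: evaluate on the test set
predictions = model.predict(X_test)
print("test error rate:", 1.0 - accuracy_score(y_test, predictions))
```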


Page 12: Particle Swarm Optimization (PSO)


OUTLINE

• Introduction
• Classification
• K-fold cross Validation
• Feature selection by GA
• Multiple Classifiers System
• Information Fusion Methods


Page 13: Particle Swarm Optimization (PSO)


K-fold cross validation (small data set):
1. The data is split into k subsets of equal size.
2. Each subset in turn is used for testing and the remainder for training.
3. The accuracy estimates are averaged to yield an overall accuracy estimate.

Simple classification process (large data set):
1. Split the data into train and test sets.
2. Build a model on the training set.
3. Evaluate on the test set.


Page 14: Particle Swarm Optimization (PSO)


K-fold cross validation:

• Break the data up into groups (folds) of the same size
• Keep the first fold for testing and the others for training the model
• Repeat the procedure for all the folds
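A minimal sketch of this procedure, assuming scikit-learn; the 10 folds and the decision-tree model are illustrative, not prescribed by the slides.

```python
# k-fold cross validation: each fold is held out for testing once; accuracies are averaged.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)                  # placeholder data set
accuracies = []
for train_idx, test_idx in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    model = DecisionTreeClassifier().fit(X[train_idx], y[train_idx])
    accuracies.append(accuracy_score(y[test_idx], model.predict(X[test_idx])))

print("overall accuracy estimate:", np.mean(accuracies))
```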


Page 15: Particle Swarm Optimization (PSO)


Classification with K-fold cross validation

(Diagram: the data is split by 10-fold validation; the classification algorithm builds a model, its predictions on the held-out test data are compared with the targets, and the algorithm is evaluated)


Page 16: Particle Swarm Optimization (PSO)


OUTLINE

• Introduction
• Classification
• K-fold cross Validation
• Feature selection by GA
• Multiple Classifiers System
• Information Fusion Methods


Page 17: Particle Swarm Optimization (PSO)


Feature selection by GA

• Feature vector spaces are HUGE
• Many algorithms are sensitive to the number of parameters
• Feature selection is defined as the selection of a subset of features to describe a phenomenon from a larger set that may contain irrelevant or redundant features


Page 18: Particle Swarm Optimization (PSO)


Feature selection by GA

• Advantages
  – Reducing dimensionality
  – Improving learning efficiency
  – Increasing predictive accuracy
  – Reducing complexity of learned results
• Mutual Information
  – Iteratively eliminate features with the least mutual information with the other remaining features
• Genetic Algorithms
  – Try lots of subsets and choose the “best”


Page 19: Particle Swarm Optimization (PSO)


Genetic Algorithms

• Genetic Algorithms – adaptive search and optimisation techniques based on the principles of survival of the fittest (C. Darwin) and genetics (G. J. Mendel)
  – The structure of a living being, i.e. a creature, is “built” by decoding a set of chromosomes
  – Organisms which are well adapted to the environment are allowed to reproduce more often than those which are not
• Holland, 1975 – GAs as an attempt to explain algorithmically the diversity of species and individuals in nature


Page 20: Particle Swarm Optimization (PSO)


Genetic Algorithms (contd.)

• Stochastic search algorithms based on the principle of natural selection
• Chromosomes represent potential solutions
  – Binary or integer coding
• Population-based search
• Crossover and mutation operations for the selection of new individuals
• Mutation introduces genetic diversity (new information)
• Computationally intense


Page 21: Particle Swarm Optimization (PSO)


Genetic Algorithms (contd.)

Process:
• Encoding and Decoding
• Cross Over – single point, two point, multi point and uniform crossover
• Mutation – single bit mutation
• Selection – Roulette Wheel selection and Tournament selection


Page 22: Particle Swarm Optimization (PSO)


Genetic Algorithms (contd.)

Encoding

1 0 1 0 1 0 0 0 1 . . . 1 1 1

• Each 0/1 bit represents exclusion/inclusion of the corresponding feature
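A small sketch of how such a bit string decodes into a feature subset; the array names and the random placeholder data are illustrative only.

```python
import numpy as np

chromosome = np.array([1, 0, 1, 0, 1, 0, 0, 0, 1])   # one bit per candidate feature
X = np.random.rand(200, chromosome.size)             # placeholder feature matrix

# Decoding: keep the columns whose bit is 1 (inclusion), drop those whose bit is 0 (exclusion).
X_selected = X[:, chromosome == 1]
print("selected feature indices:", np.flatnonzero(chromosome))
```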


Page 23: Particle Swarm Optimization (PSO)


GA Operational Diagram

(Diagram: Source → Encoding → Fitness Computation → Roulette Wheel Selection → Mating Pool (P1: 1 0 1 0 1, P2: 1 0 0 0 0, P3: 1 1 1 0 1, P4: 0 1 0 0 1) → Cross Over (P1, P2 → C1: 1 0 0 0 1, C2: 1 0 1 0 0) and Mutation (P3 → C3: 1 1 0 0 1) → Offspring (C1, C2, C3) → Decoding → Solution)


Page 24: Particle Swarm Optimization (PSO)


Genetic Algorithm Flowchart


Page 25: Particle Swarm Optimization (PSO)


Genetic Algorithms (contd.)

• Cross Over
  – A recombination operator
  – Offspring are created by exchanging information among parent strings
• Mutation
  – Produces spontaneous random changes
  – Responsible for the injection of new information
  – This prevents premature convergence to local optima
• Fitness Function Computation
  – The GA works on a maximization problem. Here, 10-fold cross validation is performed, and the average of the 10 accuracies is taken as the fitness function to be maximized.
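A hedged sketch of this fitness computation, assuming scikit-learn's cross_val_score; a plain k-NN classifier stands in for the FkNN classifier used later in the slides.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(chromosome, X, y):
    """Average 10-fold cross-validation accuracy on the features selected by the bit string."""
    mask = np.asarray(chromosome, dtype=bool)
    if not mask.any():                               # an empty subset gets the worst fitness
        return 0.0
    scores = cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=10)
    return scores.mean()
```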


Page 26: Particle Swarm Optimization (PSO)


Genetic Algorithms (contd.)

Selection Procedure (Implementation)

Roulette wheel selection:
• Sum the fitness of all chromosomes, call it T
• Generate a random number N between 1 and T
• Return the chromosome whose fitness, added to the running total, is equal to or larger than N
• The chance of being selected is exactly proportional to fitness

Chromosome:       1    2    3    4    5    6
Fitness:          8    2    17   7    4    11
Running total:    8    10   27   34   38   49
N (1 ≤ N ≤ 49):   23
Selected:         3

Tournament selection:
• Randomly choose a group of T individuals from the population.
• Select the fittest one among them.
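A short sketch of the two selection schemes, reusing the fitness values from the worked example above; this is an illustrative implementation, not the authors' code.

```python
import random

fitness = {1: 8, 2: 2, 3: 17, 4: 7, 5: 4, 6: 11}      # chromosomes 1..6 from the example

def roulette_wheel(fitness):
    T = sum(fitness.values())                          # total fitness
    N = random.uniform(0, T)                           # random point on the wheel (1..T in the slides)
    running = 0.0
    for chromosome, f in fitness.items():
        running += f
        if running >= N:                               # first chromosome whose running total reaches N
            return chromosome

def tournament(fitness, size=3):
    group = random.sample(list(fitness), size)         # random group of T individuals
    return max(group, key=fitness.get)                 # the fittest one wins

print(roulette_wheel(fitness), tournament(fitness))
```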


Page 27: Particle Swarm Optimization (PSO)


Genetic Algorithms (contd.)

How can we guarantee that the best member of a population will survive?

Elitist model: the best member of the current population is always carried over into the next population.


Page 28: Particle Swarm Optimization (PSO)


Feature selection by GA

(Diagram: Data → GA Feature Selection ⇄ Classifier)

• Relies on a predetermined classification algorithm
• Uses the 10-fold average accuracy as the goodness measure
• High accuracy, but computationally expensive (a compact sketch of such a wrapper is given below)
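A compact sketch of such a GA wrapper, combining the encoding, fitness, selection, crossover, mutation and elitism pieces from the previous slides. Parameter values and helper names are illustrative, and it assumes a fitness(chromosome, X, y) function like the one sketched earlier.

```python
import random
import numpy as np

def ga_feature_selection(X, y, fitness, pop_size=20, generations=30,
                         crossover_rate=0.8, mutation_rate=0.02):
    n_features = X.shape[1]
    population = [np.random.randint(0, 2, n_features) for _ in range(pop_size)]

    for _ in range(generations):
        scores = [fitness(c, X, y) for c in population]
        elite = population[int(np.argmax(scores))].copy()    # elitist model: the best member survives

        def tournament():
            i, j = random.randrange(pop_size), random.randrange(pop_size)
            return population[i] if scores[i] >= scores[j] else population[j]

        children = [elite]
        while len(children) < pop_size:
            p1, p2 = tournament().copy(), tournament().copy()
            if random.random() < crossover_rate:              # single-point crossover
                point = random.randrange(1, n_features)
                p1[point:], p2[point:] = p2[point:].copy(), p1[point:].copy()
            for child in (p1, p2):
                if random.random() < mutation_rate:           # single-bit mutation
                    child[random.randrange(n_features)] ^= 1
                children.append(child)
        population = children[:pop_size]

    scores = [fitness(c, X, y) for c in population]
    return population[int(np.argmax(scores))]                 # best feature mask found
```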


Page 29: Particle Swarm Optimization (PSO)


GA Feature Selection for Wavelet Packet

• The 4th level of the wavelet packet decomposition contains 16 nodes; from each node, among 6 features, only one is selected by GA.

Optimally selected features by GA:

Node  Selected Feature     Node  Selected Feature     Node  Selected Feature     Node  Selected Feature
0     Mean                 4     Kurtosis             8     Energy               12    Standard Deviation
1     Kurtosis             5     Standard Deviation   9     Kurtosis             13    Energy
2     Skewness             6     Entropy              10    Energy               14    Standard Deviation
3     Mean                 7     Kurtosis             11    Mean                 15    Kurtosis


Page 30: Particle Swarm Optimization (PSO)


Results: GA Feature Selection for Wavelet Packet

Accuracy obtained by the FkNN classifier (pure signal, 10-fold average):

Case         Energy (16)   Entropy (16)   Energy & Entropy (32)   All (96)   GA selected features (16)
Pure signal  97.90         97.90          98.08                   99.03      99.3857


Page 31: Particle Swarm Optimization (PSO)


OUTLINE

• Introduction
• Classification
• K-fold cross Validation
• Feature selection by GA
• Multiple Classifiers System
• Information Fusion Methods


Page 32: Particle Swarm Optimization (PSO)


Multiple Classifiers System

• Named multiple experts, mixture of experts, cooperative agents, classifier ensembles, multiple classifier systems, etc. by different authors.

• A multiple classifier system (MCS) is a structured way to combine (exploit) the outputs of individual classifiers.

• The final decision is taken by decision fusion techniques called combiners.


Page 33: Particle Swarm Optimization (PSO)


Multiple Classifiers System

(Diagram: Parallel architecture of an MCS – the data is fed to CLASSIFIER 1 … CLASSIFIER n, whose decisions D1 … Dn are merged by a COMBINER into the final decision D)


Page 34: Particle Swarm Optimization (PSO)


Multiple Classifiers System

(Diagram: Architecture of an MCS with sensor fusion – the data itself is grouped and given to different classifiers)


Page 35: Particle Swarm Optimization (PSO)


OUTLINE

• Introduction
• Classification
• K-fold cross Validation
• Feature selection by GA
• Multiple Classifiers System
• Information Fusion Methods


Page 36: Particle Swarm Optimization (PSO)


Information Fusion Methods

• Abstract level
• Rank level
• Measurement level
• ANN
• Agent based fusion


Page 37: Particle Swarm Optimization (PSO)


Fusion Methods Based on Classifier Outputs

• Different information levels merit different fusion schemes
• Abstract Level: a classifier only outputs a unique label
• Rank Level: a classifier ranks all labels, or a subset of the labels, in a queue with the label at the top being the first choice
• Measurement Level: each classifier attributes to each label a measurement value expressing the degree to which the sample belongs to that label

Page 38: Particle Swarm Optimization (PSO)


Abstract-level Fusion Methods

• Voting methods
• Behaviour Knowledge Space (BKS)
• Bayes belief method

Page 39: Particle Swarm Optimization (PSO)


Abstract-level Fusion Methods

Majority voting method:

The final decision is the most frequent class among the individual classifiers' outputs.

(Diagram: the data is fed to CLASSIFIER 1 … CLASSIFIER n, whose decisions D1 … Dn go to a majority-voting block that outputs the final decision D)
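A minimal sketch of majority voting over the decisions D1..Dn (illustrative labels only).

```python
from collections import Counter

def majority_vote(decisions):
    """Return the most frequent class label among the individual classifier decisions D1..Dn."""
    return Counter(decisions).most_common(1)[0][0]

print(majority_vote(["class A", "class B", "class A", "class A", "class B"]))   # -> "class A"
```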

Page 40: Particle Swarm Optimization (PSO)


Abstract-level Fusion Methods

Behaviour Knowledge Space (BKS):

• Every combination of classifier outputs is regarded as a cell in a look-up table
• Each cell contains the number of samples of the validation set characterized by a particular value of the class labels

Class    D1D2D3: 000   001   010   011   100   101   110   111
0                100   50    76    89    54    78    87    5
1                8     88    17    95    20    90    95    100

P(c = 0 | D1 = 0, D2 = 1, D3 = 0) = 76 / (76 + 17) = 0.82 ≥ Threshold
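A small sketch of a BKS look-up table built from validation data, using the counts from the table above; the helper names and the 0.5 threshold are illustrative.

```python
from collections import defaultdict

bks = defaultdict(lambda: defaultdict(int))      # cell (joint classifier output) -> per-class counts

def record(outputs, true_class):
    """Count one validation sample in the cell indexed by the classifiers' joint output."""
    bks[tuple(outputs)][true_class] += 1

def decide(outputs, threshold=0.5):
    """Pick the most represented class in the cell, if its proportion reaches the threshold."""
    cell = bks[tuple(outputs)]
    total = sum(cell.values())
    if total == 0:
        return None                              # unseen combination: reject / fall back
    best = max(cell, key=cell.get)
    return best if cell[best] / total >= threshold else None

# The cell from the slide: D1=0, D2=1, D3=0 holds 76 samples of class 0 and 17 of class 1.
bks[(0, 1, 0)] = defaultdict(int, {0: 76, 1: 17})
print(decide((0, 1, 0)))                         # 76 / (76 + 17) ≈ 0.82 -> class 0
```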

Page 41: Particle Swarm Optimization (PSO)


Abstract-level Fusion Methods

Bayes belief method:

It assumes mutual independence of the classifiers and considers that the error matrix, called the confusion matrix, is known.

C(x) \ A(m)    1      2      …      M
1              n11    n12    …      n1M
2              n21    n22    …      n2M
⋮
M              nM1    nM2    …      nMM

nij = number of events from C(i) classified as C(j)

Page 42: Particle Swarm Optimization (PSO)


Abstract-level Fusion Methods

Bayes belief method:

Sample x is assigned to class i if its probability is larger than that of every other class:

P(Ci | D1, D2, …, Dn) > P(Cj | D1, D2, …, Dn)   ∀ j ≠ i
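A hedged sketch of the Bayes belief combination under the independence assumption: each classifier's confusion matrix gives the probability of each true class given its decision, the per-classifier beliefs are multiplied, and the class with the largest combined belief is chosen. The function signature is illustrative; the slides only state the decision rule.

```python
import numpy as np

def bayes_belief(confusion_matrices, decisions):
    """
    confusion_matrices: one M x M matrix per classifier, where entry (i, j) is the number
                        of class-i validation samples that the classifier labelled as class j.
    decisions:          the labels D1..Dn output by the classifiers for the current sample.
    Returns the class index i with the largest combined belief (classifiers assumed independent).
    """
    n_classes = confusion_matrices[0].shape[0]
    belief = np.ones(n_classes)
    for confusion, d in zip(confusion_matrices, decisions):
        column = confusion[:, d].astype(float)
        belief *= column / column.sum()           # P(true class = i | this classifier said d)
    return int(np.argmax(belief))
```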

Page 43: Particle Swarm Optimization (PSO)


Rank-level Fusion Methods

• Based on democratic election strategies
• Some classifiers provide class “scores”, or some sort of class probabilities
• This information can be used to “rank” each class
• For example, classifier N: Pc1 = 0.20 → Rc1 = 1; Pc2 = 0.69 → Rc2 = 3; Pc3 = 0.34 → Rc3 = 2
• In general, if Ω = {c1, …, ck} is the set of classes, the classifiers can provide an “ordered” (ranked) list of class labels

Page 44: Particle Swarm Optimization (PSO)


The Borda Count Method

• Let the number of classifiers be N = 3 and the number of classes k = 4, with class labels Ω = {a, b, c, d}

• For a given pattern, the ranked outputs of the three classifiers are as follows:

Rank value   Classifier 1   Classifier 2   Classifier 3
4            c              a              b
3            b              b              a
2            d              d              c
1            a              c              d

Page 45: Particle Swarm Optimization (PSO)


The Borda Count Method

So we have:

r_a = r_a^1 + r_a^2 + r_a^3 = 1 + 4 + 3 = 8
r_b = r_b^1 + r_b^2 + r_b^3 = 3 + 3 + 4 = 10
r_c = r_c^1 + r_c^2 + r_c^3 = 4 + 1 + 2 = 7
r_d = r_d^1 + r_d^2 + r_d^3 = 2 + 2 + 1 = 5

The winner-class is b because it has the maximum overall rank
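A short sketch of the Borda count on the rankings from the example above (illustrative only).

```python
from collections import defaultdict

# Rank values assigned by the three classifiers (4 = first choice, 1 = last choice).
rankings = [
    {"c": 4, "b": 3, "d": 2, "a": 1},    # Classifier 1
    {"a": 4, "b": 3, "d": 2, "c": 1},    # Classifier 2
    {"b": 4, "a": 3, "c": 2, "d": 1},    # Classifier 3
]

borda = defaultdict(int)
for ranking in rankings:
    for label, rank in ranking.items():
        borda[label] += rank                 # sums: a = 8, b = 10, c = 7, d = 5

print(max(borda, key=borda.get))             # 'b' wins with the maximum overall rank
```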

Page 46: Particle Swarm Optimization (PSO)


Measurement-level Fusion Methods

Linear combiners:

• Simple and weighted averaging of the classifiers' outputs.
• Weighted averaging is required for imbalanced classifiers, i.e. classifiers with different accuracies and/or different pairwise correlations.

P_avg,i(x) = Σ_{k=1..N} W_k · P_i^k(x)

where P_i^k is the probability measure of the kth classifier for the ith class.
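A minimal sketch of the linear (weighted-average) combiner defined above; the probabilities and weights are illustrative.

```python
import numpy as np

def weighted_average_combiner(probabilities, weights):
    """
    probabilities: shape (N, M) array with P_i^k(x), the per-class probabilities
                   output by each of the N classifiers for one sample x.
    weights:       length-N vector W_k (e.g. proportional to each classifier's accuracy).
    Returns the averaged probabilities P_avg(x) and the predicted class.
    """
    P = np.asarray(probabilities, dtype=float)
    W = np.asarray(weights, dtype=float)
    p_avg = W @ P / W.sum()                  # weighted average over the classifiers
    return p_avg, int(np.argmax(p_avg))

p_avg, label = weighted_average_combiner(
    probabilities=[[0.7, 0.3], [0.4, 0.6], [0.8, 0.2]],   # three classifiers, two classes
    weights=[0.5, 0.2, 0.3])
print(p_avg, label)
```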

Page 47: Particle Swarm Optimization (PSO)


Artificial Neural Network in Decision Fusion

• Simple and weighted averaging of the classifiers' outputs.
• Weighted averaging is required for imbalanced classifiers, i.e. classifiers with different accuracies and/or different pairwise correlations.

(Diagram: the data is fed to CLASSIFIER 1 … CLASSIFIER n, whose decisions D1 … Dn are fed to an ANN, which outputs the final decision D)
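A hedged sketch of an ANN used as a trainable combiner: the base classifiers' outputs on a validation set become the inputs of a small MLP that learns the final decision. This stacking-style setup assumes scikit-learn and placeholder data; it is not the authors' exact architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# D: outputs D1..Dn of the base classifiers on a validation set (here, scores of 3 classifiers);
# y: the true labels of that validation set. Both are random placeholders for illustration.
D = np.random.rand(500, 3)
y = (D.mean(axis=1) > 0.5).astype(int)

combiner = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
combiner.fit(D, y)                               # the ANN learns how to fuse the decisions

new_decisions = np.array([[0.9, 0.2, 0.7]])      # base-classifier outputs for a new sample
print("fused decision:", combiner.predict(new_decisions)[0])
```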

Page 48: Particle Swarm Optimization (PSO)


Multi Agent Based Decision Fusion

Wooldridge and Jennings have defined an agent, as "a computer system that is situated in some environment and that is capable of autonomous action in this environment in order to meet its design objectives."


Page 49: Particle Swarm Optimization (PSO)


• In addition to being autonomous, an intelligent agent is expected to have the following capabilities

• Reactivity: Intelligent agents are not only able to perceive their environment, but are also able to respond in a timely fashion to changes that occur in it in order to satisfy their design objectives.

• Proactiveness: Intelligent agents are able to exhibit goal-directed behavior by taking the initiative in order to satisfy their design objectives.

• Social ability: Intelligent agents are capable of interacting with other agents (and possibly humans) in order to satisfy their design objectives.


Page 50: Particle Swarm Optimization (PSO)


• Multi-agent systems, unlike centralized systems or client-server systems, are typically distributed systems in which several distinct components, each of which is an independent problem-solving agent, come together to form a coherent whole.

• A MAS is any system that contains:
  – Two or more agents
  – At least one autonomous agent, and
  – At least one relationship between two agents where one satisfies the goal of the other.


Page 51: Particle Swarm Optimization (PSO)


A MAS-based architecture has the following advantages over client-server systems:

• Lower network bandwidth: being a distributed system, smaller volumes of data are exchanged between the agents as compared to a traditional client-server architecture.

• Lesser computation time: in a MAS, agents process data in parallel, thus decreasing the overall computation time.

• No single point of failure: a centralized system without redundancy leads to single-point failures that may collapse the entire system. In a MAS, however, single-point failure is alleviated by its distributed architecture.

• Ease in addition of new resources or interconnections and extensibility.


Page 52: Particle Swarm Optimization (PSO)


Multi Agent Based Decision Fusion


Page 53: Particle Swarm Optimization (PSO)


Conclusion

• Intelligent computational methods are well suited to improving the accuracy of classification.

• The selection of a method should be based on the nature of the problem.
