
Page 1:

Data Mining

Waqas Haider Khan Bangyal

Page 2:

Multi-Layer Perceptron (MLP)

Page 3:

Overview of the Lecture

Why multilayer perceptrons?

Some applications of multilayer perceptrons.

Learning with multilayer perceptrons:

The backpropagation learning algorithm.

Page 4:

History

1943: McCulloch–Pitts "neuron". Started the field.

1962: Rosenblatt's perceptron. Learned its own weight values; convergence proof.

1969: Minsky and Papert's book Perceptrons. Proved the limitations of single-layer perceptron networks.

1982: Hopfield's work on convergence in symmetric networks. Introduced the energy-function concept.

1986: Back-propagation of errors. A method for training multilayer networks.

Page 5:

Recap: Perceptrons

Page 6:

[Figure: Two-dimensional plots of the basic logical operations in the (x1, x2) plane: (a) AND, (b) OR, (c) Exclusive-OR.]

A perceptron can learn the operations AND and OR, but not Exclusive-OR.
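To make the limitation concrete, here is a minimal sketch of the classic perceptron learning rule (the code is illustrative, not from the slides): it converges on AND and OR, but for Exclusive-OR no single line can separate the classes, so training never settles.

```python
import numpy as np

def train_perceptron(X, y, epochs=100, lr=0.1):
    """Classic perceptron rule: w += lr * (target - prediction) * x."""
    w = np.zeros(X.shape[1] + 1)          # weights plus bias term
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = int(np.dot(w[1:], xi) + w[0] > 0)
            update = lr * (target - pred)
            w[1:] += update * xi
            w[0] += update
            errors += int(update != 0)
        if errors == 0:                   # converged: every pattern correct
            return w, True
    return w, False

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
for name, y in [("AND", [0, 0, 0, 1]), ("OR", [0, 1, 1, 1]), ("XOR", [0, 1, 1, 0])]:
    _, converged = train_perceptron(X, np.array(y))
    print(f"{name}: {'learned' if converged else 'not linearly separable'}")
```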

Page 7:

Multilayer neural networks

A multilayer perceptron (MLP) is a feedforward neural network with one or more hidden layers.

The network consists of an input layer of source neurons, at least one middle or hidden layer of computational neurons, and an output layer of computational neurons.

The input signals are propagated in a forward direction on a layer-by-layer basis.
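As an illustrative sketch of this layer-by-layer propagation (the network shape and names here are assumptions, not from the slides), each layer computes a weighted sum of its inputs, adds a bias, and applies an activation function:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    """Propagate an input vector through a list of (W, b) layers."""
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)   # each layer: weighted sum, bias, activation
    return a

rng = np.random.default_rng(0)
# Illustrative 3-4-2 network: 3 inputs, one hidden layer of 4 neurons, 2 outputs.
layers = [(rng.normal(size=(4, 3)), np.zeros(4)),
          (rng.normal(size=(2, 4)), np.zeros(2))]
print(forward(np.array([0.5, -1.0, 2.0]), layers))
```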

Page 8:

Multilayer Perceptron with a Single Hidden Layer

Page 9:

Multilayer Perceptron with Two Hidden Layers

[Figure: A feedforward network with an input layer, a first hidden layer, a second hidden layer, and an output layer. Input signals enter at the input layer and propagate forward; output signals emerge from the output layer.]

Page 10:

What does the middle layer hide?

A hidden layer “hides” its desired output. Neurons in the hidden layer cannot be observed through the input/output behaviour of the network. There is no obvious way to know what the desired output of the hidden layer should be.

Commercial ANNs incorporate three and sometimes four layers, including one or two hidden layers. Each layer can contain from 10 to 1000 neurons. Experimental neural networks may have five or even six layers, including three or four hidden layers, and utilise millions of neurons.

Page 11:

Back-propagation neural network

Learning in a multilayer network proceeds the same way as for a perceptron.

A training set of input patterns is presented to the network.

The network computes its output pattern, and if there is an error, that is, a difference between the actual and desired output patterns, the weights are adjusted to reduce this error.

Page 12:

Back-propagation neural network

In a back-propagation neural network, the learning algorithm has two phases.

Forward Pass: First, a training input pattern is presented to the network input layer. The network propagates the input pattern from layer to layer until the output pattern is generated by the output layer.

Backward Pass: If this pattern is different from the desired output, an error is calculated and then propagated backwards through the network from the output layer to the input layer. The weights are modified as the error is propagated.
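These two phases map directly onto code. The following minimal NumPy sketch is illustrative: the slides do not fix the activation, error measure, or network size, so sigmoid units, squared error, and four hidden neurons are assumptions here. It trains a one-hidden-layer network on Exclusive-OR:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR training set: four input patterns and their desired outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output weights
lr = 0.5                                        # learning rate

for epoch in range(10_000):
    # Forward pass: propagate the input patterns layer by layer.
    H = sigmoid(X @ W1 + b1)     # hidden-layer activations
    Y = sigmoid(H @ W2 + b2)     # output-layer activations

    # Backward pass: propagate the error and adjust the weights.
    delta_out = (T - Y) * Y * (1 - Y)              # output-layer error gradient
    delta_hid = (delta_out @ W2.T) * H * (1 - H)   # error pushed back to hidden layer
    W2 += lr * H.T @ delta_out
    b2 += lr * delta_out.sum(axis=0)
    W1 += lr * X.T @ delta_hid
    b1 += lr * delta_hid.sum(axis=0)

print(np.round(Y, 2))   # should end up close to the XOR targets [[0], [1], [1], [0]]
```

Each epoch performs one forward pass over all four patterns and one backward pass that adjusts every weight against the error gradient.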

Page 13:

Multilayer Perceptrons (MLPs)

Page 14:

Applications of MLPs

Page 15:

Learning with MLPs

Page 16:

Backpropagation

Page 17:

Three-layer back-propagation neural network

[Figure: A three-layer back-propagation network. Input signals x1, x2, …, xn enter input-layer neurons 1, 2, …, i, …, n; weights wij connect them to hidden-layer neurons 1, 2, …, j, …, m; weights wjk connect those to output-layer neurons 1, 2, …, k, …, l, which produce outputs y1, y2, …, yl. Input signals propagate forward; error signals propagate backward.]
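Using the notation of this figure, and assuming sigmoid activations and a learning rate α (the slides themselves defer the worked algorithm to the attached file), the standard back-propagation weight corrections are:

```latex
% Output layer: local gradient for output neuron k, with desired output d_k
\delta_k = y_k \,(1 - y_k)\,(d_k - y_k), \qquad \Delta w_{jk} = \alpha \, y_j \, \delta_k

% Hidden layer: error propagated back from all l output neurons
\delta_j = y_j \,(1 - y_j) \sum_{k=1}^{l} \delta_k \, w_{jk}, \qquad \Delta w_{ij} = \alpha \, x_i \, \delta_j
```

Here y_j denotes the output of hidden neuron j; the hidden-layer gradient δj is built from the output-layer gradients δk fed back through the weights wjk, which is exactly the backward error propagation described above.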

Page 18:

Backpropagation

Page 19:

Types of problems

• The BP algorithm is used in a great variety of problems:

• Time series prediction
• Credit risk assessment
• Pattern recognition
• Speech processing
• Cognitive modelling
• Image processing
• Control

• BP is the standard algorithm against which all other NN algorithms are compared!

Page 20:

Advantages & Disadvantages

Advantages:

• The MLP trained with the BP algorithm is a universal approximator of functions.
• The BP algorithm is computationally efficient.
• The BP algorithm is robust.

Disadvantages:

• Convergence of BP can be very slow, especially on large problems, depending on the training method used.
• The BP algorithm suffers from the problem of local minima.

Page 21:

The BP algorithm and its solved examples are attached in an MS Word file.

Page 22: