Neural Network Tool Box
Khaled A. Al-Utaibi
ICS583: Pattern Recognition, 2009-2010
Outline
• Neuron Model
• Transfer Functions
• Network Architecture
• Neural Network Models
• Feed-Forward Network Training & Simulation
• Example 1: Majority Function
• Example 2: Handwritten Digits Recognition
Neuron Model
[Neuron model diagram: inputs and weights, bias, weighted sum, transfer function, output]
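The computation in the diagram can be sketched directly in MATLAB; the input, weight, and bias values below are illustrative, not taken from the slides:

```matlab
% a single neuron: weighted sum of inputs plus bias,
% passed through a transfer function (logsig here)
p = [1; -2; 0.5];      % input vector (illustrative)
w = [0.3 -0.1 0.8];    % weight row vector (illustrative)
b = 0.2;               % bias (illustrative)
n = w*p + b;           % weighted sum
a = logsig(n);         % neuron output
```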
Transfer Functions
Many transfer functions are included in the Neural Network Toolbox
Three of the most commonly used functions are:
• Hard-Limit Transfer Function
• Linear Transfer Function
• Log-Sigmoid Transfer Function
[Graphs of the hard-limit, linear, and log-sigmoid transfer functions]
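The three functions correspond to the toolbox functions hardlim, purelin, and logsig; a short sketch to compare them over a range of net inputs:

```matlab
% evaluate the three common transfer functions on sample net inputs
n = -5:0.1:5;
a_hard = hardlim(n);   % 0 for n < 0, 1 for n >= 0
a_lin  = purelin(n);   % identity: a = n
a_sig  = logsig(n);    % 1 ./ (1 + exp(-n)), output in (0, 1)
plot(n, a_hard, n, a_lin, n, a_sig);
legend('hardlim', 'purelin', 'logsig');
```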
Network Architecture
Single Layer of Neurons
Multiple Layers of Neurons
Neural Network Models
MATLAB contains several models of neural networks (general / special purpose):
• Feed-forward back-propagation network
• Elman back-propagation network
• Cascade-forward back-propagation network
• Pattern recognition network
• Fitting network
• SOM network (Self-Organizing Map)
Feed-Forward Network
Create feed-forward back-propagation network
Many neural network models in MATLAB are special cases of this model (e.g., the pattern recognition and fitting networks)
Syntax
network_name = newff(arguments)
Arguments

Argument(s)            Description
P                      Input vectors
T                      Target vectors
[S1 S2 … SN-1]         Size of the ith layer
{TF1, TF2, …, TFN}     Transfer function of the ith layer
BTF                    Back-propagation network training function
BLF                    Back-propagation weight/bias learning function
IPF                    Input processing functions
OPF                    Output processing functions
DDF                    Data division function
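Most of these arguments are optional and fall back to toolbox defaults when omitted. A sketch of a fuller call; the data, hidden layer size, and function choices below are illustrative:

```matlab
% illustrative data: 50 samples of 3 inputs, 1 target each
P = rand(3, 50);
T = rand(1, 50);
% two-layer network: hidden layer of 10 tansig neurons,
% purelin output layer, trained with Levenberg-Marquardt
net = newff(P, T, 10, {'tansig', 'purelin'}, 'trainlm');
```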
Output layer size is determined from T.
Input and output processing functions transform the inputs and outputs into a form better suited for network use:
• Re-encode unknown input/output values into numerical values
• Remove redundant input and output vectors
• Normalize input/output values
MATLAB provides several data division functions that divide the input data into three sets (using different strategies and different percentages for each set):
• Training Set
• Validation Set
• Testing Set
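A sketch of configuring the division on a network: 'dividerand' (random division) is the toolbox default, and the ratios below are illustrative:

```matlab
P = rand(2, 40);  T = rand(1, 40);   % illustrative data
net = newff(P, T, 5);
% split samples at random: 70% training, 15% validation, 15% testing
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
```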
Training the Network
Syntax

[ret_vals] = train(arguments)

Arguments

Argument(s)   Description
net           Neural network to be trained
P             Network inputs
T             Network targets
Pi            Initial input delay conditions
Ai            Initial layer delay conditions
Returned Values
Argument(s)   Description
net           Trained neural network
tr            Training record (iterations & performance)
Y             Network outputs
E             Network errors
Pf            Final input delay conditions
Af            Final layer delay conditions
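For a static (non-delay) network only the first three arguments are needed; a minimal sketch with illustrative data:

```matlab
P = rand(3, 50);  T = rand(1, 50);   % illustrative data
net = newff(P, T, 5);
[net, tr] = train(net, P, T);        % Pi and Ai omitted (no delays)
plot(tr.perf);                       % performance at each epoch
```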
Simulating the Network
Syntax

[ret_vals] = sim(arguments)

Arguments

Argument(s)   Description
net           Neural network to be simulated
P             Network inputs
Pi            Initial input delay conditions
Ai            Initial layer delay conditions
T             Network targets
Returned Values
Argument(s)   Description
Y             Network outputs
Pf            Final input delay conditions
Af            Final layer delay conditions
E             Network errors
perf          Network performance
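In the common case only the outputs are needed; the full return list applies when targets are also supplied. A sketch with illustrative data (empty matrices stand in for the unused delay conditions):

```matlab
P = rand(3, 20);  T = rand(1, 20);   % illustrative data
net = newff(P, T, 5);
Y = sim(net, P);                     % outputs only
[Y, Pf, Af, E, perf] = sim(net, P, [], [], T);  % full return list
```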
Example 1: Majority Function
P1  P2  P3  |  T
 0   0   0  |  0
 0   0   1  |  0
 0   1   0  |  0
 0   1   1  |  1
 1   0   0  |  0
 1   0   1  |  1
 1   1   0  |  1
 1   1   1  |  1
% initialize network inputs
inputs = [0 0 0 0 1 1 1 1; ...
          0 0 1 1 0 0 1 1; ...
          0 1 0 1 0 1 0 1];

% initialize network targets
targets = [0 0 0 1 0 1 1 1];
% create a feed-forward network with a hidden
% layer of 3 neurons
net = newff(inputs, targets, 3, {'logsig', 'purelin'});

% train the network
net = train(net, inputs, targets);

% simulate the network
outputs = sim(net, inputs);
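Since the output layer is purelin, the simulated outputs are real-valued; thresholding at 0.5 recovers the 0/1 majority decision. A short check, assuming the `outputs` and `targets` variables from the code above:

```matlab
% threshold the real-valued purelin outputs to 0/1 decisions
predicted = outputs > 0.5;
accuracy  = mean(predicted == targets)   % fraction of correct decisions
```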
Example 2: Handwritten Digits Recognition
Given a set of 1000 samples of different handwritten digits (0, 1, …, 9),
each digit is represented as a binary image of size 28×28 pixels
We would like to use the MATLAB Neural Network Toolbox to design a neural network that recognizes handwritten digits
The pattern recognition network (newpr) is suitable for this purpose
% initialize network inputs (each row is one flattened image sample)
inputs = [0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0; ...
          0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 0 1 1 1 0 0 0 0 0 0];
% initialize network targets
targets = [ 0 1; ...
            1 0; ...
            0 0; ...
            0 0; ...
            0 0; ...
            0 0; ...
            0 0; ...
            0 0; ...
            0 0];
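The slides stop before building the network; a minimal continuation under two assumptions: the target matrix above is stored in a variable named `targets`, and samples are arranged one per column as newpr expects (the `inputs` above store one sample per row, hence the transpose). The hidden layer size of 25 is an arbitrary choice:

```matlab
% create a pattern recognition network with 25 hidden neurons
net = newpr(inputs', targets, 25);
% train and simulate as in Example 1
net = train(net, inputs', targets);
outputs = sim(net, inputs');
% the row with the largest output gives the recognized digit
[maxval, class] = max(outputs);
```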
Questions