
Neural Networks

• Introduction – What is a neural network and what can it do for me?

• Terminology, Design and Topology
• Data Sets – When too much is not a good thing
• The Back Propagation and other training methods
• Network Pruning
• Network Development – Hints and kinks, practical application notes
• Examples of Neural Networks
• Closed loop control techniques
• The Neural Network Laboratory
  – Introduction to DeltaV, HYSYS and the OLE connection
  – Develop a soft sensor application

Neural Network Course Objectives

● Neural Network Fundamentals, Structure, Training, Testing

● How to use data to Train and Test the network
● Develop DeltaV Neural Networks
● Laboratory in DeltaV
● Design Neural Networks in MATLAB and EXCEL

Neural Networks, Heuristic

● Heuristic vs. deterministic model development; neural networks are heuristic models, similar to statistical models

● HYSYS, ASPEN, etc. are deterministic models, built from first principles

For DeltaV Neural Nets

● Understand the basic principles of the DeltaV Neural Network

● Construct a DeltaV NNet and Lab Entry function blocks

● Configure the operator interface for these function blocks

DeltaV Neural

● DeltaV Neural is a complete package
● Network Development
● Training, Testing
● Operator Interface, both Neural Network and Lab Entry

DeltaV Neural Applications

● Critical process measurements available from grab samples, paper properties, food properties, etc.

● Backup or cross check on a measurement by a sampled or continuous analyzer, mass spec, stack gas analyzer
  – Hint: consider the “cost” of a sample analysis; can the neural network “fill in” for skipped samples?

DeltaV Neural Applications

● Virtual sensors; a neural network can be “trained” to calculate the laboratory results, a frequent application, a.k.a. “intelligent sensors” or “soft sensors”, ISTK

● This information is used by operators to anticipate changes in plant operation

● Can develop the network in much less time than first-principles methods

[Process flow diagram: paper machine stock preparation – #1/#2/#3 HD pulpers, broke storage, base/OCC/top raw stock chests, hole and tickler refiners, refiner and machine chests, stuffboxes, primary/secondary screens, and primary through quaternary cleaners feeding the top, mid and bottom ply headboxes]

VIRTUAL STFI SENSOR

Example: paper mill physical property

Predict the lab results

Just what is a neural network anyway?

● Neural networks are computer programs that model the functionality of the brain.

● Multi-layered feed-forward model, with either a single output or multiple outputs

● Trained (called regression in statistics) by back propagation or other algorithms

● Non-linear regression

The Neuron

● There are estimates of as many as 100 billion neurons in the human brain. They are often less than 100 microns in diameter and have as many as 10,000 connections to other neurons.

● Each neuron has an axon that acts as a wire carrying its signal to other neurons. The neuron's input paths, called dendrites, gather information from the axons of other cells. The connection between a dendrite and an axon is called a synapse. The transmission of signals across the synapse is chemical in nature, and the magnitude of the signal depends on the amount of chemicals (called neurotransmitters) released by the axon. Many drugs work by changing those natural chemicals. The synapse, combined with the processing of information in the neuron, is how the brain's memory process functions.

The neuron is a nerve cell

Dendrites

Axon

Synapses

Brain – Neural Network Analogy

 

Brain → Neural Network
Neuron → Cell
Neural firing rate → Activation
Synaptic strength → Connection weight
Synapse → Connection

The cell body contains the nucleus. Dendrites, or inputs, carry impulses to the cell body. The axon conducts impulses away from the cell body. Synapses are the gaps between neurons that act as outputs and connect closely to the dendrites of adjoining neurons. This connection is chemical in nature, and the chemical concentration is how the weight is determined. Within the cell the inputs are weighed, that is, given a weight according to their strength. If the combined weight is strong enough, the neuron “fires”, or an output is triggered.

Brain – Neural Network Analogy

Neural Network,Under the hood

● Layer approach
  – Input Layer
  – Hidden Layer (or Layers)
  – Output Layer

N1/N2/N3 Notation

DeltaV Neural

Three Layered Network

● Only one “hidden” layer
● Only one output
● With enough hidden neurons it can represent any continuous non-linear function
● Can track either a single continuous variable or a sampled variable (Lab Analysis)
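To make the N1/N2/N3 structure concrete, here is a minimal forward-pass sketch in Python/NumPy (Python is used for illustration only, since the course tools are DeltaV, MATLAB and Excel; the 3/4/1 sizes and random weights are made-up):

import numpy as np

def forward(x, W1, b1, W2, b2):
    """Forward pass through a single-hidden-layer, single-output network."""
    h = np.tanh(W1 @ x + b1)    # hidden layer: weighted sum + tanh transfer function
    return W2 @ h + b2          # output layer: weighted sum of hidden activations

# Example "3/4/1" network: 3 inputs, 4 hidden neurons, 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)
print(forward(np.array([0.2, -0.5, 1.0]), W1, b1, W2, b2))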

Neural Networks – Hidden Layer

● 1 Hidden layer sufficient to model a continuous function of several variables

● Exception to the rule: Inverse action requires 2 layers

DeltaV Neural - Output

● DeltaV Neural Network is designed for one output

● Why? The sum of errors will not properly distribute with more than one output

Network Structure

● Inputs and Scaling
● Synaptic Weights
● Neuron, Summation and Transfer Function
● Layer Concept: input, hidden and output
● Output scaling and the Outputs

Network Structure – Input/Output Scaling

● PV ranges must be normalized so each variable has the same input factor for presentation to the network.

● Scaling around zero

Scaled PV = (PV - mean) / σ, where σ is the standard deviation
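A minimal sketch of this scaling in Python/NumPy (illustrative only; DeltaV Neural performs its own scaling internally):

import numpy as np

def scale_pv(pv):
    """Normalize a process variable around zero: (PV - mean) / standard deviation."""
    pv = np.asarray(pv, dtype=float)
    return (pv - pv.mean()) / pv.std()

temps = [151.2, 149.8, 150.5, 152.1, 148.9]   # hypothetical PV history
print(scale_pv(temps))                        # scaled values have mean 0 and std 1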

The Network is a Collection of Neurons and Weights

● Multi-layered feed forward network (MFN)
● Bias Neurons
  – Connected to each neuron except the input layer
  – Provide a constant value or “bias” to the network

The Neuron

Sigmoidal transfer function (tanh): y = 1 - 2/(exp(2x) + 1)

[Plot of the transfer function for x from -2 to 2; the output saturates between -1 and +1]
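A quick Python check that the expression above is identically the hyperbolic tangent:

import numpy as np

x = np.linspace(-2, 2, 9)
y = 1 - 2 / (np.exp(2 * x) + 1)     # the transfer function as written above
print(np.allclose(y, np.tanh(x)))   # True: the expression equals tanh(x)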

DeltaV - Building the Network

Data Collection – The design process is driven by data; good data is essential

Data Preprocessing – Remove outliers and missing points using the 3-sigma rule (sketched after this list)

Variable and time delay selection – Determines which variables to use as well as their time delays
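A minimal sketch of 3-sigma preprocessing in Python/NumPy (the data values are made-up; DeltaV performs its own preprocessing inside the application):

import numpy as np

def preprocess(data):
    """Drop missing points, then drop values more than 3 standard deviations from the mean."""
    data = np.asarray(data, dtype=float)
    data = data[~np.isnan(data)]                    # remove missing points
    mean, sigma = data.mean(), data.std()
    return data[np.abs(data - mean) <= 3 * sigma]   # 3-sigma rule for outliers

raw = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2, 9.7, 10.4,
       9.6, 10.1, 9.9, 10.2, 10.0, 9.8, 80.0, np.nan]   # 80.0 is an obvious outlier
print(preprocess(raw))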

Building the Network, continued

Network Training – Determines the number of hidden neurons and adjusts the weights on the conditioned training set, learning the data

Network Verification – Checks how well the network behaves against actual data

DeltaV - Training the Network

Divides the data into three sets: Training, Testing and Verification

Presents the Training data to the network

For each training pattern, presents the inputs and forward-propagates them through all layers to the output

Compares the output to the target value and adjusts the weights based on the error (back-propagation)

One training pass through all the data is called an epoch
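A sketch of the three-way split in Python (the 60/20/20 proportions are an assumption for illustration; the split DeltaV actually uses is not specified here):

import numpy as np

def split_data(X, y, seed=0):
    """Shuffle and divide the data into training, testing and verification sets."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train, n_test = int(0.6 * len(X)), int(0.2 * len(X))
    train, test, verify = idx[:n_train], idx[n_train:n_train + n_test], idx[n_train + n_test:]
    return (X[train], y[train]), (X[test], y[test]), (X[verify], y[verify])

X = np.arange(10, dtype=float).reshape(10, 1)
y = 2.0 * X[:, 0]
(train, test, verify) = split_data(X, y)
print(len(train[0]), len(test[0]), len(verify[0]))   # 6 2 2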

Training the Network

Present the testing set data to the network after the weights are adjusted in one epoch

Propagate the test signals through the network, to the output

Compare the results; if the error is small enough, training is complete, otherwise repeat the training process
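Sketched in Python with a deliberately simple linear model and plain gradient descent, just to show the epoch/test loop structure (not the DeltaV training method; the data and tolerances are made-up):

import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(60, 2))
y = 0.8 * X[:, 0] - 0.3 * X[:, 1]               # made-up target
X_train, y_train, X_test, y_test = X[:40], y[:40], X[40:], y[40:]

w = rng.normal(size=2)                          # random initial weights
eta, tol = 0.5, 1e-4
for epoch in range(500):
    # adjust the weights on the training set (one epoch)
    w -= eta * X_train.T @ (X_train @ w - y_train) / len(y_train)
    # propagate the test set through the model and check the error
    test_error = np.mean((X_test @ w - y_test) ** 2)
    if test_error < tol:                        # small error: training is complete
        break
print(epoch, test_error)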

Training the Network

● Use “Balanced” Design, many points over the total range of network inputs

● Consider using techniques employed in design of experiments, DOE

● If most of the data is at one process point, the network will learn that point very well but generalize poorly elsewhere

Gradient Descent Learning

● Back propagation adjusts the weights to reduce the error between the network and the training set

● p is the pattern index, i is the output node index, d is the desired output and y is the actual output

E = ½ Σ_p Σ_i (d_pi - y_pi)²
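The same error written out in NumPy (d and y are made-up desired and actual outputs, one row per pattern):

import numpy as np

d = np.array([[0.9], [0.2], [0.5]])      # desired outputs d_pi
y = np.array([[0.8], [0.3], [0.45]])     # actual network outputs y_pi
E = 0.5 * np.sum((d - y) ** 2)           # E = 1/2 * sum over patterns p and outputs i
print(E)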

Gradient Descent Learning

The process is:

0. Set the weights to random values

1. For each sample point in the training set, calculate the output value and the error

2. Calculate the derivative ∂E/∂w

3. Adjust the weights to minimize the error

4. GOTO 1 until the error decreases to the predetermined value or the number of epochs exceeds a predetermined value
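A runnable sketch of these steps for a single tanh neuron (Python/NumPy; the one-neuron model, learning rate and data are illustrative assumptions, not the DeltaV algorithm):

import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(50, 3))               # training inputs, 50 patterns
d = np.tanh(X @ np.array([1.0, -0.5, 0.3]))        # made-up desired outputs

w = rng.normal(size=3)                             # step 0: random initial weights
eta, tol, max_epochs = 0.02, 1e-4, 2000
for epoch in range(max_epochs):
    y = np.tanh(X @ w)                             # step 1: output for every sample...
    E = 0.5 * np.sum((d - y) ** 2)                 #         ...and the total error
    dE_dw = -((d - y) * (1 - y ** 2)) @ X          # step 2: derivative dE/dw
    w -= eta * dE_dw                               # step 3: adjust weights downhill
    if E < tol:                                    # step 4: stop at the error target
        break                                      #         or after max_epochs
print(epoch, E, w)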

Gradient Descent Learning

Two methods for weight update: batch mode and on-line mode.

 

Batch mode: The pattern partial error is summed to obtain the total derivative:

∂E/∂w_ij = Σ_p ∂E_p/∂w_ij
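In code form (Python sketch; per_pattern_grads is a hypothetical array holding ∂E_p/∂w for each pattern):

import numpy as np

# one row per pattern p, one column per weight w_ij (made-up numbers)
per_pattern_grads = np.array([[0.10, -0.02],
                              [0.05,  0.01],
                              [-0.03, 0.04]])
total_grad = per_pattern_grads.sum(axis=0)   # dE/dw = sum over p of dE_p/dw
print(total_grad)                            # batch mode: one weight update per epoch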

Gradient Descent Learning

● On-line mode: Weights are updated based on the partial derivative of the error with respect to each weight, computed for one pattern (one entry of training values) at a time. This is implemented using a momentum term. Momentum adds a portion of the previous weight change to the new change.

Δw_ij(t) = -η · ∂E/∂w_ij + α · Δw_ij(t-1), where η is the learning rate and α is the momentum term, 0 < α < 0.9
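A sketch of the on-line update with momentum (Python; eta = 0.1 and alpha = 0.8 are illustrative values, with alpha inside the stated 0 < α < 0.9 range):

import numpy as np

eta, alpha = 0.1, 0.8                     # learning rate and momentum term

def online_update(w, grad, prev_dw):
    """Apply one pattern's gradient with a momentum term."""
    dw = -eta * grad + alpha * prev_dw    # dw(t) = -eta*dE/dw + alpha*dw(t-1)
    return w + dw, dw

w = np.array([0.5, -0.2])                    # current weights w_ij
prev_dw = np.zeros_like(w)                   # previous change dw_ij(t-1)
grad_one_pattern = np.array([0.05, -0.03])   # hypothetical dE_p/dw for one pattern
w, prev_dw = online_update(w, grad_one_pattern, prev_dw)
print(w, prev_dw)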

Conjugate Gradient Method

● DeltaV Neural uses the Conjugate Gradient Method
● Uses previous gradients
● Adapts the learning rate
● No need to specify a momentum or learning rate factor
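DeltaV's conjugate gradient implementation is internal to the product; as a generic illustration of the idea, SciPy's nonlinear conjugate gradient optimizer can minimize the same sum-of-squared-errors over the weights (the single tanh neuron and data below are assumptions for brevity):

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(50, 3))
d = np.tanh(X @ np.array([1.0, -0.5, 0.3]))      # made-up targets

def error(w):
    """E = 1/2 * sum of squared errors for a single tanh neuron."""
    return 0.5 * np.sum((d - np.tanh(X @ w)) ** 2)

# Nonlinear conjugate gradient: uses previous gradients, no momentum or
# learning-rate factor to specify
result = minimize(error, x0=rng.normal(size=3), method='CG')
print(result.x, result.fun)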

Training Criteria

● Predict, not memorize, the data presented
● DeltaV Neural training software cross-validates the network against the test set to locate the least test error; it will not over- or under-train

Verification of NN Accuracy

● Compare predicted and actual values
● Verification should be done on a set not used for training and testing
● It is very important that the data points in the verification data set are within the range of values used to train and test the network. The training set must contain the minimum and maximum points!
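A minimal range check in Python (an illustrative helper, not part of DeltaV):

import numpy as np

def within_training_range(X_train, X_verify):
    """True if every verification point lies inside the per-input min/max of the training set."""
    lo, hi = X_train.min(axis=0), X_train.max(axis=0)
    return bool(np.all((X_verify >= lo) & (X_verify <= hi)))

X_train = np.array([[0.0, 10.0], [5.0, 20.0], [2.0, 15.0]])
X_verify = np.array([[1.0, 12.0], [4.0, 18.0]])
print(within_training_range(X_train, X_verify))   # True: inside the training range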

DeltaV Neural ‘single-button’ Method

● All the tools to develop a network are built into the DeltaV engineer's workstation software

DeltaV Neural Network

● Function blocks, similar to AI, PID, etc.
● Lab Entry block for entry of analytical data

DeltaV - Input Data for NN Training

● Process data and lab analyses are collected when the NN and Lab Entry blocks are downloaded

● The NN application uses the data collected by the DeltaV historian

● Can input legacy data or data collected by another system as a flat text file

DeltaV NN Block

● Can Access Data anywhere within the control system

● Maximum of 20 references (30 in final release?)
● 3 Modes
  – Auto: Prediction of output based on the input
  – Manual: OUT can be set manually
  – Out of Service: OUT is set to a Bad status, no calculations are executed

Neural Networks for Control

● Using Neural Networks for feedforward and decoupling control interactions; an improvement to conventional PID control

● Using inverted networks for direct control, non-PID method