Introduction to Neural Networks (Undergraduate Course), Lecture 2 of 9


Neural Networks

Dr. Randa Elanwar

Lecture 2

Lecture Content

• Neural network concepts:

– Basic definition.

– Connections.

– Processing elements.


Artificial Neural Network: Structure

• An ANN possesses a large number of processing elements, called nodes/neurons, which operate in parallel.

• Neurons are connected to one another by connection links.

• Each link is associated with a weight, which carries information about the input signal.

• Each neuron has an internal state of its own, which is a function of the inputs the neuron receives; this is called its activation level.


Artificial Neural Network: Neuron Model

[Figure: neuron model. Input units X1, X2, X3 (dendrites) are multiplied by connection weights Wa, Wb, Wc, combined by a summing function (soma), passed through the activation function f(·), and emitted as output Y (axon).]
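To make the figure concrete, here is a minimal Python sketch of that single neuron: the three inputs are multiplied by their connection weights, summed, and passed through an activation function. The step activation and the specific weight values are illustrative assumptions, not part of the slide.

```python
# Minimal sketch of the single neuron in the figure above.
# The weight values and the step activation are illustrative assumptions.

def step(net, threshold=1.0):
    """Threshold (step) activation function."""
    return 1 if net > threshold else 0

def neuron(x1, x2, x3, wa=0.5, wb=0.5, wc=0.5):
    """Weighted sum of the inputs followed by the activation function."""
    net = wa * x1 + wb * x2 + wc * x3   # summing function
    return step(net)                    # computation (activation function)

print(neuron(1, 1, 1))  # -> 1 (net = 1.5 exceeds the threshold)
print(neuron(1, 0, 0))  # -> 0 (net = 0.5 does not)
```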

How are neural networks used to solve problems?

• From experience: examples / training data

• The strength of the connection between two neurons is stored as a weight value for that specific connection.

• Learning the solution to a problem = changing the connection weights


How are neural networks used to solve problems?

• The problem variables are mainly: inputs, weights and outputs

• Examples (training data) represent a solved problem, i.e., both the inputs and outputs are known.

• Thus, using a certain learning algorithm, we can adapt/adjust the NN weights from the known inputs and outputs of the training data.

• For a new problem, we then have the inputs and the weights; therefore, we can easily compute the outputs.


How a NN learns a task: issues to be discussed

- Initializing the weights.

- Use of a learning algorithm.

- Set of training examples.

- Encode the examples as inputs.

- Convert the output into meaningful results.
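As a rough illustration of how these pieces fit together, the sketch below wires them into a minimal training loop for a single neuron. The perceptron-style weight update, the learning rate, and the AND truth table used as training data are assumptions for illustration only; the lecture has not committed to a particular learning algorithm here.

```python
# Illustrative training loop for a single neuron (assumed perceptron-style update).
import random

def step(net, threshold=1.0):
    return 1 if net > threshold else 0

# Set of training examples: inputs encoded as 0/1 with known outputs (AND gate).
training_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# Initializing the weights with small random values.
weights = [random.uniform(-0.5, 0.5) for _ in range(2)]

# Learning algorithm: adjust the weights from the known inputs and outputs.
learning_rate = 0.2
for epoch in range(50):
    for (x1, x2), target in training_data:
        output = step(weights[0] * x1 + weights[1] * x2)
        error = target - output
        weights[0] += learning_rate * error * x1
        weights[1] += learning_rate * error * x2

# Convert the output into a meaningful result for a new input.
print("AND(1, 1) =", step(weights[0] * 1 + weights[1] * 1))  # expected: 1
```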


Linear Problems

• The simplest type of problem is the linear problem.

• Why 'linear'? Because we can model the problem by a straight-line equation (ax + by + c = z), or equivalently by the neuron equation below.

• Example: the logic problems AND, OR, and NOT are linear. We know their truth tables, so we have examples, and we can model each operation using a single neuron.


$out = \sum_{i=1}^{n} w_i \cdot in_i + b$

$out = w_1 \cdot in_1 + w_2 \cdot in_2 + w_3 \cdot in_3 + \dots + b$

$OUT = W \cdot X + b$

Linear Problems

• Example: AND (x1,x2), f(net) = 1 if net>1 and 0 otherwise

• Check the truth table: y = f(x1+x2)


x1  x2  y
0   0   0
0   1   0
1   0   0
1   1   1

[Diagram: inputs x1 and x2 connected to output neuron y with weights 1 and 1.]
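A quick way to check this (and the gates on the following slides) is to evaluate the slide's rule directly. The sketch below is a minimal Python check of the AND neuron, assuming nothing beyond y = f(x1 + x2) with f(net) = 1 if net > 1 and 0 otherwise.

```python
# Verify the AND neuron: y = f(x1 + x2) with f(net) = 1 if net > 1 else 0.
def f(net):
    return 1 if net > 1 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, f(1 * x1 + 1 * x2))
# Prints the AND truth table: only (1, 1) yields 1.
```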

Linear Problems

• Example: OR(x1,x2), f(net) = 1 if net>1 and 0 otherwise

• Check the truth table: y = f(2·x1 + 2·x2)


x1  x2  y
0   0   0
0   1   1
1   0   1
1   1   1

[Diagram: inputs x1 and x2 connected to output neuron y with weights 2 and 2.]

Linear Problems

• Example: NOT(x1), f(net) = 1 if net>1 and 0 otherwise

• Check the truth table: y = f(-1·x1 + 2)


x1  y
0   1
1   0

[Diagram: input x1 connected to output neuron y with weight -1, plus a bias input fixed at 1 with weight 2.]

Linear Problems

• Example: AND (x1,NOT(x2)), f(net) = 1 if net>1 and 0 otherwise

• Check the truth table: y = f(2·x1 - x2)


x1  x2  y
0   0   0
0   1   0
1   0   1
1   1   0

[Diagram: inputs x1 and x2 connected to output neuron y with weights 2 and -1.]
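The OR, NOT, and AND(x1, NOT(x2)) neurons can be checked in the same way with the same threshold function f; the sketch below simply plugs in the weights from the slides.

```python
# Check OR, NOT, and AND(x1, NOT(x2)) with f(net) = 1 if net > 1 else 0.
def f(net):
    return 1 if net > 1 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2,
              f(2 * x1 + 2 * x2),    # OR
              f(2 * x1 - 1 * x2))    # AND(x1, NOT(x2))

for x1 in (0, 1):
    print(x1, f(-1 * x1 + 2 * 1))    # NOT (bias input fixed at 1, weight 2)
```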


The McCulloch-Pitts Neuron

• This vastly simplified model of real neurons is also known as a Threshold Logic Unit.

– A set of connections brings in activations from other neurons.

– A processing unit sums the inputs and then applies a non-linear activation function (i.e., a squashing/transfer/threshold function).

– An output line transmits the result to other neurons.

$out = f\left(\sum_{i=1}^{n} w_i \cdot in_i + b\right)$, or in vector form $OUT = f(W \cdot X + b)$

[Diagram: inputs weighted by w1, w2, ..., wn plus a bias b, summed and passed through the activation function f(·).]
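As a concrete illustration of the model, here is a minimal Python sketch of a McCulloch-Pitts-style unit computing OUT = f(W·X + b); the particular weights and threshold used in the example calls are assumptions chosen for demonstration.

```python
# McCulloch-Pitts-style unit: binary inputs, weighted sum plus bias,
# output 1 when the net input reaches the threshold T.
def mp_neuron(inputs, weights, bias=0, T=0):
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if net >= T else 0

# Example (illustrative weights): behaves like a 2-input AND with T = 2.
print(mp_neuron([1, 1], weights=[1, 1], T=2))  # -> 1
print(mp_neuron([1, 0], weights=[1, 1], T=2))  # -> 0
```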

McCulloch-Pitts Neuron Model


Features of McCulloch-Pitts model

• Allows binary 0,1 states only

• Operates under a discrete-time assumption

• Weights and the neurons' thresholds are fixed in the model, and there is no interaction among the network neurons.

• Just a primitive model


McCulloch-Pitts Neuron Model

• When T = 1 and w = 1: the input passes as is, so if the input is 1 then o = 1, and if the input is 0 then o = 0 (buffer). Works as a '1' detector.

• When T = 1 and w = -1: the input sign is inverted before thresholding, so if the input is 0 then o = 0, and if the input is 1 then o = 0. Useless (the output is always 0).


McCulloch-Pitts Neuron Model

• When T = 0 and w = 1: the input passes as is, but the net input never falls below the threshold, so if the input is 0 then o = 1, and if the input is 1 then o = 1. Useless (the output is always 1).

• When T = 0 and w = -1: the input is inverted, so if the input is 1 then o = 0, and if the input is 0 then o = 1 (inverter). Works as a null ('0') detector.
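The four cases can be tabulated directly; the small sketch below uses a single-input thresholded unit (output 1 when w·x ≥ T) to reproduce the behaviour described above.

```python
# Enumerate the four (T, w) combinations for a single-input unit
# that outputs 1 when w * x >= T, as described on the slides.
def unit(x, w, T):
    return 1 if w * x >= T else 0

for T, w in [(1, 1), (1, -1), (0, 1), (0, -1)]:
    outs = [unit(x, w, T) for x in (0, 1)]
    print(f"T={T}, w={w}: input 0 -> {outs[0]}, input 1 -> {outs[1]}")
# T=1, w=1 : buffer ('1' detector)    T=1, w=-1: always 0 (useless)
# T=0, w=1 : always 1 (useless)       T=0, w=-1: inverter ('0' detector)
```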


McCulloch-Pitts NOR


• Can be implemented using an OR gate design followed by an inverter.

• We need a '1' detector, thus the first layer is a (T = 1) node preceded by +1 weights: zeros stay 0 and ones stay 1.

• We need an inverter in the second layer: a (T = 0) node preceded by a -1 weight.

• Check the truth table.

McCulloch-Pitts NAND


• Can be implemented using an inverter design followed by an OR gate.

• We need inverters in the first layer: (T = 0) nodes preceded by -1 weights; zeros become 1 and ones become 0.

• We need a '1' detector in the second layer: a (T = 1) node preceded by +1 weights; zeros stay 0 and ones stay 1.
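Both two-layer constructions can be verified with the same thresholded unit; the sketch below follows the layer designs described above (OR then an inverter for NOR, inverters then an OR for NAND).

```python
# Two-layer McCulloch-Pitts constructions for NOR and NAND, using units
# that output 1 when the weighted sum of their inputs reaches T.
def unit(inputs, weights, T):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= T else 0

def nor(x1, x2):
    or_out = unit([x1, x2], [1, 1], T=1)   # layer 1: OR ('1' detector)
    return unit([or_out], [-1], T=0)       # layer 2: inverter

def nand(x1, x2):
    n1 = unit([x1], [-1], T=0)             # layer 1: invert x1
    n2 = unit([x2], [-1], T=0)             # layer 1: invert x2
    return unit([n1, n2], [1, 1], T=1)     # layer 2: OR ('1' detector)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, nor(x1, x2), nand(x1, x2))
# NOR is 1 only for (0, 0); NAND is 0 only for (1, 1).
```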

General symbol of a neuron, consisting of a processing node and synaptic connections.


Neuron Modeling for ANN


f(net) is referred to as the activation function. Its domain is the set of activation values net (not a single fixed threshold value).

net is the scalar product of the weight vector and the input vector.

The neuron, as a processing node, performs the summation of its weighted inputs.
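Putting these remarks together, and using the same symbols as the equations above, the node's operation can be written compactly as:

$net = W \cdot X = \sum_{i=1}^{n} w_i \cdot x_i, \qquad o = f(net)$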
