Introduction to Neural Networks (undergraduate course) Lecture 2 of 9
Transcript of Introduction to Neural Networks (undergraduate course) Lecture 2 of 9
Neural Networks
Dr. Randa Elanwar
Lecture 2
Lecture Content
• Neural network concepts:
– Basic definition.
– Connections.
– Processing elements.
Neural Networks, Dr. Randa Elanwar
Artificial Neural Network: Structure
• An ANN possesses a large number of processing elements, called nodes/neurons, which operate in parallel.
• Neurons are connected to each other by connection links.
• Each link is associated with a weight, which carries information about the input signal.
• Each neuron has an internal state of its own, which is a function of the inputs that neuron receives; this state is called its activation level.
Artificial Neural Network: Neuron Model
[Diagram: neuron model. Input units X1, X2, X3 (dendrites) are connected through connection weights Wa, Wb, Wc to a summing function (soma), whose result passes through the activation function f() to produce the output Y (axon).]
How are neural networks used in solving problems?
• From experience: examples / training data
• The strength of the connection between neurons is stored as a weight value for that specific connection.
• Learning the solution to a problem = changing the connection weights
How are neural networks used in solving problems?
• The problem variables are mainly: inputs, weights, and outputs.
• Examples (training data) represent a solved problem, i.e., both the inputs and outputs are known.
• Thus, using a learning algorithm, we can adapt/adjust the NN weights from the known inputs and outputs of the training data.
• For a new problem, we then have the inputs and the weights, so we can easily compute the outputs.
How an NN learns a task: Issues to be discussed
– Initializing the weights.
– Using a learning algorithm.
– Choosing a set of training examples.
– Encoding the examples as inputs.
– Converting outputs into meaningful results.
Linear Problems
• The simplest type of problems are linear problems.
• Why ‘linear’? Because we can model the problem by a straight-line equation (ax + by + c = z), i.e., a weighted sum of the inputs.
• Example: the logic problems AND, OR, and NOT are linear. We know their truth tables, so we have examples and can model each operation using a single neuron.
$$\text{out} = \sum_{i=1}^{k} w_i \,\text{in}_i + b$$

$$\text{out} = w_1\,\text{in}_1 + w_2\,\text{in}_2 + w_3\,\text{in}_3 + \dots + b$$

$$\text{OUT} = W \cdot X + b$$
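The linear neuron equation can be sketched in a few lines of code; the weight and bias values below are illustrative, not taken from the slides:

```python
# Minimal sketch of the linear neuron: out = w1*in1 + ... + wk*ink + b.
# The weight and bias values used below are hypothetical examples.
def linear_neuron(inputs, weights, bias):
    """Weighted sum of the inputs plus a bias (OUT = W.X + b)."""
    return sum(w * x for w, x in zip(weights, inputs)) + bias

print(linear_neuron([1, 0], [2, 2], 0))  # 2*1 + 2*0 + 0 = 2
```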
Linear Problems
• Example: AND (x1,x2), f(net) = 1 if net>1 and 0 otherwise
• Check the truth table: y = f(x1+x2)
| x1 | x2 | y |
|----|----|---|
| 0  | 0  | 0 |
| 0  | 1  | 0 |
| 1  | 0  | 0 |
| 1  | 1  | 1 |

[Diagram: x1 and x2 feed the output node y through weights 1 and 1]
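The AND example can be checked directly in code, using the slides' threshold activation f(net) = 1 if net > 1, else 0:

```python
def f(net):
    """Threshold activation from the slides: 1 if net > 1, else 0."""
    return 1 if net > 1 else 0

def AND(x1, x2):
    """AND modeled as a single neuron: y = f(1*x1 + 1*x2)."""
    return f(1 * x1 + 1 * x2)

# Verify against the truth table
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, AND(x1, x2))  # y = 1 only for (1, 1)
```

The same helper verifies the OR neuron with weights (2, 2) and the NOT neuron with f(-1*x1 + 2) from the following slides.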
Linear Problems
• Example: OR(x1,x2), f(net) = 1 if net>1 and 0 otherwise
• Check the truth table: y = f(2·x1 + 2·x2)
| x1 | x2 | y |
|----|----|---|
| 0  | 0  | 0 |
| 0  | 1  | 1 |
| 1  | 0  | 1 |
| 1  | 1  | 1 |

[Diagram: x1 and x2 feed the output node y through weights 2 and 2]
Linear Problems
• Example: NOT(x1), f(net) = 1 if net>1 and 0 otherwise
• Check the truth table: y = f(-1·x1 + 2)
| x1 | y |
|----|---|
| 0  | 1 |
| 1  | 0 |

[Diagram: x1 feeds the output node y through weight -1; a bias input of 1 feeds y through weight 2]
Linear Problems
• Example: AND (x1,NOT(x2)), f(net) = 1 if net>1 and 0 otherwise
• Check the truth table: y = f(2·x1 - x2)
| x1 | x2 | y |
|----|----|---|
| 0  | 0  | 0 |
| 0  | 1  | 0 |
| 1  | 0  | 1 |
| 1  | 1  | 0 |

[Diagram: x1 and x2 feed the output node y through weights 2 and -1]
The McCulloch-Pitts Neuron
• This vastly simplified model of real neurons is also known as a Threshold Logic Unit:
– A set of connections brings in activations from other neurons.
– A processing unit sums the inputs, then applies a non-linear activation function (also called a squashing/transfer/threshold function).
– An output line transmits the result to other neurons.
$$\text{out} = f\!\left(\sum_{i=1}^{n} w_i \,\text{in}_i + b\right)$$

[Diagram: inputs weighted by w1, w2, …, wn, plus bias b, are summed and passed through f(.)]

$$\text{OUT} = f(W \cdot X + b)$$
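The Threshold Logic Unit equation out = f(Σ wᵢ·inᵢ + b) can be sketched as follows; the step function here matches the threshold used in the gate examples:

```python
def mp_neuron(inputs, weights, bias, f):
    """McCulloch-Pitts style unit: weighted sum plus bias, passed through f."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return f(net)

step = lambda net: 1 if net > 1 else 0  # activation from the earlier slides

print(mp_neuron([1, 1], [1, 1], 0, step))  # AND(1, 1) -> 1
```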
McCulloch-Pitts Neuron Model
Features of McCulloch-Pitts model
• Allows binary (0, 1) states only.
• Operates under a discrete-time assumption.
• Weights and the neurons’ thresholds are fixed in the model, and there is no interaction among network neurons.
• It is just a primitive model.
McCulloch-Pitts Neuron Model
• When T = 1 and w = 1:
– The input passes as is.
– If the input is 1 then o = 1; if the input is 0 then o = 0 (buffer).
– Works as a ‘1’ detector.
• When T = 1 and w = -1:
– The weighted input is negated, so it can never reach the threshold.
– If the input is 0 then o = 0; if the input is 1 then o = 0.
– Useless (the output is always 0).
McCulloch-Pitts Neuron Model
• When T = 0 and w = 1:
– The weighted input always reaches the threshold.
– If the input is 0 then o = 1; if the input is 1 then o = 1.
– Useless (the output is always 1).
• When T = 0 and w = -1:
– The input is inverted.
– If the input is 1 then o = 0; if the input is 0 then o = 1 (inverter).
– Works as a null (‘0’) detector.
McCulloch-Pitts NOR
• NOR can be implemented as an OR gate design followed by an inverter.
• We need a ‘1’ detector, so the first layer is a (T = 1) node preceded by +1 weights: zeros stay 0 and ones stay 1.
• We need an inverter in the second layer: a (T = 0) node preceded by a -1 weight.
• Check the truth table.
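A sketch of the two-layer NOR described above, with the firing rule net >= T assumed as in the single-unit cases:

```python
def fire(net, T):
    """Assumed firing rule: output 1 when net reaches threshold T."""
    return 1 if net >= T else 0

def NOR(x1, x2):
    h = fire(1 * x1 + 1 * x2, T=1)  # layer 1: OR ('1' detector, +1 weights)
    return fire(-1 * h, T=0)        # layer 2: inverter (-1 weight, T=0)

print([NOR(0, 0), NOR(0, 1), NOR(1, 0), NOR(1, 1)])  # [1, 0, 0, 0]
```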
McCulloch-Pitts NAND
• NAND can be implemented as inverters followed by an OR gate (by De Morgan: NAND(x1, x2) = OR(NOT x1, NOT x2)).
• We need inverters in the first layer: (T = 0) nodes preceded by -1 weights. Zeros become 1 and ones become 0.
• We need a ‘1’ detector in the second layer: a (T = 1) node preceded by +1 weights. Zeros stay 0 and ones stay 1.
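The same sketch style for NAND: inverters in the first layer, then the ‘1’ detector (OR) in the second (firing rule net >= T assumed):

```python
def fire(net, T):
    """Assumed firing rule: output 1 when net reaches threshold T."""
    return 1 if net >= T else 0

def NAND(x1, x2):
    n1 = fire(-1 * x1, T=0)            # layer 1: invert x1
    n2 = fire(-1 * x2, T=0)            # layer 1: invert x2
    return fire(1 * n1 + 1 * n2, T=1)  # layer 2: OR the inverted inputs

print([NAND(0, 0), NAND(0, 1), NAND(1, 0), NAND(1, 1)])  # [1, 1, 1, 0]
```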
General symbol of a neuron, consisting of a processing node and synaptic connections
Neuron Modeling for ANN
• f is referred to as the activation function. Its domain is the set of activation values net (not a single fixed threshold value).
• net is the scalar product of the weight vector and the input vector.
• The neuron, as a processing node, performs the summation of its weighted inputs.