KULIAH II JST: BASIC CONCEPTS Amer Sharif, S.Si, M.Kom.
Transcript of KULIAH II JST: BASIC CONCEPTS Amer Sharif, S.Si, M.Kom.
KULIAH II JST: BASIC CONCEPTS
Amer Sharif, S.Si, M.Kom
INTRODUCTION REVIEW
Neural network definition: a massively parallel distributed processor made up of simple processing units (neurons) that:
- Stores experiential knowledge and makes it available for use
- Acquires knowledge from the environment through a learning process
- Stores knowledge as interneuron connection strengths (synaptic weights)
INTRODUCTION REVIEW
Benefits:
- Nonlinearity
- Input-output mapping
- Adaptivity
- Evidential response
- Contextual information
- Fault tolerance / graceful degradation
- VLSI implementability
- Uniformity of analysis and design
NEURON MODELLING
Basic elements of a neuron:
- A set of synapses, or connecting links: each synapse is characterized by its weight; signal xj at synapse j connected to neuron k is multiplied by synaptic weight wkj; the bias is bk
- An adder for summing the input signals
- An activation function for limiting the output amplitude of the neuron
NEURON MODELLING
Block diagram of a nonlinear neuron:
[Figure: input signals x1, x2, ..., xm pass through synaptic weights wk1, wk2, ..., wkm into a summing junction (with bias bk) producing vk, which the activation function φ(·) maps to the output yk]

uk = Σ (j = 1..m) wkj xj
yk = φ(uk + bk)
NEURON MODELLING
Notation:
- x1, x2, ..., xm are the input signals
- wk1, wk2, ..., wkm are the synaptic weights of neuron k
- uk is the linear combiner output
- bk is the bias
- φ(·) is the activation function
- yk is the output signal of the neuron
NEURON MODELLING
If vk = uk + bk, and the bias is substituted for a synapse with fixed input x0 = +1 and weight wk0 = bk, then

vk = Σ (j = 0..m) wkj xj
yk = φ(vk)
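As a minimal sketch (not part of the slides), the neuron equations above can be checked in Python; the logistic function is assumed here as one possible choice of φ:

```python
import math

def neuron_output(x, w, b, phi):
    """y_k = phi(u_k + b_k), where u_k = sum_j w_kj * x_j."""
    u = sum(wj * xj for wj, xj in zip(w, x))
    return phi(u + b)

def neuron_output_folded(x, w, b, phi):
    """Same neuron with the bias folded in as weight w_k0 = b_k on fixed input x0 = +1."""
    v = sum(wj * xj for wj, xj in zip([b] + w, [1.0] + x))
    return phi(v)

logistic = lambda v: 1.0 / (1.0 + math.exp(-v))  # assumed activation function

x, w, b = [0.5, -1.0, 2.0], [0.4, 0.3, -0.1], 0.2
assert abs(neuron_output(x, w, b, logistic) - neuron_output_folded(x, w, b, logistic)) < 1e-12
```

Both forms give the same output, which is exactly what the bias-as-synapse substitution requires.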
NEURON MODELLING
Modified block diagram of a nonlinear neuron:
[Figure: fixed input x0 = +1 with weight wk0 = bk (bias) joins input signals x1, x2, ..., xm with weights wk1, ..., wkm at the summing junction producing vk; the activation function φ(·) gives the output yk]

vk = Σ (j = 0..m) wkj xj
yk = φ(vk)
ACTIVATION FUNCTIONS
Activation function types:
Threshold function:

φ(v) = 1 if v ≥ 0
φ(v) = 0 if v < 0

with vk = Σ (j = 1..m) wkj xj + bk

A neuron with this activation function is also known as the McCulloch-Pitts model.
[Plot: φ(v) versus v; the output steps from 0 to 1 at v = 0]
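As an illustration (the gate weights and biases below are illustrative choices, not from the slides), a McCulloch-Pitts neuron with this threshold activation can realize simple logic gates:

```python
def mcculloch_pitts(x, w, b):
    """Threshold neuron: y = 1 if v >= 0 else 0, with v = sum_j w_j * x_j + b."""
    v = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if v >= 0 else 0

# Illustrative weight/bias choices realizing AND and OR
AND = lambda x1, x2: mcculloch_pitts([x1, x2], [1, 1], -2)  # fires only if both inputs are 1
OR  = lambda x1, x2: mcculloch_pitts([x1, x2], [1, 1], -1)  # fires if at least one input is 1

assert [AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [OR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]
```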
ACTIVATION FUNCTIONS
Piecewise-linear function:

φ(v) = 1 if v ≥ +1/2
φ(v) = v if +1/2 > v > −1/2
φ(v) = 0 if v ≤ −1/2

[Plot: φ(v) versus v; flat at 0, a linear ramp of unity gain through the origin, flat at 1]
ACTIVATION FUNCTIONS
Sigmoid function: S-shaped. A sample is the logistic function:

φ(v) = 1 / (1 + exp(−av))

a is the slope parameter: the larger a, the steeper the function. Differentiable everywhere.
[Plot: φ(v) versus v for −10 ≤ v ≤ 10; the curves steepen with increasing a]
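The three activation function types above can be written out directly; a short Python sketch, reproducing the formulas as given on the slides:

```python
import math

def threshold(v):
    """Threshold (McCulloch-Pitts): 1 if v >= 0, else 0."""
    return 1.0 if v >= 0 else 0.0

def piecewise_linear(v):
    """1 if v >= +1/2; v (unity gain) if +1/2 > v > -1/2; 0 if v <= -1/2."""
    if v >= 0.5:
        return 1.0
    if v > -0.5:
        return v
    return 0.0

def logistic(v, a=1.0):
    """Logistic sigmoid with slope parameter a: larger a gives a steeper curve."""
    return 1.0 / (1.0 + math.exp(-a * v))

assert threshold(-0.01) == 0.0 and threshold(0.0) == 1.0
assert piecewise_linear(0.25) == 0.25 and piecewise_linear(2.0) == 1.0
assert abs(logistic(0.0) - 0.5) < 1e-12             # for any a, phi(0) = 1/2
assert logistic(1.0, a=5.0) > logistic(1.0, a=1.0)  # steeper for larger a
```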
NEURAL NETWORKS AS DIRECTED GRAPHS
Neural networks may be represented as directed graphs:
- Synaptic links (linear I/O): yk = wkj xj
- Activation links (nonlinear I/O): yk = φ(xj)
- Synaptic convergence (fan-in): yk = yi + yj
- Synaptic divergence (fan-out): signal xj is replicated along each outgoing link
NEURAL NETWORKS AS DIRECTED GRAPHS
Architectural graph: a partially complete directed graph.
[Figure: inputs x0 = +1, x1, x2, ..., xm feeding a single neuron with output yk]
FEEDBACK
- The output of a system influences some of the input applied to the system
- One or more closed paths of signal transmission exist around the system
- Feedback plays an important role in recurrent networks
FEEDBACK
Sample single-loop feedback system:
[Figure: input xj(n) enters an adder producing x'j(n), which passes through the fixed weight w to give the output yk(n); yk(n) is fed back through the unit-delay operator z⁻¹ to the adder]

The output signal yk(n) is an infinite weighted summation of present and past samples of the input signal xj(n):

yk(n) = Σ (l = 0..∞) w^(l+1) xj(n − l)

where w is the fixed weight, z⁻¹ is the unit-delay operator, and xj(n − l) is the sample of the input signal delayed by l time units.
FEEDBACK
Dynamic system behavior is determined by the weight w.
|w| < 1: the system is exponentially convergent/stable.
- The system possesses infinite memory: the output depends on input samples extending into the infinite past
- Memory is fading: the influence of past samples is reduced exponentially with time n
[Plot: yk(n) versus n, starting at w·xj(0) and converging for |w| < 1]
FEEDBACK
w = 1: the system is linearly divergent.
|w| > 1: the system is exponentially divergent.
[Plots: yk(n) versus n, starting at w·xj(0), growing linearly for w = 1 and exponentially for |w| > 1]
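The three regimes can be checked numerically. The sketch below (illustrative, not from the slides) simulates the single-loop system as the recurrence yk(n) = w·(xj(n) + yk(n−1)), which unrolls to the weighted summation given earlier; a constant (step) input is assumed:

```python
def feedback_response(w, x):
    """Simulate the single-loop feedback system y_k(n) = w * (x_j(n) + y_k(n-1));
    unrolling the recurrence gives y_k(n) = sum_{l=0..n} w**(l+1) * x_j(n-l)."""
    y_prev, ys = 0.0, []
    for xn in x:
        y_prev = w * (xn + y_prev)
        ys.append(y_prev)
    return ys

step = [1.0] * 10  # constant input applied from n = 0 onwards

stable = feedback_response(0.5, step)   # |w| < 1: converges toward w/(1-w) = 1.0
linear = feedback_response(1.0, step)   # w = 1: grows linearly, 1, 2, 3, ...
diverge = feedback_response(2.0, step)  # |w| > 1: grows exponentially, 2, 6, 14, ...

assert abs(stable[-1] - 1.0) < 1e-2
assert linear == [float(n + 1) for n in range(10)]
assert diverge[:4] == [2.0, 6.0, 14.0, 30.0]
```

For |w| < 1 the geometric series converges and the memory of past inputs fades; at w = 1 the system degenerates to linear growth; beyond that it diverges exponentially.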
NETWORK ARCHITECTURES
Single-layer feedforward networks:
[Figure: input layer of source nodes projecting onto an output layer of neurons]
- Neurons are organized in layers
- "Single-layer" refers to the output layer of neurons
- Source nodes project onto the output neurons, but not vice versa
- The network is feedforward, or acyclic
NETWORK ARCHITECTURES
Multilayer feedforward networks:
- One or more hidden layers
- Hidden neurons enable the extraction of higher-order statistics
- The network acquires a global perspective due to the extra set of synaptic connections and neural interactions
[Figure: input layer of source nodes → layer of hidden neurons → layer of output neurons]
7-4-2 fully connected network:
• 7 source nodes
• 4 hidden neurons
• 2 output neurons
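A forward pass through the 7-4-2 fully connected network above can be sketched as follows (random illustrative weights; the logistic activation is an assumption, not specified on the slide):

```python
import math
import random

def layer_forward(x, weights, biases, phi):
    """One fully connected layer: neuron k outputs phi(sum_j w[k][j]*x[j] + b[k])."""
    return [phi(sum(wkj * xj for wkj, xj in zip(wk, x)) + bk)
            for wk, bk in zip(weights, biases)]

logistic = lambda v: 1.0 / (1.0 + math.exp(-v))

random.seed(0)
def random_layer(n_out, n_in):
    """Illustrative random weights in [-1, 1] for an n_in -> n_out layer."""
    return ([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [random.uniform(-1, 1) for _ in range(n_out)])

w_hid, b_hid = random_layer(4, 7)  # 7 source nodes -> 4 hidden neurons
w_out, b_out = random_layer(2, 4)  # 4 hidden neurons -> 2 output neurons

x = [0.1 * j for j in range(7)]    # the 7 input signals
hidden = layer_forward(x, w_hid, b_hid, logistic)
output = layer_forward(hidden, w_out, b_out, logistic)
assert len(hidden) == 4 and len(output) == 2
assert all(0.0 < y < 1.0 for y in output)  # logistic outputs lie in (0, 1)
```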
NETWORK ARCHITECTURES
Recurrent networks:
- At least one feedback loop
- Feedback loops affect the learning capability and performance of the network
[Figure: outputs fed back to the inputs through unit-delay operators z⁻¹]
KNOWLEDGE REPRESENTATION
Definition of knowledge: knowledge refers to stored information or models used by a person or a machine to interpret, predict, and appropriately respond to the outside world.
Issues:
- What information is actually made explicit
- How information is physically encoded for subsequent use
Knowledge representation is goal-directed: a good solution depends on a good representation of knowledge.
KNOWLEDGE REPRESENTATION
Challenges faced by neural networks:
- Learn a model of the world/environment
- Maintain the model consistent with the real world so as to achieve the desired goals
A neural network may learn from a set of observations in the form of input-output pairs (training data/training sample): the input is an input signal and the output is the corresponding desired response.
KNOWLEDGE REPRESENTATION
Handwritten digit recognition problem:
- Input signal: one of 10 images of digits
- Goal: identify the image presented to the network as input
Design steps:
- Select an appropriate architecture
- Train the network with a subset of examples (learning phase)
- Test the network by presenting digit images not seen before, then compare the network's response with the actual identity of the digit image presented (generalization phase)
KNOWLEDGE REPRESENTATION
Difference from a classical pattern classifier:
Classical pattern-classifier design steps:
- Formulate a mathematical model of the problem
- Validate the model with real data
- Build the classifier based on the model
Neural network design, by contrast, is based on real-life data, and the data may "speak for itself": the neural network not only provides a model of the environment but also processes the information.
ARTIFICIAL INTELLIGENCE AND NEURAL NETWORKS
AI systems must be able to:
- Store knowledge
- Use stored knowledge to solve problems
- Acquire new knowledge through experience
AI components:
Representation:
- Knowledge is represented in a language of symbolic structures
- Symbolic representation makes it relatively easy for human users to understand
ARTIFICIAL INTELLIGENCE AND NEURAL NETWORKS
Reasoning:
- Able to express and solve a broad range of problems
- Able to make explicit the implicit information known to it
- Has a control mechanism that determines which operation to apply to a particular problem, when a solution has been obtained, and when further work on the problem should be terminated
Rules, data, and control: rules operate on data; control operates on rules.
The Travelling Salesman Problem:
- Data: the possible tours and their costs
- Rules: ways to go from city to city
- Control: which rules to apply and when
ARTIFICIAL INTELLIGENCE AND NEURAL NETWORKS
Learning:
- Inductive learning: determine rules from raw data and experience
- Deductive learning: use rules to determine specific facts
[Figure: Environment → Learning element ↔ Knowledge base → Performance element]
ARTIFICIAL INTELLIGENCE AND NEURAL NETWORKS

| Parameter | Artificial Intelligence | Neural Networks |
| --- | --- | --- |
| Level of explanation | Symbolic representation with sequential processing | Parallel distributed processing (PDP) |
| Processing style | Sequential | Parallel |
| Representational structure | Quasi-linguistic structure | Poor |
| Summary | Formal manipulation of algorithms and data representations in a top-down fashion | Parallel distributed processing with a natural ability to learn, in a bottom-up fashion |