Artificial Neural Networks (ANN)

X1 X2 X3 | Y
 1  0  0 | 0
 1  0  1 | 1
 1  1  0 | 1
 1  1  1 | 1
 0  0  1 | 0
 0  1  0 | 0
 0  1  1 | 1
 0  0  0 | 0

[Figure: black box with input nodes X1, X2, X3 and output node Y]

Output Y is 1 if at least two of the three inputs are equal to 1.

Artificial Neural Networks (ANN)


[Figure: the same black box realized as a perceptron: input nodes X1, X2, X3 connect to the output node Y, each link with weight 0.3, and the output node uses threshold t = 0.4]

Y = I(0.3 X1 + 0.3 X2 + 0.3 X3 - 0.4 > 0), where I(z) = 1 if z is true and 0 otherwise
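This weighted-threshold rule can be checked against the truth table directly. A minimal sketch in Python (the function name and the use of itertools are my own, not from the slides):

    from itertools import product

    def perceptron_y(x1, x2, x3):
        # Weighted sum with the slide's weights (0.3 each) and threshold t = 0.4
        return 1 if 0.3 * x1 + 0.3 * x2 + 0.3 * x3 - 0.4 > 0 else 0

    # Check: the output is 1 exactly when at least two of the three inputs are 1
    for x1, x2, x3 in product([0, 1], repeat=3):
        assert perceptron_y(x1, x2, x3) == (1 if x1 + x2 + x3 >= 2 else 0)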

Artificial Neural Networks (ANN)

• Model is an assembly of inter-connected nodes and weighted links

• The output node sums up its input values according to the weights of its links

• Compare output node against some threshold t

[Figure: perceptron black box with input nodes X1, X2, X3, weights w1, w2, w3 on the links to the output node Y, and threshold t]

Perceptron Model:

Y = I(Σi wi Xi - t > 0)   or   Y = sign(Σi wi Xi - t)
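A minimal sketch of this general perceptron rule in Python (the function and variable names are illustrative, not from the slides):

    def perceptron_predict(x, w, t):
        # Weighted sum of the inputs minus the threshold t
        s = sum(wi * xi for wi, xi in zip(w, x)) - t
        # Step activation I(s > 0); a sign() activation would return +1/-1 instead
        return 1 if s > 0 else 0

    # The earlier example: weights 0.3, 0.3, 0.3 and threshold 0.4
    print(perceptron_predict([1, 0, 1], [0.3, 0.3, 0.3], 0.4))  # prints 1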

General Structure of ANN

[Figure: single neuron i with inputs I1, I2, I3, weights wi1, wi2, wi3, and threshold t; the weighted sum Si is passed through the activation function g to give the output Oi = g(Si)]

[Figure: feed-forward network with an input layer (x1, x2, x3, x4, x5), a hidden layer, and an output layer producing y]

Training an ANN means learning the weights of the neurons
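To make the structure above concrete, here is a minimal forward-pass sketch for a single neuron i, assuming a sigmoid activation g (the activation choice and the names are mine; the slides only show a generic g):

    import math

    def neuron_output(inputs, weights, threshold):
        # S_i: weighted sum of the inputs minus the threshold t
        s = sum(w * x for w, x in zip(weights, inputs)) - threshold
        # O_i = g(S_i), here with a sigmoid activation function
        return 1.0 / (1.0 + math.exp(-s))

    # Example: three inputs, three weights, threshold 0.4 (values are illustrative)
    print(neuron_output([1, 0, 1], [0.3, 0.3, 0.3], 0.4))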

Algorithm for learning ANN

• Initialize the weights (w0, w1, …, wk)

• Adjust the weights so that the output of the ANN is consistent with the class labels of the training examples

  – Objective function: E = Σi [Yi - f(wi, Xi)]²

  – Find the weights wi that minimize the above objective function

    • e.g., the backpropagation algorithm
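As an illustration of minimizing this squared-error objective, here is a minimal gradient-descent sketch for a single linear unit (the learning rate, epoch count, and names are my own assumptions, not from the slides):

    def train_linear_unit(X, Y, lr=0.01, epochs=100):
        # X: list of input vectors, Y: list of target outputs
        w = [0.0] * len(X[0])          # initialize weights (small values)
        for _ in range(epochs):
            for x, y in zip(X, Y):
                pred = sum(wi * xi for wi, xi in zip(w, x))
                err = y - pred
                # Gradient step that reduces E = sum_i (Y_i - f(w, X_i))^2
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        return w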

Classification by Backpropagation

• Backpropagation: a neural network learning algorithm

• Started by psychologists and neurobiologists to develop and test computational analogues of neurons

• A neural network: a set of connected input/output units where each connection has a weight associated with it

• During the learning phase, the network learns by adjusting the weights so as to be able to predict the correct class label of the input tuples

• Also referred to as connectionist learning due to the connections between units

Neural Network as a Classifier

• Weakness
  – Long training time
  – Requires a number of parameters that are typically best determined empirically, e.g., the network topology or "structure"
  – Poor interpretability: difficult to interpret the symbolic meaning behind the learned weights and of the "hidden units" in the network

• Strength
  – High tolerance to noisy data
  – Ability to classify untrained patterns
  – Well suited for continuous-valued inputs and outputs
  – Successful on a wide array of real-world data
  – Algorithms are inherently parallel
  – Techniques have recently been developed for the extraction of rules from trained neural networks

Backpropagation

• Iteratively process a set of training tuples and compare the network's prediction with the actual known target value

• For each training tuple, the weights are modified to minimize the mean squared error between the network's prediction and the actual target value

• Modifications are made in the "backwards" direction: from the output layer, through each hidden layer, down to the first hidden layer, hence "backpropagation"

• Steps (a sketch follows this list)
  – Initialize weights (to small random numbers) and biases in the network
  – Propagate the inputs forward (by applying the activation function)
  – Backpropagate the error (by updating weights and biases)
  – Terminating condition (when the error is very small, etc.)
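A minimal sketch of these steps for a tiny network with one hidden layer of sigmoid units (layer sizes, learning rate, and names are illustrative assumptions; biases are omitted for brevity):

    import math, random

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def train_step(x, target, w_hidden, w_out, lr=0.1):
        # Propagate the inputs forward: input -> hidden -> output
        h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_hidden]
        o = sigmoid(sum(w * hi for w, hi in zip(w_out, h)))
        # Backpropagate the error: output delta, then hidden deltas
        delta_o = (target - o) * o * (1 - o)
        delta_h = [delta_o * w_out[j] * h[j] * (1 - h[j]) for j in range(len(h))]
        # Update the weights (gradient descent on the squared error)
        w_out = [w + lr * delta_o * h[j] for j, w in enumerate(w_out)]
        w_hidden = [[w + lr * delta_h[j] * x[i] for i, w in enumerate(ws)]
                    for j, ws in enumerate(w_hidden)]
        return w_hidden, w_out

    # Initialize weights to small random numbers, as the slides suggest
    random.seed(0)
    w_hidden = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
    w_out = [random.uniform(-0.5, 0.5) for _ in range(2)]
    w_hidden, w_out = train_step([1, 0, 1], 1, w_hidden, w_out)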

Software

• Excel + VBA

• SPSS Clementine

• SQL Server

• Programming

• Other statistical packages …