Neural networks for data mining. Eric Postma, MICC-IKAT, Universiteit Maastricht.

Page 1: Neural networks for data mining Eric Postma MICC-IKAT Universiteit Maastricht.

Neural networks for data mining

Eric Postma

MICC-IKAT

Universiteit Maastricht

Page 2:

Overview

Introduction: The biology of neural networks
• the biological computer

• brain-inspired models

• basic notions

Interactive neural-network demonstrations
• Perceptron

• Multilayer perceptron

• Kohonen’s self-organising feature map

• Examples of applications

Page 3:

A typical AI agent

Page 4:

Two types of learning

• Supervised learning: curve fitting, surface fitting, ...

• Unsupervised learning: clustering, visualisation, ...

Page 5:

An input-output function

Page 6:

Fitting a surface to four points

Page 8:

Classification

Page 9:

The history of neural networks

• A powerful metaphor

• Several decades of theoretical analyses led to the formalisation in terms of statistics

• Bayesian framework

• We discuss neural networks from the original metaphorical perspective

Page 10:

(Artificial) neural networks

The digital computer versus the neural computer

Page 11:

The Von Neumann architecture

Page 12:

The biological architecture

Page 13:

Digital versus biological computers

5 distinguishing properties:
• speed
• robustness
• flexibility
• adaptivity
• context-sensitivity

Page 14:

Speed: the “hundred time steps” argument

The critical resource that is most obvious is time. Neurons whose basic computational speed is a few milliseconds must be made to account for complex behaviors which are carried out in a few hundred milliseconds (Posner, 1978). This means that entire complex behaviors are carried out in less than a hundred time steps.

Feldman and Ballard (1982)

Page 15:

Graceful Degradation

[figure: performance plotted as a function of damage]

Page 16:

Flexibility: the Necker cube

Page 17:

vision = constraint satisfaction

Page 18:

And sometimes plain search…

Page 19:

Adaptivity

processing implies learning in biological computers

versus

processing does not imply learning in digital computers

Page 20:

Context-sensitivity: patterns

emergent properties

Page 21:

Robustness and context-sensitivity: coping with noise

Page 22:

The neural computer

• Is it possible to develop a model after the natural example?

• Brain-inspired models: models based on a restricted set of structural and functional properties of the (human) brain

Page 23:

The Neural Computer (structure)

Page 24:

Neurons, the building blocks of the brain

Page 26:

Neural activity

[figure: input and output activity of a neuron]

Page 27:

Synapses, the basis of learning and memory

Page 28:

Learning: Hebb’s rule

[figure: neuron 1 connected to neuron 2 via a synapse]
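
Hebb’s rule says that a synapse is strengthened when the neurons on both sides of it are active at the same time. A minimal sketch of one formulation (the learning rate eta and the activity values are illustrative, not from the slides):

```python
# Hebbian update: the weight grows in proportion to the product of
# pre- and postsynaptic activity. Names (eta, pre, post) are illustrative.

def hebb_update(w, pre, post, eta=0.1):
    """Return the weight after one Hebbian step: w + eta * pre * post."""
    return w + eta * pre * post

w = 0.0
# Correlated activity strengthens the synapse...
for _ in range(5):
    w = hebb_update(w, pre=1.0, post=1.0)
# ...while presynaptic silence leaves it unchanged.
w_after_silence = hebb_update(w, pre=0.0, post=1.0)
```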

Page 29:

Forgetting in neural networks

Page 30:

Towards neural networks

Page 31:

Connectivity

An example: the visual system is a feedforward hierarchy of neural modules

Every module is (to a certain extent) responsible for a certain function

Page 32:

(Artificial) Neural Networks

• Neurons
  • activity
  • nonlinear input-output function

• Connections
  • weight

• Learning
  • supervised
  • unsupervised

Page 33:

Artificial Neurons

• input (vectors)
• summation (excitation)
• output (activation)

Page 34:

Input-output function

• nonlinear (sigmoid) function:

  f(x) = 1 / (1 + e^(-x/a))

[figure: sigmoid f(e) plotted against the excitation e, rising from 0 to 1, with steepness set by the parameter a]

Page 35:

Artificial Connections (Synapses)

• w_AB: the weight of the connection from neuron A to neuron B

[figure: neuron A connected to neuron B via weight w_AB]

Page 36:

The Perceptron

Page 37:

Learning in the Perceptron

• Delta learning rule: based on the difference between the desired output t and the actual output o, given input x

• Global error E: a function of the differences between the desired and actual outputs
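
The delta rule above can be sketched for a single threshold unit: adjust each weight in proportion to (t - o) times the input. The learning rate and the toy AND task are illustrative choices, not from the slides:

```python
# Minimal perceptron with delta-rule learning on logical AND
# (a linearly separable problem, so convergence is guaranteed).

def predict(w, b, x):
    """Threshold unit: output 1 if w.x + b > 0, else 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train_perceptron(data, eta=1.0, epochs=20):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, t in data:
            o = predict(w, b, x)
            delta = t - o                                   # desired minus actual
            w = [wi + eta * delta * xi for wi, xi in zip(w, x)]
            b += eta * delta
    return w, b

and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(and_data)
```

After training, the learned line separates (1, 1) from the other three points.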

Page 38:

Gradient Descent

Page 39:

Linear decision boundaries

Page 40:

Minsky and Papert’s connectedness argument

Page 41:

The history of the Perceptron

• Rosenblatt (1958)

• Minsky & Papert (1969)

• Rumelhart & McClelland (1986)

Page 42:

The multilayer perceptron

[figure: input layer, one or more hidden layers, output layer]

Page 43:

Training the MLP

• supervised learning
• each training pattern: input + desired output
• in each epoch: present all patterns
• at each presentation: adapt weights
• after many epochs: convergence to a local minimum
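
The epoch/presentation loop above can be sketched with a tiny 2-2-1 network trained by plain backpropagation. The architecture, learning rate, deterministic initial weights, and the logical-OR task are all illustrative choices:

```python
import math

# A minimal MLP (2 inputs, 2 hidden units, 1 output) trained online:
# in each epoch all patterns are presented, and the weights are adapted
# at each presentation. All numbers here are illustrative.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Small deterministic initial weights; the last entry of each row is a bias.
w_hidden = [[0.1, -0.2, 0.05], [-0.15, 0.2, 0.1]]
w_out = [0.1, -0.1, 0.05]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
    o = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
    return h, o

patterns = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]  # logical OR

def mse():
    return sum((t - forward(x)[1]) ** 2 for x, t in patterns) / len(patterns)

mse_before = mse()
eta = 0.5
for epoch in range(2000):                 # one epoch = present all patterns
    for x, t in patterns:
        h, o = forward(x)
        # Backward pass: output delta, then hidden deltas.
        d_out = (t - o) * o * (1 - o)
        d_hid = [d_out * w_out[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Adapt weights at each presentation.
        for j in range(2):
            w_out[j] += eta * d_out * h[j]
        w_out[2] += eta * d_out
        for j in range(2):
            w_hidden[j][0] += eta * d_hid[j] * x[0]
            w_hidden[j][1] += eta * d_hid[j] * x[1]
            w_hidden[j][2] += eta * d_hid[j]
mse_after = mse()
```

Gradient descent drives the mean squared error down towards a (local) minimum, as the last bullet above describes.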

Page 44:

Phoneme recognition with an MLP

input: frequencies
output: pronunciation

Page 45:

Non-linear decision boundaries

Page 46:

Compression with an MLP: the autoencoder

Page 47:

hidden representation

Page 49:

Learning in the MLP

Page 52:

Preventing Overfitting

GENERALISATION = performance on test set

• Early stopping
• Training, test, and validation set
• k-fold cross-validation
  • leaving-one-out procedure
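
The k-fold idea above can be sketched as an index-splitting routine: each of the k folds serves once as the test set while the remaining folds form the training set. With k equal to the number of examples this becomes the leaving-one-out procedure. Function names here are illustrative:

```python
# k-fold cross-validation index splitting (illustrative sketch).

def k_fold_splits(n_examples, k):
    indices = list(range(n_examples))
    folds = [indices[i::k] for i in range(k)]     # k roughly equal folds
    for i, test in enumerate(folds):
        train = [idx for j, f in enumerate(folds) if j != i for idx in f]
        yield train, test

splits = list(k_fold_splits(10, 5))               # 5 train/test partitions
```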

Page 53:

Image Recognition with the MLP

Page 55:

Hidden Representations

Page 56:

Other Applications

• Practical
  • OCR
  • financial time series
  • fraud detection
  • process control
  • marketing
  • speech recognition

• Theoretical
  • cognitive modeling
  • biological modeling

Page 57:

Some mathematics…

Page 59:

Derivation of the delta learning rule

[figure: derivation relating the target output t and the actual output o]
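
The derivation itself did not survive extraction. A sketch of the standard delta-rule derivation, consistent with the notation t (target output), o (actual output) and x (input) used earlier, assuming a linear output unit:

```latex
% Error for one pattern, summed over output units k:
E = \tfrac{1}{2} \sum_k (t_k - o_k)^2 , \qquad o_k = \sum_i w_{ki}\, x_i
% Gradient descent on E with learning rate \eta gives the delta rule:
\Delta w_{ki} = -\eta \frac{\partial E}{\partial w_{ki}}
             = \eta \,(t_k - o_k)\, x_i
```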

Page 61:

Sigmoid function

• May also be the tanh function (range (-1, +1) instead of (0, 1))

• Derivative: f’(x) = f(x) [1 – f(x)]
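
The convenient derivative identity above (which holds for the logistic sigmoid with steepness a = 1) can be checked numerically against a finite difference:

```python
import math

# Verify f'(x) = f(x) * (1 - f(x)) for the logistic sigmoid (a = 1)
# by comparing with a central finite difference at an arbitrary point.

def f(x):
    return 1.0 / (1.0 + math.exp(-x))

def f_prime(x):
    return f(x) * (1.0 - f(x))

eps = 1e-6
numeric = (f(0.3 + eps) - f(0.3 - eps)) / (2 * eps)
```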

Page 62:

Derivation of the generalized delta rule

Page 63:

Error function (LMS)

Page 64:

Adaptation of the hidden-output weights

Page 65:

Adaptation of the input-hidden weights
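
The update equations on these two slides were lost in extraction. A sketch of the standard generalized delta rule (backpropagation) updates, assuming sigmoid units with f’ = f(1 – f), hidden activations h, and learning rate η:

```latex
% Hidden-to-output weights (output unit k, hidden unit j):
\delta_k = (t_k - o_k)\, f'(\mathrm{net}_k), \qquad
\Delta w_{kj} = \eta\, \delta_k\, h_j
% Input-to-hidden weights (hidden unit j, input i): the error is
% propagated back through the outgoing weights of unit j:
\delta_j = f'(\mathrm{net}_j) \sum_k \delta_k\, w_{kj}, \qquad
\Delta w_{ji} = \eta\, \delta_j\, x_i
```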

Page 66:

Forward and Backward Propagation

Page 67:

Decision boundaries of Perceptrons

Straight lines (surfaces); linearly separable classes

Page 68:

Decision boundaries of MLPs

Convex areas (open or closed)

Page 69:

Decision boundaries of MLPs

Combinations of convex areas

Page 70:

Learning and representing similarity

Page 71:

Alternative conception of neurons

• Neurons do not take the weighted sum of their inputs (as in the perceptron), but measure the similarity of the weight vector to the input vector

• The activation of the neuron is a measure of similarity: the more similar the weight is to the input, the higher the activation

• Neurons represent “prototypes”
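
The prototype view above can be sketched by scoring similarity as a Gaussian of the Euclidean distance between weight and input vectors. The Gaussian is one possible similarity measure, chosen here for illustration:

```python
import math

# Prototype neuron: activation is highest when the input matches the
# weight vector, and falls off with distance. Illustrative sketch.

def similarity_activation(w, x):
    dist2 = sum((wi - xi) ** 2 for wi, xi in zip(w, x))
    return math.exp(-dist2)       # exactly 1.0 when input equals prototype

prototype = [0.5, 0.2]
a_match = similarity_activation(prototype, [0.5, 0.2])   # identical input
a_far = similarity_activation(prototype, [3.0, -2.0])    # distant input
```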

Page 72:

Coarse Coding

Page 73:

Second-order isomorphism

Page 74:

Prototypes for preprocessing

Page 75:

Kohonen’s SOFM (Self-Organizing Feature Map)

• Unsupervised learning
• Competitive learning

[figure: n-dimensional input feeding an output map; the winner neuron is highlighted]

Page 76:

Competitive learning

• Determine the winner (the neuron whose weight vector has the smallest distance to the input vector)

• Move the weight vector w of the winning neuron towards the input i

[figure: before learning, the weight vector w lies away from the input i; after learning, w has moved towards i]
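
The two steps above can be sketched directly: pick the winner by smallest squared distance, then move its weight vector a fraction eta of the way towards the input. The two-neuron setup and eta are illustrative:

```python
# Competitive learning step: winner-take-all plus weight update.

def winner(weights, x):
    def dist2(w):
        return sum((wi - xi) ** 2 for wi, xi in zip(w, x))
    return min(range(len(weights)), key=lambda i: dist2(weights[i]))

def update(weights, x, eta=0.5):
    i = winner(weights, x)
    weights[i] = [wi + eta * (xi - wi) for wi, xi in zip(weights[i], x)]
    return i

weights = [[0.0, 0.0], [1.0, 1.0]]
i = update(weights, [0.9, 1.1])   # the second neuron is closer, so it wins
```

Only the winner moves; the losing neuron's weights are untouched.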

Page 77:

Kohonen’s idea

• Impose a topological order onto the competitive neurons (e.g., a rectangular map)

• Let neighbours of the winner share the “prize” (the “postcode lottery” principle)

• After learning, neurons with similar weights tend to cluster on the map

Page 78:

Biological inspiration

Page 79:

Topological order: neighbourhoods

• Square: winner (red) and its nearest neighbours
• Hexagonal: winner (red) and its nearest neighbours

Page 82:

A simple example

• A topological map of 2 x 3 neurons and two inputs

[figure: 2D input, the weights, and their visualisation on the map]

Page 83:

Weights before training

Page 84:

Input patterns (note the 2D distribution)

Page 85:

Weights after training

Page 86:

Another example

• Input: uniformly randomly distributed points

• Output: map of 20 x 20 neurons

• Training: starting with a large learning rate and neighbourhood size, both are gradually decreased to facilitate convergence
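
The training schedule above (large learning rate and neighbourhood at the start, both gradually decreased) can be sketched with a small one-dimensional map; the chain length, decay schedules, and Gaussian neighbourhood are illustrative choices:

```python
import math
import random

# A compact SOFM sketch: a 1-D chain of neurons trained on points from
# the unit interval, with the learning rate and neighbourhood width
# both decreased gradually during training.

random.seed(0)
n_neurons = 10
weights = [random.random() for _ in range(n_neurons)]   # 1-D weight per neuron

n_steps = 2000
for t in range(n_steps):
    x = random.random()
    frac = t / n_steps
    eta = 0.5 * (1 - frac) + 0.01          # learning rate decays
    sigma = 3.0 * (1 - frac) + 0.5         # neighbourhood shrinks
    win = min(range(n_neurons), key=lambda i: abs(weights[i] - x))
    for i in range(n_neurons):
        # Neighbours of the winner share the update ("postcode lottery").
        g = math.exp(-((i - win) ** 2) / (2 * sigma ** 2))
        weights[i] += eta * g * (x - weights[i])
```

After training, the weights spread out over the input interval, so the map reflects the uniform input distribution.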

Page 87:

Weights visualisation

Page 88:

Dimension reduction

[figure: 3D input mapped onto a 2D output map]

Page 89:

Adaptive resolution

[figure: 2D input and 2D output map]

Page 90:

Output map representation

Page 91:

Application of SOFM

[figure: examples (input) and the SOFM after training (output)]

Page 92:

Visual features (biologically plausible)

Page 93:

Face Classification

Page 94:

Colour classification

Page 95:

Car classification

Page 96:

Relation with statistical methods 1

• Principal Components Analysis (PCA)

[figure: data points with principal axes pca1 and pca2, and projections of the data onto them]
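
PCA, as pictured above, projects the data onto the directions of largest variance. For 2-D data the first principal component can be found in closed form from the 2 x 2 covariance matrix; the toy data set below is illustrative:

```python
import math

# First principal component of 2-D data via the closed-form leading
# eigenvector of a symmetric 2 x 2 covariance matrix (sketch).

data = [(1.0, 0.9), (2.0, 2.1), (3.0, 2.9), (4.0, 4.2), (5.0, 4.9)]
n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n

# Covariance matrix entries [[cxx, cxy], [cxy, cyy]].
cxx = sum((x - mx) ** 2 for x, _ in data) / n
cyy = sum((y - my) ** 2 for _, y in data) / n
cxy = sum((x - mx) * (y - my) for x, y in data) / n

# Largest eigenvalue and its eigenvector (valid here since cxy != 0).
lam = (cxx + cyy) / 2 + math.sqrt(((cxx - cyy) / 2) ** 2 + cxy ** 2)
v = (cxy, lam - cxx)
norm = math.hypot(*v)
pc1 = (v[0] / norm, v[1] / norm)

# 1-D projections of the centred data onto the first component (pca1).
proj = [(x - mx) * pc1[0] + (y - my) * pc1[1] for x, y in data]
```

The projections preserve the order of the points along the dominant diagonal trend of the data.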

Page 97:

Relation with statistical methods 2

• Multi-Dimensional Scaling (MDS)
• Sammon Mapping

Distances in high-dimensional space
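
Like the SOFM, the Sammon mapping places points in a low-dimensional space so that the distances there reflect the distances in the high-dimensional space. A sketch of its stress function, where d*_ij are the original distances and d_ij the low-dimensional ones:

```latex
E = \frac{1}{\sum_{i<j} d^{*}_{ij}} \sum_{i<j}
    \frac{\left(d^{*}_{ij} - d_{ij}\right)^{2}}{d^{*}_{ij}}
```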