
Classification with Subsequent Artificial Neural Networks

Linder, Dew, Sudhoff, Theegarten, Remberger, Pöppl, Wagner

Brian Selinsky

Outline

• Terminology

• SANN vs Other ANN approaches

• SANN vs All Pairs

• Results

Terminology

• Clustering
  – Dimensionality

• NN vs. ANN

• Neuron

• Synapse

• Thresholds

• Weights

• Training
  – Learning Rate
  – Backpropagation

• Input
  – Standardization
  – Normalization

• Hidden Layers

• Approaches
  – MLP
  – One vs. All
  – All Pairs
  – SANN

• Tools

Clustering

[Scatter plot: two groups of points, O and X, forming two clusters]

Clustering

[Scatter plot: three groups of points, O, X, and Y, forming three clusters]

Clustering

[Scatter plot: the same three groups, O, X, and Y]

Dimensionality

• Inputs of interest

• Hyperplanes

• Dr. Frisina's data
  – 4 clusters
  – 22,690 inputs of interest
  – 22,690-dimensional data
  – 3 or 4 hyperplanes, each of 22,689 dimensions

Dimensionality

• A neural net collapses all of the input dimensions down to a single number!

[Number line: outputs fall between -1 and 1]
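To make that concrete, a minimal Python sketch (the data and weights are randomly generated for illustration; only the 22,690-input size comes from the slides): a single weighted sum collapses the whole input vector to one scalar, and tanh squeezes it into the -1 to 1 range.

import numpy as np

rng = np.random.default_rng(0)

n_inputs = 22690                      # dimensionality from the slide
x = rng.normal(size=n_inputs)         # one invented input sample
w = rng.normal(size=n_inputs) * 0.01  # one weight per input dimension

# The weighted sum collapses 22,690 dimensions to a single scalar,
# and tanh squeezes that scalar into the interval (-1, 1).
output = np.tanh(w @ x)
print(output)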

ANN vs. NN

• Semantics

• Artificial Neural Net
  – Meant to simulate how the brain functions
  – The brain is a network of neurons
  – The brain is the natural neural net

• I use NN

Neural Net

[Diagram: inputs enter a "Neural Net" black box (some magic happens here), which emits Class 1, Class 2, and Class 3]

Neuron

[Diagram: inputs are combined in a "calculate summation" step, the sum is compared to a threshold, and the result feeds the Class 1, Class 2, and Class 3 outputs]
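In code, the two stages in the diagram amount to a weighted summation followed by a threshold comparison. A minimal Python sketch with invented inputs, weights, and threshold:

import numpy as np

def neuron(inputs, weights, threshold):
    # Calculate the summation of weighted inputs, then compare it to the threshold.
    summation = np.dot(weights, inputs)
    return 1 if summation >= threshold else 0   # fire / don't fire

# Illustrative values only.
x = np.array([0.2, 0.7, 0.1])
w = np.array([0.5, -0.3, 0.8])
print(neuron(x, w, threshold=0.0))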

Neural Net

[Diagram: a network of neuron nodes (N) feeding a compare-to-threshold stage (C T), with outputs C1, C2, and C3]

Neural Net

[Diagram: the same network, annotated with Inputs & Processing, Learning, and Training]

Neural Net

[Diagram: the same network, annotated with Inputs & Processing, Learning, and the Training Set]

What gets trained

• Threshold
  – Categorization

• Weight
  – Impact of an input on a neuron
  – Proportionality

• Learning Rate
  – Effect on weights
  – Effect on speed of training

How? - Backpropagation

[Diagram: the same neuron network (N nodes, a C T compare-to-threshold stage, outputs C1–C3), annotated with Inputs & Processing, Learning, and the Training Set]
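As a toy-sized illustration of what backpropagation adjusts (not the paper's actual networks), the Python sketch below trains a single sigmoid unit: the output error is pushed back through the activation, and the learning rate scales how far the weights and the bias/threshold move on each pass over the training set. All data and sizes are invented.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))          # 8 toy samples with 4 inputs each
y = (X[:, 0] > 0).astype(float)      # toy binary targets
w = rng.normal(size=4) * 0.1         # weights (what gets trained)
b = 0.0                              # bias / threshold (also trained)
learning_rate = 0.5                  # scales each update step

for epoch in range(500):
    out = sigmoid(X @ w + b)                     # forward pass
    error = out - y                              # output error
    grad = error * out * (1.0 - out)             # backpropagate through the sigmoid
    w -= learning_rate * (X.T @ grad) / len(y)   # adjust weights
    b -= learning_rate * grad.mean()             # adjust bias/threshold

print(np.round(sigmoid(X @ w + b)), y)           # predictions vs. targets

A larger learning rate moves the weights faster but can overshoot; a smaller one is steadier but slower, which is the trade-off the Learning Rate bullet points at.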

Input Data

• Data 1: range 12,000 – 500,000

• Data 2: range 1.0 – 1.5

• Standardizing or normalizing inputs on such different scales keeps the learned weights consistent and improves accuracy
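For example, with two inputs on scales as different as those above, z-score standardization or min-max normalization puts them on a comparable footing before training. A small Python sketch with invented sample values:

import numpy as np

# Two inputs on wildly different scales, as on the slide (values invented).
data1 = np.array([12000.0, 250000.0, 380000.0, 500000.0])
data2 = np.array([1.0, 1.2, 1.4, 1.5])

def standardize(x):
    # Z-score: zero mean, unit standard deviation.
    return (x - x.mean()) / x.std()

def normalize(x):
    # Min-max: rescale into [0, 1].
    return (x - x.min()) / (x.max() - x.min())

print(standardize(data1), standardize(data2))
print(normalize(data1), normalize(data2))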

Approaches

• Multi-layer Perceptron (MLP)

• Subdividing the problem
  – One vs. All
  – All Pairs
  – SANN

One vs. All

[Diagram: three ANNs, one per class, each separating its class from the rest: Class A vs. Not Class A, Class B vs. Not Class B, Class C vs. Not Class C]
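One way to wire this up, sketched in Python with simple stubs standing in for the three trained networks (the function names and confidence values are invented): score the sample with every class-vs-rest ANN and keep the strongest response.

# Stand-ins for three trained one-vs-all ANNs; each returns an invented
# confidence that the sample belongs to its class rather than the rest.
def ann_a(x): return 0.20   # Class A vs. Not Class A
def ann_b(x): return 0.75   # Class B vs. Not Class B
def ann_c(x): return 0.40   # Class C vs. Not Class C

def one_vs_all(x, anns):
    # Score the sample with every class-vs-rest network, keep the strongest.
    scores = {name: ann(x) for name, ann in anns.items()}
    return max(scores, key=scores.get), scores

print(one_vs_all(None, {"A": ann_a, "B": ann_b, "C": ann_c}))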

All Pairs

[Diagram: three ANNs, one per pair of classes: Class A vs. Class B, Class A vs. Class C, Class B vs. Class C; their decisions are combined into the final Class A / Class B / Class C call]
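The slide does not spell out how the pairwise decisions are combined; a common choice is majority voting, sketched below in Python with stubs in place of trained ANNs (the pairwise outcomes are invented):

from collections import Counter

# Stubs standing in for trained pairwise ANNs; each returns the class it
# prefers for the sample (illustrative outcomes only).
def ann_a_vs_b(x): return "B"
def ann_a_vs_c(x): return "A"
def ann_b_vs_c(x): return "B"

def all_pairs(x, pairwise_anns):
    # Every pair of classes gets a vote; the most-voted class wins.
    votes = Counter(ann(x) for ann in pairwise_anns)
    return votes.most_common(1)[0][0], votes

print(all_pairs(None, [ann_a_vs_b, ann_a_vs_c, ann_b_vs_c]))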

SANN

[Diagram: a chain of pairwise ANNs. ANN A vs B: Class A .12, Class B .88. ANN A vs C: Class C .91. ANN B vs C: Class B .90, Class C .89. Final values: Class A .12, Class B .90, Class C .89]
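One reading of the diagram is that the pairwise ANNs run in sequence: the first network decides A vs. B, and only the winner is carried into the next pairwise network, so the losing branch never has to be evaluated. A hedged Python sketch of that sequential idea, with the slide's confidence values hard-coded in place of real networks (any value not on the slide is invented):

# Pairwise stubs returning fixed confidences; .12/.88/.90/.89/.91 echo the slide.
def ann_a_vs_b(x): return {"A": 0.12, "B": 0.88}
def ann_b_vs_c(x): return {"B": 0.90, "C": 0.89}
def ann_a_vs_c(x): return {"A": 0.09, "C": 0.91}   # branch not taken here

def sann(x):
    final = {}
    scores = ann_a_vs_b(x)                 # first subsequent ANN: A vs. B
    final.update(scores)
    winner = max(scores, key=scores.get)   # "B" with these numbers
    # Only the winner is carried into the next pairwise ANN.
    scores = (ann_b_vs_c if winner == "B" else ann_a_vs_c)(x)
    final.update(scores)
    return max(final, key=final.get), final

print(sann(None))

With these numbers the final values match the slide: Class A .12, Class B .90, Class C .89.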

Results

• Increased data & nodes
  – Increased noise

• Subdividing NNs increases accuracy

• All Pairs vs. SANN
  – All Pairs more accurate
  – SANN faster

Tools (FYI)

• MATLAB (Neural Network Toolbox)
  – On the CS system (Unix and Windows)

• NeuroSolutions
  – 60-day free trial (Windows)

• Joone
  – Free (platform independent)