Networks of Neural Computation: Self-Organising Networks


CS 476: Networks of Neural Computation
WK6: Self-Organising Networks

Dr. Stathis Kasderidis
Dept. of Computer Science
University of Crete
Spring Semester


    Contents

* Introduction
* Self-Organising Map model
* Properties of SOM
* Examples
* Learning Vector Quantisation
* Conclusions


    Introduction

* We will present a special class of NN which is called a self-organising map.
* Their main characteristics are:
  * There is competitive learning among the neurons of the output layer, i.e. on the presentation of an input pattern only one neuron wins the competition (this is called a winner);
  * The neurons are placed in a lattice, usually 2D;
  * The neurons are selectively tuned to various input patterns.


Introduction-1

* The locations of the neurons so tuned become ordered with respect to each other in such a way that a meaningful coordinate system for different input features is created over the lattice.
* In summary: a self-organising map is characterised by the formation of a topographic map of the input patterns, in which the spatial locations (i.e. coordinates) of the neurons in the lattice are indicative of intrinsic statistical features contained in the input patterns.


Introduction-2

* The motivation for the development of this model is due to the existence of topologically ordered computational maps in the human brain.
* A computational map is defined by an array of neurons representing slightly differently tuned processors, which operate on the sensory information signals in parallel.
* Consequently, the neurons transform input signals into a place-coded probability distribution that represents the computed values of parameters by sites of maximum relative activity within the map.


Introduction-3

* There are two different models for the self-organising map:
  * The Willshaw-von der Malsburg model;
  * The Kohonen model.
* In both models the output neurons are placed in a 2D lattice.
* They differ in the way the input is given:
  * In the Willshaw-von der Malsburg model the input is also a 2D lattice with an equal number of neurons;
  * In the Kohonen model there is no such restriction: the input can be a vector of arbitrary dimension.


Introduction-4

* Schematically the models are shown below:

[Figure: the Willshaw-von der Malsburg model]


Introduction-5

[Figure: the Kohonen model]


Introduction-6

* The model of Willshaw & von der Malsburg was proposed as an effort to explain the retinotopic mapping from the retina to the visual cortex.
* Two layers of neurons, with each input neuron fully connected to the output neuron layer.
* The output neurons have connections of two types among them:
  * Short-range excitatory ones;
  * Long-range inhibitory ones.
* Connections from input to output are modifiable and are of Hebbian type.


Introduction-7

* The total weight associated with a postsynaptic neuron is bounded. As a result, some incoming connections are increased while others decrease. This is needed in order to achieve stability of the network, which would otherwise be lost due to ever-increasing values of the synaptic weights.
* The number of input neurons is the same as the number of output neurons.


Introduction-8

* The Kohonen model is a more general version of the Willshaw-von der Malsburg model.
* It allows for compression of information. It belongs to a class of vector-coding algorithms, i.e. it provides a topological mapping that optimally places a fixed number of vectors into a higher-dimensional space and thereby facilitates data compression.


Self-Organising Map

* The main goal of the SOM is to transform an incoming pattern of arbitrary dimension into a one- or two-dimensional discrete map, and to perform this transformation adaptively in a topologically ordered fashion.
* Each output neuron is fully connected to all the source nodes in the input layer.
* This network represents a feedforward structure with a single computational layer consisting of neurons arranged in a 2D or 1D grid. Higher dimensions (> 2D) are possible but not used very often. The grid topology can be square, hexagonal, etc.
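To make the architecture concrete, here is a minimal sketch in Python/NumPy (illustrative only, not code from the lecture) of such a lattice: one m-dimensional weight vector per grid position, fully connected to the input. The function name and the small random initialisation are assumptions.

    import numpy as np

    def init_som(rows, cols, m, seed=0):
        # One weight vector of dimension m per neuron of a rows x cols lattice,
        # fully connected to the m-dimensional input.
        rng = np.random.default_rng(seed)
        return 0.01 * rng.standard_normal((rows, cols, m))

    weights = init_som(rows=10, cols=10, m=3)  # e.g. a 10x10 map, 3-dimensional inputs
    print(weights.shape)                       # (10, 10, 3)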


Self-Organising Map-1

* An input pattern to the SOM network represents a localised region of "activity" against a quiet background.
* The location and nature of such a "spot" usually varies from one input pattern to another. All the neurons in the network should therefore be exposed to a sufficient number of different realisations of the input signal, in order to ensure that the self-organisation process has the chance to mature properly.


Self-Organising Map-2

* The algorithm which is responsible for the self-organisation of the network is based on three complementary processes:
  * Competition;
  * Cooperation;
  * Synaptic Adaptation.
* We will examine next the details of each mechanism.


Self-Organising Map-3: Competitive Process

* Let m be the dimension of the input space. A pattern chosen randomly from the input space is denoted by:

  x = [x_1, x_2, ..., x_m]^T

* The synaptic weight vector of each neuron in the output layer has the same dimension as the input space. We denote the weight of neuron j as:

  w_j = [w_j1, w_j2, ..., w_jm]^T,  j = 1, 2, ..., l

  where l is the total number of neurons in the output layer.
* To find the best match of the input vector x with the synaptic weights w_j we use the Euclidean distance. The neuron with the smallest distance is called i(x) and is given by:


Self-Organising Map-4: Competitive Process

  i(x) = arg min_j ||x - w_j||,  j = 1, 2, ..., l

* The neuron i(x) that satisfies the above condition is called the best-matching or winning neuron for the input vector x.
* The above equation leads to the following observation: a continuous input space of activation patterns is mapped onto a discrete output space of neurons by a process of competition among the neurons in the network.
* Depending on the application, the response of the network can be either the index of the winning neuron in the lattice or the synaptic weight vector that is closest to the input vector.
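As a minimal sketch of this competitive step (illustrative Python/NumPy, not the lecture's code; find_winner is a made-up name), the winning neuron i(x) is simply the argmin of the Euclidean distances over the lattice:

    import numpy as np

    def find_winner(x, weights):
        # weights has shape (rows, cols, m); x has shape (m,).
        # Return the lattice coordinates (row, col) of the best-matching neuron i(x).
        dists = np.linalg.norm(weights - x, axis=2)      # Euclidean distance per neuron
        return np.unravel_index(np.argmin(dists), dists.shape)

    rng = np.random.default_rng(0)
    weights = rng.random((10, 10, 3))    # a 10x10 lattice of 3-dimensional weights
    x = rng.random(3)                    # a randomly drawn input pattern
    print(find_winner(x, weights))       # lattice position of the winner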


Self-Organising Map-5: Cooperative Process

* The winning neuron effectively locates the centre of a topological neighbourhood.
* From neurobiology we know that a winning neuron excites more than average the neurons that exist in its immediate neighbourhood, and inhibits more the neurons that are at longer distances.
* Thus we see that the neighbourhood should be a decreasing function of the lateral distance between the neurons.
* Only excited neurons are included in the neighbourhood, while inhibited neurons exist outside of the neighbourhood.


Self-Organising Map-6: Cooperative Process

* If d_{j,i} is the lateral distance between neurons i and j (assuming that i is the winner and it is located in the centre of the neighbourhood), and we denote by h_{j,i} the topological neighbourhood around neuron i, then h_{j,i} is a unimodal function of distance which satisfies the following two requirements:
  * The topological neighbourhood h_{j,i} is symmetric about the maximum point defined by d_{j,i} = 0; in other words, it attains its maximum value at the winning neuron i, for which the distance is zero.
  * The amplitude of the topological neighbourhood h_{j,i} decreases monotonically with increasing lateral distance d_{j,i}, decaying to zero as d_{j,i} goes to infinity.


Self-Organising Map-7: Cooperative Process

* A typical choice of h_{j,i} is the Gaussian function, which is translation invariant (i.e. independent of the location of the winning neuron):

  h_{j,i(x)} = exp( - d²_{j,i} / (2σ²) )

* The parameter σ is the "effective width" of the neighbourhood. It measures the degree to which excited neurons in the vicinity of the winning neuron participate in the learning process.


Self-Organising Map-8: Cooperative Process

* The distance between neurons is defined via the Euclidean metric. For example, for a 2D lattice we have:

  d²_{j,i} = ||r_j - r_i||²

  where the discrete vector r_j defines the position of the excited neuron j and r_i defines the position of the winning neuron in the lattice.
* Another characteristic feature of the SOM algorithm is that the size of the neighbourhood shrinks with time. This requirement is satisfied by making the width σ of the Gaussian function decrease with time.


Self-Organising Map-9: Cooperative Process

* A popular choice is the exponential decay described by:

  σ(n) = σ_0 exp( -n / τ_1 ),  n = 0, 1, 2, ...

  where σ_0 is the value of σ at the initialisation of the SOM algorithm and τ_1 is a time constant.
* Correspondingly, the neighbourhood function assumes a time-dependent form of its own:

  h_{j,i(x)}(n) = exp( - d²_{j,i} / (2σ²(n)) ),  n = 0, 1, 2, ...

* Thus, as time (i.e. the number of iterations) increases, the width decreases in an exponential manner and the neighbourhood shrinks appropriately.
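Purely as an illustration of the two formulas above (a sketch assuming a 2D lattice indexed by integer coordinates; the parameter values are placeholders), the time-dependent Gaussian neighbourhood could be computed as:

    import numpy as np

    def neighbourhood(winner, lattice_shape, n, sigma0=5.0, tau1=1000.0):
        # sigma(n) = sigma_0 * exp(-n / tau_1): the width shrinks as training proceeds.
        sigma = sigma0 * np.exp(-n / tau1)
        rows, cols = np.indices(lattice_shape)                # lattice coordinates r_j
        d2 = (rows - winner[0])**2 + (cols - winner[1])**2    # squared lateral distance d²_{j,i}
        # h_{j,i(x)}(n) = exp(-d² / (2 sigma(n)²)), maximal (= 1) at the winner.
        return np.exp(-d2 / (2.0 * sigma**2))

    h = neighbourhood(winner=(4, 7), lattice_shape=(10, 10), n=0)
    print(h[4, 7], h[0, 0])   # 1.0 at the winner, smaller further away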


Self-Organising Map-10: Adaptive Process

* The adaptive process modifies the weights of the network so as to achieve the self-organisation of the network.
* Only the winning neuron and the neurons inside its neighbourhood have their weights adapted. All the other neurons have no change in their weights.
* A method for deriving the weight update equations for the SOM model is based on a modified form of Hebbian learning: a forgetting term is added to the standard Hebbian weight equation.
* Let us assume that the forgetting term has the form g(y_j) w_j, where y_j is the response of neuron j and g(·) is a positive scalar function of y_j.


Self-Organising Map-11: Adaptive Process

* The only requirement for the function g(y_j) is that the constant term in its Taylor series expansion be zero when the activity is zero, i.e.:

  g(y_j) = 0 for y_j = 0

* The modified Hebbian rule for the weights of the output neurons is given by:

  Δw_j = η y_j x - g(y_j) w_j

  where η is the learning rate parameter of the algorithm.
* To satisfy the requirement of a zero constant term in the Taylor series, we choose the following form for the function g(y_j):


Self-Organising Map-12: Adaptive Process

  g(y_j) = η y_j

* We can simplify further by setting:

  y_j = h_{j,i(x)}

* Combining the previous equations we get:

  Δw_j = η h_{j,i(x)} (x - w_j)

* Finally, using a discrete representation for time, we can write:

  w_j(n+1) = w_j(n) + η(n) h_{j,i(x)}(n) (x - w_j(n))

* The above equation moves the weight vector of the winning neuron (and of the rest of the neurons in the neighbourhood) towards the input vector x. The rest of the neurons only get a fraction of the correction, though.
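A minimal sketch of one such discrete-time update (illustrative only; som_update and the stand-in neighbourhood values are assumptions): every neuron moves towards x by an amount scaled by the learning rate and by its own neighbourhood value, so neurons far from the winner receive only a small fraction of the correction.

    import numpy as np

    def som_update(weights, x, eta, h):
        # w_j(n+1) = w_j(n) + eta(n) * h_{j,i(x)}(n) * (x - w_j(n))
        # h has shape (rows, cols); broadcasting applies each neuron's scalar
        # neighbourhood value to every component of its weight vector.
        return weights + eta * h[:, :, None] * (x - weights)

    rng = np.random.default_rng(0)
    weights = rng.random((10, 10, 3))
    x = rng.random(3)
    h = np.exp(-rng.random((10, 10)))     # stand-in neighbourhood values for the demo
    weights = som_update(weights, x, eta=0.1, h=h)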


Self-Organising Map-13: Adaptive Process

* The algorithm leads to a topological ordering of the feature map in the input space, in the sense that neurons that are adjacent in the lattice tend to have similar synaptic weight vectors.
* The learning rate must also be time varying, as it should be for stochastic approximation. A suitable form is given by:

  η(n) = η_0 exp( -n / τ_2 ),  n = 0, 1, 2, ...

  where η_0 is an initial value and τ_2 is another time constant of the SOM algorithm.


Self-Organising Map-14: Adaptive Process

* The adaptive process can be decomposed into two phases:
  * A self-organising or ordering phase;
  * A convergence phase.
* We explain next the main characteristics of each phase.
* Ordering Phase: It is during this first phase of the adaptive process that the topological ordering of the weight vectors takes place. The ordering phase may take as many as 1000 iterations of the SOM algorithm, or more. One should choose carefully the learning rate and the neighbourhood function:


Self-Organising Map-15: Adaptive Process

* The learning rate should begin with a value close to 0.1; thereafter it should decrease gradually, but remain above 0.01. These requirements are satisfied by making the following choices:

  η_0 = 0.1,  τ_2 = 1000

* The neighbourhood function should initially include almost all neurons in the network, centred on the winning neuron i, and then shrink slowly with time. Specifically, during the ordering phase it is allowed to reduce to only a couple of neighbours, or to the winning neuron itself. Assuming a 2D lattice, we may set σ_0 equal to the "radius" of the lattice.


Self-Organising Map-16: Adaptive Process

* Correspondingly, we may set the time constant τ_1 as:

  τ_1 = 1000 / log σ_0

* Convergence phase: This second phase is needed to fine tune the feature map and therefore to provide an accurate statistical quantification of the input space. In general, the number of iterations needed for this phase is about 500 times the number of neurons in the lattice.
* For good statistical accuracy, the learning parameter must be maintained during this phase at a small value, on the order of 0.01.
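As a small illustration of these recommended choices (a sketch; only η_0 = 0.1, τ_2 = 1000, σ_0 = lattice radius and τ_1 = 1000 / log σ_0 come from the text, the rest is illustrative):

    import numpy as np

    def som_schedules(n, lattice_radius, eta0=0.1, tau2=1000.0):
        # Ordering-phase choices quoted above: eta_0 = 0.1, tau_2 = 1000,
        # sigma_0 = "radius" of the lattice, tau_1 = 1000 / log(sigma_0).
        sigma0 = float(lattice_radius)
        tau1 = 1000.0 / np.log(sigma0)
        eta = eta0 * np.exp(-n / tau2)       # eta(n) = eta_0 exp(-n / tau_2)
        sigma = sigma0 * np.exp(-n / tau1)   # sigma(n) = sigma_0 exp(-n / tau_1)
        return eta, sigma

    print(som_schedules(n=0, lattice_radius=5))      # (0.1, 5.0)
    print(som_schedules(n=1000, lattice_radius=5))   # eta ~ 0.037, sigma ~ 1.0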


Self-Organising Map-17: Adaptive Process

* The learning rate should not be allowed to go to zero; otherwise the network may get stuck in a metastable state (i.e. a state with a defect).
* The neighbourhood should contain only the nearest neighbours of the winning neuron, which may eventually reduce to one or zero neighbouring neurons.


Self-Organising Map-18: Summary of SOM Algorithm

* The basic ingredients of the algorithm are:
  * A continuous input space of activation patterns that are generated in accordance with a certain probability distribution;
  * A topology of the network in the form of a lattice of neurons, which defines a discrete output space;
  * A time-varying neighbourhood that is defined around a winning neuron i(x);
  * A learning rate parameter that starts at an initial value η_0 and then decreases gradually with time n, but never goes to zero.


Self-Organising Map-19: Summary of SOM Algorithm-1

* The operation of the algorithm is summarised as follows:
  1. Initialisation: Choose random values for the initial weight vectors w_j(0). The weight vectors must be different for all neurons. Usually we keep the magnitude of the weights small.
  2. Sampling: Draw a sample x from the input space with a certain probability; the vector x represents the activation pattern that is applied to the lattice. The dimension of x is equal to m.
  3. Similarity Matching: Find the best-matching (winning) neuron i(x) at time step n by using the minimum Euclidean distance criterion:


Self-Organising Map-20: Summary of SOM Algorithm-2

    i(x) = arg min_j ||x - w_j||,  j = 1, 2, ..., l

  4. Updating: Adjust the synaptic weight vectors of all neurons by using the update formula:

    w_j(n+1) = w_j(n) + η(n) h_{j,i(x)}(n) (x(n) - w_j(n))

    where η(n) is the learning rate and h_{j,i(x)}(n) is the neighbourhood function around the winner neuron i(x); both η(n) and h_{j,i(x)}(n) are varied dynamically for best results.
  5. Continuation: Continue with step 2 until no noticeable changes in the feature map are observed.
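Putting the five steps together, here is a compact sketch of the whole procedure in Python/NumPy (illustrative code under the parameter choices discussed earlier, not a reference implementation from the lecture):

    import numpy as np

    def train_som(data, rows=10, cols=10, n_iter=2000, eta0=0.1, tau2=1000.0, seed=0):
        rng = np.random.default_rng(seed)
        m = data.shape[1]
        # 1. Initialisation: small random weights, different for every neuron.
        weights = 0.01 * rng.standard_normal((rows, cols, m))
        sigma0 = max(rows, cols) / 2.0            # assumption: "radius" of the lattice
        tau1 = 1000.0 / np.log(sigma0)
        r, c = np.indices((rows, cols))           # lattice coordinates
        for n in range(n_iter):
            # 2. Sampling: draw a random activation pattern x.
            x = data[rng.integers(len(data))]
            # 3. Similarity matching: winning neuron i(x) by minimum Euclidean distance.
            dists = np.linalg.norm(weights - x, axis=2)
            wi, wj = np.unravel_index(np.argmin(dists), dists.shape)
            # 4. Updating: Gaussian neighbourhood and decaying learning rate.
            sigma = sigma0 * np.exp(-n / tau1)
            eta = eta0 * np.exp(-n / tau2)
            h = np.exp(-((r - wi) ** 2 + (c - wj) ** 2) / (2.0 * sigma ** 2))
            weights += eta * h[:, :, None] * (x - weights)
            # 5. Continuation: here we simply run for a fixed number of iterations
            #    rather than monitoring changes in the feature map.
        return weights

    # Toy usage: map random 3-dimensional vectors onto a 10x10 lattice.
    data = np.random.default_rng(1).random((500, 3))
    som = train_som(data)
    print(som.shape)   # (10, 10, 3)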


    Properties

* Here we summarise some useful properties of the SOM model:
* Pr1 - Approximation of the Input Space: The feature map Φ, represented by the set of synaptic weight vectors {w_j} in the output space A, provides a good approximation to the input space X.
* Pr2 - Topological Ordering: The feature map Φ computed by the SOM algorithm is topologically ordered, in the sense that the spatial location of a neuron in the lattice corresponds to a particular domain or feature of the input patterns.
* Pr3 - Density Matching: The feature map Φ reflects variations in the statistics of the input distribution:


Properties-1

* Pr3 (continued): Regions in the input space X from which sample vectors x are drawn with a high probability of occurrence are mapped onto larger domains of the output space A, and therefore with better resolution than regions in X from which sample vectors x are drawn with a low probability of occurrence.
* Pr4 - Feature Selection: Given data from an input space with a nonlinear distribution, the self-organising map is able to select a set of best features for approximating the underlying distribution.


    Examples

* We present two examples in order to demonstrate the use of the SOM model:
  * Colour Clustering;
  * Semantic Maps.
* Colour Clustering: In the first example a number of images is given which contain a set of colours found in a natural scene. We seek to cluster the colours found in the various images.
* We select a network with three input neurons (representing the RGB values of a single pixel) and an output 2D layer consisting of 40x40 neurons arranged in a square lattice. We use 4M pixels to train the network.


Examples-1

* We use a fixed learning rate of η = 1.0e-4 and 1000 epochs. About 200 images were used in order to extract the pixel values for training.
* Some of the original images, and unsuccessful & successful colour maps, are shown below:


Examples-2

* Semantic Maps: A useful method of visualisation of the SOM structure achieved at the end of training assigns class labels in a 2D lattice depending on how each test pattern (not seen before) excites a particular neuron.
* The neurons in the lattice are partitioned into a number of coherent regions, coherent in the sense that each grouping of neurons represents a distinct set of contiguous symbols or labels.
* An example is shown below, where we assume that we have trained the map for 16 different animals.
* We use a lattice of 10x10 output neurons.
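As a rough sketch of how such a semantic map can be read out after training (illustrative only; the attribute vectors and labels below are stand-ins, not the lecture's animal data set), each labelled test pattern is assigned to the lattice position of its best-matching neuron:

    import numpy as np

    def label_map(weights, patterns, labels):
        # For every labelled test pattern, find its best-matching neuron and
        # write the label at that lattice position; ties simply overwrite.
        rows, cols, _ = weights.shape
        grid = np.full((rows, cols), "", dtype=object)
        for x, lab in zip(patterns, labels):
            d = np.linalg.norm(weights - x, axis=2)
            i, j = np.unravel_index(np.argmin(d), d.shape)
            grid[i, j] = lab
        return grid

    rng = np.random.default_rng(0)
    weights = rng.random((10, 10, 4))               # a trained 10x10 map (stand-in)
    patterns = rng.random((16, 4))                  # 16 attribute vectors (stand-in)
    labels = [f"animal_{k}" for k in range(16)]     # hypothetical labels
    print(label_map(weights, patterns, labels))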


Examples-3

* We observe that there are three distinct clusters of animals: "birds", "peaceful species" and "hunters".


LVQ

* Vector Quantisation is a technique that exploits the underlying structure of input vectors for the purpose of data compression.
* An input space is divided into a number of distinct regions, and for each region a reconstruction (representative) vector is defined.
* When the quantiser is presented with a new input vector, the region in which the vector lies is first determined, and the vector is then represented by the reproduction vector for this region.
* The collection of all possible reproduction vectors is called the code book of the quantiser, and its members are called code words.
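A minimal sketch of this encode step (illustrative, not from the lecture): the code book is an array of reproduction vectors, and quantising an input vector means replacing it by the nearest code word; in practice only the code-word index needs to be stored or transmitted.

    import numpy as np

    def quantise(x, codebook):
        # Find the region the input falls into (nearest code word) and
        # represent x by that reproduction vector.
        idx = np.argmin(np.linalg.norm(codebook - x, axis=1))
        return idx, codebook[idx]

    rng = np.random.default_rng(0)
    codebook = rng.random((4, 2))        # four code words in a 2-D input space
    x = rng.random(2)
    idx, code_word = quantise(x, codebook)
    print(idx, code_word)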


LVQ-1

* A vector quantiser with minimum encoding distortion is called a Voronoi or nearest-neighbour quantiser, since the Voronoi cells about a set of points in an input space correspond to a partition of that space according to the nearest-neighbour rule, based on the Euclidean metric.
* An example with an input space divided into four cells and their associated Voronoi vectors is shown below:


LVQ-2

* The SOM algorithm provides an approximate method for computing the Voronoi vectors in an unsupervised manner, with the approximation being specified by the weight vectors of the neurons in the feature map.


LVQ-3

* Computation of the feature map can be viewed as the first of two stages for adaptively solving a pattern classification problem, as shown below. The second stage is provided by learning vector quantisation, which provides a method for fine tuning of a feature map.


LVQ-4

* Learning vector quantisation (LVQ) is a supervised learning technique that uses class information to move the Voronoi vectors slightly, so as to improve the quality of the classifier decision regions.
* An input vector x is picked at random from the input space. If the class labels of the input vector and a Voronoi vector w agree, the Voronoi vector is moved in the direction of the input vector x. If, on the other hand, the class labels of the input vector and the Voronoi vector disagree, the Voronoi vector w is moved away from the input vector x.
* Let us denote by {w_j}, j = 1, ..., l, the set of Voronoi vectors, and let {x_i}, i = 1, ..., N, be the set of input vectors. We assume that N >> l.


LVQ-5

* The LVQ algorithm proceeds as follows:
  i. Suppose that the Voronoi vector w_c is the closest to the input vector x_i. Let C_{w_c} and C_{x_i} denote the class labels associated with w_c and x_i respectively. Then the Voronoi vector w_c is adjusted as follows:
    * If C_{w_c} = C_{x_i}, then

      w_c(n+1) = w_c(n) + a_n [x_i - w_c(n)]

      where 0 < a_n < 1.


LVQ-6

    * If C_{w_c} ≠ C_{x_i}, then

      w_c(n+1) = w_c(n) - a_n [x_i - w_c(n)]

  ii. The other Voronoi vectors are not modified.
* It is desirable for the learning constant a_n to decrease monotonically with time n. For example, a_n could be initially 0.1 and decrease linearly with n.
* After several passes through the input data the Voronoi vectors typically converge, at which point the training is complete.
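A compact sketch of this LVQ rule in Python/NumPy (illustrative; the linearly decreasing a_n follows the suggestion above, while the function name, epoch count and toy data are assumptions):

    import numpy as np

    def lvq(voronoi, v_labels, data, d_labels, n_epochs=20, a0=0.1):
        # voronoi: (l, m) Voronoi/code-book vectors with class labels v_labels.
        # data:    (N, m) labelled training vectors, N >> l.
        voronoi = voronoi.copy()
        n_steps = n_epochs * len(data)
        step = 0
        for _ in range(n_epochs):
            for x, c in zip(data, d_labels):
                a = a0 * (1.0 - step / n_steps)                      # a_n decreases linearly with n
                k = np.argmin(np.linalg.norm(voronoi - x, axis=1))   # closest Voronoi vector w_c
                if v_labels[k] == c:
                    voronoi[k] += a * (x - voronoi[k])   # same class: move towards x_i
                else:
                    voronoi[k] -= a * (x - voronoi[k])   # different class: move away from x_i
                step += 1
        return voronoi

    rng = np.random.default_rng(0)
    protos = lvq(rng.random((4, 2)), np.array([0, 0, 1, 1]),
                 rng.random((200, 2)), rng.integers(0, 2, 200))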


    Conclusions

* The SOM model is neurobiologically motivated and it captures the important features contained in an input space of interest.
* The SOM is also a vector quantiser.
* It supports the form of learning which is called unsupervised, in the sense that no target information is given with the presentation of the input.
* It can be combined with the method of Learning Vector Quantisation in order to provide a combined supervised learning technique for fine-tuning the Voronoi vectors of a suitable partition of the input space.


Conclusions-1

* It is used in multiple applications such as computational neuroscience, finance, language studies, etc.
* It can be visualised with two methods:
  * The first represents the map as an elastic grid of neurons;
  * The second corresponds to the semantic map approach.