Hopfield NNets


N. Laskaris

Professor John Hopfield

The Howard A. Prior Professor of Molecular Biology

Dept. of Molecular Biology Computational Neurobiology; Biophysics

Princeton University

The physicist Hopfield showed that models of physical systems could be used to solve computational problems.

Such systems could be implemented in hardware by combining standard components such as capacitors and resistors.

The importance of Hopfield nets in practical applications is limited due to theoretical limitations of the structure, but in some cases they may form interesting models.

Usually employed in binary-logic tasks, e.g. pattern completion and association.

The concept

In the early 1980s, Hopfield published two scientific papers which attracted much interest.

This was the starting point of the new era of neural networks, which continues today.

(1982): 'Neural networks and physical systems with emergent collective computational abilities'. Proceedings of the National Academy of Sciences, 79:2554-2558.

(1984): 'Neurons with graded response have collective computational properties like those of two-state neurons'. Proceedings of the National Academy of Sciences, 81:3088-3092.

‘‘The dynamics of brain computation”

The core question: how is one to understand the incredible effectiveness of a brain in tasks such as recognizing a particular face in a complex scene?

Simple models of the dynamics of neural circuits are described that have collective dynamical properties.

These can be exploited in recognizing sensory patterns.

Using these collective properties in processing information is effective in that it exploits the spontaneous properties of nerve cells and circuits to produce robust computation.

Like all computers, a brain is a dynamical system that carries out its computations by the change of its 'state' with time.

Associative memory, logic and inference, recognizing an odor or a chess position, parsing the world into objects, and generating appropriate sequences of locomotor muscle commands are all describable as computation.

His research focuses on understanding how the neural circuits of the brain produce such powerful and complex computations.

J. Hopfield's quest

While the brain is totally unlike modern computers, much of what it does can be described as computation.

However, olfaction allows remote sensing, and much more complex computations, involving wind direction and fluctuating mixtures of odors, must be described to account for the ability of homing pigeons or slugs to navigate through the use of odors.

Hopfield has been studying how such computations might be performed by the known neural circuitry of the olfactory bulb and prepiriform cortex of mammals, or the analogous circuits of simpler animals.

Olfaction

The simplest problem in olfaction is identifying a known odor.

Any computer does its computation by its changes in internal state.

In neurobiology, the change of the potentials of neurons (and changes in the strengths of the synapses) with time is what performs the computations.

Dynamical systems

Systems of differential equations can represent these aspects of neurobiology.

He seeks to understand some aspects of neurobiological computation through studying the behavior of equations modeling the time-evolution of neural activity.

Action potential computation

For much of neurobiology, information is represented by the paradigm of 'firing rates', i.e. information is represented by the rate of generation of action-potential spikes, and the exact timing of these spikes is unimportant.

Action potential computation

Since action potentials last only about a millisecond, the use of action-potential timing seems a powerful potential means of neural computation.

Action potential computation

There are cases, for example the binaural auditory determination of the location of a sound source, where information is encoded in the timing of action potentials.

Identifying words in natural speech is a difficult computational task which brains can do easily.

His group uses this task as a test-bed for thinking about the computational abilities of neural networks and neuromorphic ideas.

Speech

Simple (e.g. binary-logic) neurons are coupled in a system with recurrent signal flow.

A 2-neuron Hopfield network with continuous states, characterized by 2 stable states.

1st Example

(Figure: contour plot)

A 3-neuron Hopfield network with 2^3 = 8 states, characterized by 2 stable states.

2nd Example

Wij = Wji

The behavior of such a dynamical system is fully determined by the synaptic weights, and can be thought of as an energy-minimization process.

3rd Example

Hopfield nets are fully connected, symmetrically weighted networks that extended the ideas of linear associative memories by adding cyclic connections.

Note: no self-feedback!

Regarding training a Hopfield net as a content-addressable memory, the outer-product rule for storing patterns is used.

After the 'teaching stage', in which the weights are defined, the initial state of the network is set (the input pattern) and a simple recurrent rule is iterated till convergence to a stable state (the output pattern).
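A minimal NumPy sketch of this teach-then-iterate procedure (an illustration, not the original class code; patterns are bipolar ±1 vectors and the function names are ours):

    import numpy as np

    def train_hopfield(patterns):
        """Outer-product (Hebbian) rule: W = sum over patterns of x x^T."""
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for x in patterns:
            W += np.outer(x, x)      # 'teaching stage': accumulate outer products
        np.fill_diagonal(W, 0)       # no self-feedback
        return W

    def recall(W, y, max_iters=100):
        """Iterate y <- sgn(W y) until stable; zero input leaves a neuron as-is."""
        y = y.copy()
        for _ in range(max_iters):
            s = W @ y
            y_new = np.where(s > 0, 1, np.where(s < 0, -1, y))
            if np.array_equal(y_new, y):
                break                # converged to a stable state (output pattern)
            y = y_new
        return y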

Operation of the network

There are two main modes of operation:

Synchronous vs. Asynchronous updating

(Figure: Hebbian learning defines the weights; a probe pattern sets the initial state; the dynamical evolution yields the output.)

A Simple Example

Step_1. Design a network with memorized patterns (vectors) [ 1, -1, 1 ] & [ -1, 1, -1 ]

There are 8 different states that can be reached by the net, and any of them can serve as its initial state.

(The three neurons' outputs are y1, y2, y3.)

Step_2. Initialization

Step_3. Iterate till convergence - Synchronous Updating -

Three different examples of the net's flow; in each case it converges immediately.

Schematic diagram of all the dynamical trajectories that correspond to the designed net; the stored patterns are the stable states.

Or, Step_3. Iterate till convergence - Asynchronous Updating -

Each time, select one neuron at random and update its state with the previous rule, under the usual convention that if the total input to that neuron is 0 its state remains unchanged.
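A sketch of the two update modes on this example's two memorized patterns (variable names are illustrative; the zero-input convention is implemented explicitly):

    import numpy as np

    rng = np.random.default_rng(0)

    patterns = np.array([[1, -1, 1],
                         [-1, 1, -1]])            # the two memorized patterns
    W = sum(np.outer(x, x) for x in patterns).astype(float)
    np.fill_diagonal(W, 0)                        # no self-feedback

    def step_sync(y):
        """Synchronous updating: all neurons change together."""
        s = W @ y
        return np.where(s > 0, 1, np.where(s < 0, -1, y))

    def step_async(y):
        """Asynchronous updating: one randomly selected neuron changes."""
        i = rng.integers(len(y))
        s = W[i] @ y
        if s != 0:                                # total input 0 -> state unchanged
            y = y.copy()
            y[i] = 1 if s > 0 else -1
        return y

    print(step_sync(np.array([1, 1, 1])))         # -> [ 1 -1  1]: immediate convergence

    y = np.array([-1, -1, -1])
    for _ in range(20):
        y = step_async(y)
    print(y)                                      # settles into the attractor [-1  1 -1]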

Explanation of the convergence

There is an energy function associated with each state of the Hopfield network:

E([y1, y2, …, yn]T) = - Σi Σj wij yi yj

where [y1, y2, …, yn]T is the vector of the neurons' outputs, wij is the weight from neuron j to neuron i, and the double sum runs over i and j.

The corresponding dynamical system evolves toward states of lower Energy

States of lowest energy correspond to attractors of the Hopfield-net dynamics (attractor states).
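A short numerical check of this claim (a sketch on the same 3-neuron example): along any asynchronous trajectory the energy never increases, so the net must end in a local energy minimum, i.e. an attractor.

    import numpy as np

    rng = np.random.default_rng(1)
    patterns = np.array([[1, -1, 1], [-1, 1, -1]])
    W = sum(np.outer(x, x) for x in patterns).astype(float)
    np.fill_diagonal(W, 0)

    def energy(y):
        """E(y) = - sum_i sum_j w_ij y_i y_j, as defined above."""
        return -float(y @ W @ y)

    y = np.array([1, 1, -1])
    E_prev = energy(y)
    for _ in range(50):                  # random asynchronous updates
        i = rng.integers(len(y))
        s = W[i] @ y
        if s != 0:
            y[i] = 1 if s > 0 else -1
        E_now = energy(y)
        assert E_now <= E_prev           # the energy is non-increasing
        E_prev = E_now
    print(y, energy(y))                  # a stable state of (locally) minimal energy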

In short, while training the net (via the outer-product rule) we store patterns by placing different attractors in the state-space of the system. While operating, the net searches for the closest attractor; when this is found, the corresponding pattern of activation is output.

Capacity of the Hopfield memory

How many patterns can we store in a Hopfield net?

About 0.15·N, where N is the number of neurons.
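A quick empirical test of this classical ~0.15 N estimate (a sketch with random patterns; the exact onset of failure varies from run to run):

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100                                    # number of neurons

    def fraction_stable(P):
        """Store P random bipolar patterns; return the fraction that are fixed points."""
        X = rng.choice([-1, 1], size=(P, N))
        W = (X.T @ X).astype(float)            # outer-product rule summed over patterns
        np.fill_diagonal(W, 0)
        S = np.sign(X @ W)                     # one synchronous step from each pattern
        return np.mean(np.all(S == X, axis=1))

    for P in (5, 10, 15, 20, 30):
        print(P, fraction_stable(P))           # recall degrades beyond P ~ 0.15 N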

A simple Pattern Recognition Example

Computer Experimentation

Class-project

Stored Patterns (binary images)

Perfect Recall - Image Restoration

Erroneous Recall

Irrelevant results

Note: explain the 'negatives': since the energy satisfies E(-y) = E(y), the negative of every stored pattern is also an attractor.
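A sketch of this effect (random ±1 'images' stand in for the class-project pictures; names are illustrative):

    import numpy as np

    rng = np.random.default_rng(2)
    N, P = 64, 3                                  # e.g. 8x8 binary images, 3 stored
    X = rng.choice([-1, 1], size=(P, N))
    W = (X.T @ X).astype(float)
    np.fill_diagonal(W, 0)

    def recall(y, iters=20):
        for _ in range(iters):
            s = W @ y
            y = np.where(s > 0, 1, np.where(s < 0, -1, y))
        return y

    probe = X[0].copy()
    flip = rng.choice(N, size=40, replace=False)  # corrupt more than half the pixels
    probe[flip] *= -1
    out = recall(probe)
    print(np.array_equal(out, X[0]), np.array_equal(out, -X[0]))
    # A heavily corrupted probe is closer to the inverse image, so the net
    # often converges to the 'negative' attractor -X[0] instead of X[0].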

The continuous Hopfield-Net as optimization machinery

‘Simple "Neural" Optimization Networks: An A/D Converter, Signal Decision Circuit, and a Linear Programming Circuit’

[ Tank and Hopfield ; IEEE Trans. Circuits Syst. 1986; 33: 533-541.]:

Hopfield modified his network so as to work with continuous activations and, by adopting a dynamical-systems approach, showed that the resulting system is characterized by a Lyapunov function, which he termed the 'Computational Energy' and which can be used to tailor the net for specific optimizations.

The system of coupled differential equations describing the operation of the continuous Hopfield net:

dui/dt = - ui/τ + Σj Tij g(uj) + Ii

with the graded response g(u) = ½ (1 + tanh(gain · u)).

The Computational Energy:

E = - ½ Σi Σj Tij g(ui) g(uj) - Σi Ii g(ui)

or, writing Vi = g(ui),

E = - ½ Σi Σj Tij Vi Vj - Σi Ii Vi , with Tij = Tji and Tii = 0.

Weights: Wij ≡ Tij

Biases: Ii

Neuronal outputs: Yi ≡ Vi
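A sketch of how this system can be simulated (simple Euler integration; gain, τ, step size and step count are illustrative choices, not values from the paper):

    import numpy as np

    def run_continuous_net(T, I, u0, gain=5.0, tau=1.0, dt=0.01, steps=5000):
        """Euler integration of du_i/dt = -u_i/tau + sum_j T_ij g(u_j) + I_i."""
        def g(u):
            return 0.5 * (1.0 + np.tanh(gain * u))   # graded response in (0, 1)
        u = np.asarray(u0, dtype=float).copy()
        for _ in range(steps):
            u += dt * (-u / tau + T @ g(u) + I)
        return g(u)                                  # neuronal outputs V_i = g(u_i)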

When Hopfield nets are used for function optimization, the objective function F to be minimized is written as an energy function in the form of the computational energy E.

The comparison between E and F leads to the design of the network that can solve the problem, i.e. the definition of its links and biases.

The actual advantage of doing this is that the Hopfield net has a direct hardware implementation, which enables even a VLSI integration of the algorithm performing the optimization task.

An example: 'Dominant-Mode Clustering'

Given a set of N vectors {Xi}, define the k among them that form the most compact cluster {Zi}.

Introduce indicators {ui}: ui = 1 if Xi ∈ {Z}, ui = 0 if Xi ∉ {Z}, with Σi ui = k.

The objective function to be minimized is

F({ui}) = Σi Σj ui uj ||Xi - Xj||²   (both sums running from 1 to N)

The objective function F can be written easily in the form of computational energy E

F = - ½ Σi Σj Tij Vi Vj - Σi Ii Vi

with Tij = Tji = -2 D(i,j) = -2 ||Xi - Xj||² if i ≠ j, Tii = 0, and Ii = 0.

With each pattern Xi we associate a neuron in the Hopfield network (i.e. #neurons = N).

The synaptic weights are the pairwise distances (times 2).

If a neuron's activation is '1' when the net converges, the corresponding pattern is included in the cluster.

There is an additional constraint so that exactly k neurons are 'on'; a design sketch follows.
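A sketch of the resulting network design. The slides leave the form of the k-neurons-on constraint open; here it is added, as one common choice (our assumption), as a quadratic penalty c·(Σi Vi - k)², which for binary Vi contributes -2c to every off-diagonal Tij and c(2k-1) to every bias:

    import numpy as np

    def clustering_net(X, k, c=None):
        """Design T and I for dominant-mode clustering.

        F = sum_ij u_i u_j ||X_i - X_j||^2 gives T_ij = -2 ||X_i - X_j||^2 (i != j)
        and I_i = 0; the optional penalty c*(sum_i V_i - k)^2 enforcing 'exactly
        k neurons on' is one common choice, assumed here.
        """
        N = len(X)
        D = np.array([[np.sum((xi - xj) ** 2) for xj in X] for xi in X])
        T = -2.0 * D                         # pairwise squared distances (times 2)
        I = np.zeros(N)
        if c is not None:
            T -= 2.0 * c * (1 - np.eye(N))   # penalty's pairwise term (i != j)
            I += c * (2 * k - 1)             # penalty's linear term
        np.fill_diagonal(T, 0)               # T_ii = 0
        return T, I

    # e.g., with run_continuous_net from the earlier sketch:
    # T, I = clustering_net(X, k=5, c=10.0)
    # V = run_continuous_net(T, I, u0=0.01 * np.ones(len(X)))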

A classical example: ‘The Travelling Salesman Problem’

Coding a possible route as a combination of neurons’ firings

The principle: a route is scored by its total length. For example, the route 5 → 3 → 4 → 1 → 2 → 5 has length

|5-3| + |3-4| + |4-1| + |1-2| + |2-5|

where |x-y| denotes the distance between cities x and y.

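A sketch of the classical Hopfield-Tank-style computational energy for the TSP. V[x, i] = 1 means city x is visited at tour position i; three penalty terms enforce a valid permutation and the last term is the tour length. The weights A, B, C, D are illustrative placeholders (the slides give no values), and tuning them is the delicate part in practice:

    import numpy as np

    def tsp_energy(V, d, A=500.0, B=500.0, C=200.0, D=1.0):
        """Energy of a firing configuration V (cities x positions) for distances d."""
        n = V.shape[0]
        rows = V.sum(axis=1)                                      # firings per city
        cols = V.sum(axis=0)                                      # firings per position
        e1 = A / 2 * np.sum(rows ** 2 - np.sum(V ** 2, axis=1))   # one position per city
        e2 = B / 2 * np.sum(cols ** 2 - np.sum(V ** 2, axis=0))   # one city per position
        e3 = C / 2 * (V.sum() - n) ** 2                           # exactly n neurons on
        Vn = np.roll(V, -1, axis=1) + np.roll(V, 1, axis=1)       # tour neighbours
        e4 = D / 2 * np.sum(d * (V @ Vn.T))                       # tour-length term
        return e1 + e2 + e3 + e4

    n = 5
    d = np.abs(np.subtract.outer(np.arange(n), np.arange(n))).astype(float)
    V = np.eye(n)                   # a valid tour: 1 -> 2 -> 3 -> 4 -> 5 -> 1
    print(tsp_energy(V, d))         # constraints vanish; only D * (tour length) remains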

An example from clinical Encephalography

The solution: 'Hopfield Neural Nets for monitoring Evoked Potential Signals', N. Laskaris et al. [Electroenc. Clin. Neuroph. 1997; 104(2)]

The Boltzmann Machine

Improving Hopfield nets by simulated annealing and by adopting more complex topologies

(430 - 355) B.C.

'So let me close here . . . .

. . . . . . . . . . . . . . someone else, perhaps, will complete what I could not finish'

- Themistogenes of Syracuse, 1st year of the 105th Olympiad

GREEK (1979-1982)

Hopfield nets, PNAS (1982)

'The kids in the stands are your only Hope ....'

A Very Last Comment on Brain-Mind-Intelligence-Life-Happiness

How I Became Stupid by

Martin Page

Penguin Books, 2004, 160 pp. ISBN: 0-14-200495-2

In HOW I BECAME STUPID, the 25-year-old Antoine concludes that 'to think is to suffer', a twist on the familiar assertion of Descartes.

For Antoine, intelligence is the source of unhappiness. He embarks on a series of hilarious strategies to make himself stupid and possibly happy.

Animals that Abandon their Brains

Dr. Jun Aruga, Laboratory for Comparative Neurogenesis

A 'primitive but successful' animal: Oxycomanthus japonicus

There is astonishing diversity in the nervous systems of animals, and the variation between species is remarkable.

From the basic, distributed nervous systems of jellyfish and sea anemones to the centralized neural networks of squid and octopuses to the complex brain structures at the terminal end of the neural tube in vertebrates, the variation across species is humbling

People may claim that 'more advanced' species like humans are the result of an increasingly centralized nervous system produced through evolution. This claim of advancement through evolution is common but misleading: it suggests that evolution always moves in one direction, the advancement of species by increasing complexity.

Evolution may selectively enable body structures that are more enhanced and complicated, but it may just as easily enable species that have abandoned complex adaptations in favour of simplification. Brains, too, have evolved in the same way. While the brains of some species, including humans, developed to allow them to thrive, others have abandoned their brains because they are no longer necessary.

For example, the ascidian, or sea squirt, which lives in shallow coastal waters and is a staple food in certain regions, has a vertebrate-like neural structure with a neural tube and notochord in its larval stage.

As the larva becomes an adult, however, these features disappear until only very basic ganglia remain.

In evolutionary terms this animal is a 'winner' because it develops a very simplified neural system better adapted to a stationary life in seawater.

In the long run, however, evolutionary success will be determined by what species survives longer:

humans with their complex brains (and their weapons) or the brainless Dicyemida

1948-1990

Great-grandson of Zorbas and nephew of Elli Alexiou.

Born in Athens.

He began his career in 1970 in Thessaloniki with the duo 'Damon and Fintias'. In 1976 he founded the band 'Spyridoula'.

Is our thought our master or our servant?

Emotional Intelligence

Also called EI or EQ, it describes an ability, capacity, or skill to perceive, assess, and manage the emotions of one's self, of others, and of groups.

Poetic intelligence may be absent from the know-it-alls, and yet it may dwell within the simplest person.

Class-project Oral-Exams

Oral-Exam Appointments

Time      | 31 May             | 5 June              | 7 June
1st hour  | 794, 845, 893, 899 | 711, 809, 874, 909  | 627, 887, 946, 960
2nd hour  | 915, 920, 932, 949 | 923, 950, 979, 1024 | 962, 980, 995, 1202
3rd hour  | 1023               | 1227                | 1223

(AEM numbers of the examinees in each slot.)

Further Inquiries