Several strategies for simple cells to learn orientation and direction selectivity
Michael Eisele & Kenneth D. Miller
Columbia University
[Figure: ON and OFF receptive-field subregions; illustration by DeAngelis et al. 99]
Orientation and Direction Selectivity
Orientation Selectivity (OS) and Direction Selectivity (DS)
[Figure: space and space-time receptive-field plots contrasting cells with and without OS, and with and without DS]
Lampl et al 01
Priebe & Ferster 05
Selected models
• A simple Hebbian learning rule produces OS (Miller 94), but not DS (Wimbauer et al 97), for unstructured input.
• Nonlinear Hebbian learning rules produce DS, but only for structured input (Feidler et al 97, Blais et al 00).
• More general principles (sparse coding, ICA, blind source separation) can explain the occurrence of OS (Olshausen & Field 96; Bell & Sejnowski 97) and DS (van Hateren & Ruderman 98), if applied to input from natural scenes.
Some OS and DS develop early
(kittens at time of eye opening; Albus & Wolf 84)
(awake ferret P27, before eye opening; Chiu & Weliky 01)
Early spontaneous activity
Ferret P30-32: correlations decay over a few 100 ms and several mm of cortex (Fiser et al 04)
Goal
• Find a rule that robustly produces DS, using only unstructured input.
• Identify the underlying principle.
Blind source separation (BSS)
[Diagram: sources → mixing → sensors → unmixing → unmixed sources]
Blind source separation of random, spontaneous activity
[Diagram: sources → random mixing → sensors → unmixing? more even mixing?]
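As a hedged illustration of the BSS setup sketched above (not part of the original poster), here is a toy blind-source-separation run: two independent long-tailed sources are randomly mixed, the mixtures are whitened, and a one-unit kurtosis-seeking fixed-point iteration (FastICA-style, with nonlinearity u³) recovers one source direction. All names, seeds, and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two independent long-tailed sources, randomly mixed into two sensors.
S = rng.laplace(size=(2, 5000))   # sources
A = rng.normal(size=(2, 2))       # random mixing matrix
X = A @ S                         # sensor signals

# Whiten the sensor signals so their covariance is the identity.
evals, evecs = np.linalg.eigh(np.cov(X))
Z = evecs @ np.diag(evals ** -0.5) @ evecs.T @ X

# One-unit FastICA-style fixed-point iteration with nonlinearity u**3
# (seeks extremal kurtosis, i.e. one independent-component direction).
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(50):
    y = w @ Z
    w = (Z * y**3).mean(axis=1) - 3.0 * w
    w /= np.linalg.norm(w)

# The unmixed signal should match one true source up to sign and scale.
y = w @ Z
corr = [abs(np.corrcoef(y, s)[0, 1]) for s in S]
print(f"max |corr| with a true source: {max(corr):.2f}")
```

With super-Gaussian sources like these the iteration finds a kurtosis maximum; with sub-Gaussian sources the same update seeks a kurtosis minimum instead, which is the even-mixing regime that BSM exploits below.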
Motivation for blind source mixing (BSM)
[Diagram: DS vs. no DS — responses to all positions vs. no response to some positions; role of Hebbian learning]
Combining BSM and Hebbian learning
Δw = η⋅(x⋅y + ε⋅x⋅y³) − λ⋅w
w = weight, Δw = weight change, η = learning rate, x = input, y = output, λ = multiplicative constraint
x⋅y: linear Hebbian term
ε > 0: blind source separation; ε < 0: blind source mixing
Based on a bottom-up approach to blind source separation; see "Independent Component Analysis", Hyvärinen, Karhunen & Oja 2001.
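A minimal simulation sketch of the combined rule (all parameter values — η, ε, λ, input statistics, step counts — are illustrative assumptions, not the poster's settings): a single linear rate-coded neuron driven by long-tailed unstructured input, with ε < 0 placing the rule in the BSM regime.

```python
import numpy as np

# Sketch of the combined rule from the slide:
#   dw = eta * (x*y + eps * x * y**3) - lam * w
# eps > 0 -> blind source separation; eps < 0 -> blind source mixing.
# The -lam*w decay stands in for the multiplicative constraint; when
# eps < 0 the cubic term also keeps the weights bounded.

rng = np.random.default_rng(0)
n_inputs = 20
eta, eps, lam = 0.01, -0.15, 0.01             # eps < 0: BSM regime
w = rng.normal(scale=0.1, size=n_inputs)      # small random initial weights

for _ in range(5000):
    x = rng.standard_t(df=5, size=n_inputs)   # long-tailed, unstructured input
    y = w @ x                                 # linear neuron model
    w += eta * (x * y + eps * x * y**3) - lam * w  # combined update + decay

print(f"|w| after learning: {np.linalg.norm(w):.2f}")
```

In this sketch the weights settle at a bounded norm because the Hebbian growth term is balanced by the cubic mixing term and the decay; the actual receptive-field structure in the poster additionally depends on the factors listed on the next slide.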
Combined learning rule: important factors
• spatial correlations: Mexican hat
• distribution of input amplitudes: long tails
• upper weight limits: none
• temporal input filters: diverse
(4-week-old kittens; Cai et al 97)
Simplifications
• single-neuron learning
• rate-coded
• only feedforward input
• arbor function
• linear neuron model
A few analytical results
• Whitened input ⇒ BSM can perfectly mix sources.
• Gradient principle ⇒ convergence
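The whitening assumption in the first bullet can be made concrete with a short sketch (ZCA whitening; the dimensions and seed are my illustration, not the poster's derivation): after whitening, the input covariance is the identity, so second-order correlations no longer favor any weight direction and only higher-order structure — the target of the cubic term — remains.

```python
import numpy as np

# ZCA whitening: transform correlated inputs so their covariance is the
# identity. Illustrative sketch; dimensions and seed are arbitrary.
rng = np.random.default_rng(1)
X = rng.normal(size=(10000, 5)) @ rng.normal(size=(5, 5))  # correlated inputs

C = np.cov(X, rowvar=False)
evals, evecs = np.linalg.eigh(C)
W_white = evecs @ np.diag(evals ** -0.5) @ evecs.T  # symmetric (ZCA) whitener
Z = X @ W_white

print(np.cov(Z, rowvar=False).round(2))  # prints the 5x5 identity matrix
```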
Dependence on initial conditions
[Figure: preferred orientations of 100 receptive fields; another choice of initial weights yields a rotated map with ON ⇔ OFF exchanged]
Robustness against parameter changes: Δw = η⋅(x⋅y + ε⋅x⋅y³) − λ⋅w
[Figure panels: ε = 0 (Hebb), ε = −0.15, ε = −0.2, ε = −0.25, ε = −0.5]
OS and DS develop robustly under BSM + Hebb
Limitations
• special initial conditions
• input = drifting gratings
• input amplitudes = subgaussian distribution
• large negative ε: BSM dominates
Comparison of response distributions
[Figure: number of responses vs. response amplitude]
Other strategies:
• BSS with structured input
• BSM with subgaussian input
• Hebb with hard upper w-limit
• Hebb with soft upper w-limit
• hybrid with unstructured input
• hybrid with structured input (hybrid = BSS and Hebb with upper weight limit)
Is there any rule that produces OS and DS for both structured and unstructured input?
Linear Hebbian rule + upper weight limit (Miller 94)
Conclusions
• Blind source mixing (BSM) is designed to produce an output that responds evenly to many sources.
• BSM and Hebbian learning can be combined into a simple synaptic learning rule.
• This rule robustly produces OS and DS while the input is unstructured.
known: BSS + Hebb ➧ OS, DS
new: BSM + Hebb ➧ OS, DS
Speculation
external world ➡ internal network ➡ neuron
• unlearn correlations that are produced internally: BSM
• learn correlations that are produced externally: BSS
Unlearning of higher-order correlations; compare Crick & Mitchison 83: unlearning of any-order correlations.
Supported by the Swartz Foundation and the Human Frontiers Science Program.