
Kinetic Theory for the Dynamicsof Fluctuation-Driven Neural Systems

David W. McLaughlin

Courant Institute & Center for Neural ScienceNew York University

http://www.cims.nyu.edu/faculty/dmac/

Toledo – June ‘06

Happy Birthday, Peter & Louis

Kinetic Theory for the Dynamicsof Fluctuation-Driven Neural Systems

In collaboration with:

David Cai

Louis Tao

Michael Shelley

Aaditya Rangan

Visual Pathway: Retina --> LGN --> V1 --> Beyond

Integrate and Fire Representation

$\tau\,\partial_t v = -(v - V_R) - g\,(v - V_E)$

$\sigma\,\partial_t g = -g + \sum_l f\,\delta(t - t_l) + (S^a/N)\sum_{l,k}\delta(t - t_l^k)$

plus spike firing and reset: $v(t_k) = 1$; $v(t = t_k^+) = 0$
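For readers who want to see these equations in action, here is a minimal forward-Euler sketch of a single conductance-based integrate-and-fire neuron driven only by the external Poisson train (the network term is dropped); all parameter values are illustrative assumptions, not those used in the talk.

```python
import numpy as np

# Minimal sketch of one conductance-based integrate-and-fire neuron:
#   tau * dv/dt = -(v - V_R) - g * (v - V_E)
#   sigma * dg/dt = -g + f * sum_l delta(t - t_l)   (external Poisson train only)
# with threshold v = 1 and reset to v = 0.  Parameter values are illustrative assumptions.
rng = np.random.default_rng(0)

tau, sigma = 0.020, 0.004        # membrane and AMPA time scales (s), assumed
V_R, V_E = 0.0, 14.0 / 3.0       # reset and excitatory reversal, normalized units, assumed
nu0, f = 600.0, 0.0005           # Poisson input rate (Hz) and synaptic strength, assumed
dt, T = 1e-4, 2.0                # time step and duration (s)

v, g = 0.0, 0.0
spike_times = []
for step in range(int(T / dt)):
    # each Poisson delta of weight f kicks g up by f / sigma
    g += rng.poisson(nu0 * dt) * f / sigma
    # forward-Euler update of the two ODEs between kicks
    v += dt / tau * (-(v - V_R) - g * (v - V_E))
    g += dt / sigma * (-g)
    if v >= 1.0:                 # spike threshold crossed: record and reset
        spike_times.append(step * dt)
        v = 0.0

print(f"firing rate ~ {len(spike_times) / T:.1f} Hz")
```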

Nonlinearity from spike-threshold: whenever $V(x,t) = 1$, the neuron “fires”, the spike time is recorded, and $V(x,t)$ is reset to 0.

The “primary visual cortex (V1)” is a “layered structure”, with O(10,000) neurons per square mm, per layer.

O(10^4) neurons per mm^2

Map of Orientation Preference

With both regular & random patterns of neurons’ preferences

Lateral Connections and Orientation -- Tree Shrew. Bosking, Zhang, Schofield & Fitzpatrick,

J. Neuroscience, 1997

Line-Motion-Illusion

LMI

Coarse-Grained Asymptotic Representations

Needed for “Scale-up”

• Larger lateral area
• Multiple layers

First, tile the cortical layer with coarse-grained (CG) patches

Coarse-Grained Reductions for V1

Average firing rate models [Cowan & Wilson (’72); ….; Shelley & McLaughlin (’02)]

Average firing rate of an excitatory (inhibitory) neuron, within the coarse-grained patch located at x in the cortical layer:

$m^\alpha(x,t)$, $\alpha = E, I$
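As a concrete, hypothetical illustration of what such a firing-rate description looks like, here is a minimal sketch of a generic Wilson–Cowan-style rate equation for a single excitatory CG patch; the sigmoidal gain and all parameters are assumptions for illustration, not the models of the cited references.

```python
import numpy as np

# Minimal sketch of a generic rate model for one excitatory CG patch:
#   tau_m * dm/dt = -m + F(w_EE * m + drive),  with a sigmoidal gain F.
# Form and parameter values are illustrative assumptions, not the cited models.
def F(x, gain=4.0, thresh=1.0):
    """Sigmoidal gain mapping net input to a normalized firing rate."""
    return 1.0 / (1.0 + np.exp(-gain * (x - thresh)))

tau_m, dt, T = 0.020, 1e-3, 0.5        # rate time constant, step, duration (s)
w_EE, drive = 1.2, 0.8                 # recurrent excitation and external drive, assumed

m = 0.0
for step in range(int(T / dt)):
    m += dt / tau_m * (-m + F(w_EE * m + drive))

print("steady-state normalized rate:", round(m, 3))
```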

Cortical networks have a very “noisy” dynamics

• Strong temporal fluctuations
• On synaptic timescale
• Fluctuation-driven spiking

Experimental Observation: Fluctuations in Orientation Tuning (cat data from Ferster’s lab)

Ref: Anderson, Lampl, Gillespie & Ferster, Science, pp. 1968-72 (2000)

Fluctuation-driven spiking

Solid: average (over 72 cycles)

Dashed: 10 temporal trajectories

(very noisy dynamics, on the synaptic time scale)

• To accurately and efficiently describe these networks requires that fluctuations be retained in a coarse-grained representation.

• “PDF” representations – $\rho^\alpha(v,g;\,x,t)$, $\alpha = E, I$ – will retain fluctuations.

• But will not be very efficient numerically

• Needed – a reduction of the PDF representations which retains 1. means & 2. variances

• Kinetic Theory provides this representation. Ref: Cai, Tao, Shelley & McLaughlin, PNAS, pp. 7757-7762 (2004)

Kinetic Theory begins from

PDF representations $\rho^\alpha(v,g;\,x,t)$, $\alpha = E, I$

• Knight & Sirovich
• Nykamp & Tranchina, Neural Comp (2001)
• Haskell, Nykamp & Tranchina, Network (2001)

• For convenience of presentation, I’ll sketch the derivation for a single CG patch, with 200 excitatory Integrate & Fire neurons

• First, replace the 200 neurons in this CG cell by an equivalent pdf representation

• Then derive kinetic theory from the PDF representation

• The results extend to interacting CG cells which include inhibition – as well as different cell types such as “simple” & “complex” cells.

• N excitatory neurons (within one CG cell)
• Random coupling throughout the CG cell
• AMPA synapses (with a short time scale $\sigma$)

$\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$

$\sigma\,\partial_t g_i = -g_i + \sum_l f\,\delta(t - t_l) + (S^a/N)\sum_{l,k}\delta(t - t_l^k)$

plus spike firing and reset: $v_i(t_i^k) = 1$; $v_i(t = t_i^{k+}) = 0$

• N excitatory neurons (within one CG cell)
• Random coupling throughout the CG cell
• AMPA synapses (with time scale $\sigma$)

$\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$

$\sigma\,\partial_t g_i = -g_i + \sum_l f\,\delta(t - t_l) + (S^a/N)\sum_{l,k}\delta(t - t_l^k)$

$\rho(g,v,t) \equiv N^{-1}\sum_{i=1}^{N} \mathrm{E}\{\delta[v - v_i(t)]\,\delta[g - g_i(t)]\}$,

Expectation “E” over the Poisson spike trains $\{t_l\}$
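A hypothetical numerical sketch of this construction: simulate the N coupled neurons over repeated realizations of the external Poisson trains and histogram the $(v_i, g_i)$ pairs at a fixed time; the coupling rule and all parameter values below are illustrative assumptions.

```python
import numpy as np

# Sketch (illustrative parameters): estimate rho(g, v, t) at a fixed time t = T by
# histogramming (v_i, g_i) over the N neurons and over independent realizations
# of the external Poisson trains.
rng = np.random.default_rng(1)

N = 200                              # excitatory neurons in the CG cell
tau, sigma = 0.020, 0.004            # time scales (s), assumed
V_R, V_E = 0.0, 14.0 / 3.0           # normalized reversal potentials, assumed
nu0, f, Sa = 600.0, 0.0005, 0.05     # Poisson rate, external and cortical strengths, assumed
dt, T, trials = 1e-4, 0.2, 50

v_all, g_all = [], []
for trial in range(trials):
    v = np.zeros(N)
    g = np.zeros(N)
    for step in range(int(T / dt)):
        g += rng.poisson(nu0 * dt, size=N) * f / sigma   # independent external trains
        v += dt / tau * (-(v - V_R) - g * (v - V_E))
        g += dt / sigma * (-g)
        fired = v >= 1.0
        if fired.any():
            v[fired] = 0.0
            g += fired.sum() * (Sa / N) / sigma          # all-to-all cortical kicks
    v_all.append(v.copy())
    g_all.append(g.copy())

v_all = np.concatenate(v_all)
g_all = np.concatenate(g_all)
rho, v_edges, g_edges = np.histogram2d(v_all, g_all, bins=40, density=True)
print("empirical rho(v, g) at t = T, grid shape:", rho.shape)
```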


Evolution of the PDF $\rho(g,v,t)$, assuming: (i) $N \gg 1$; (ii) the total input to each neuron is a (modulated) Poisson spike train.

$\partial_t\rho = \tau^{-1}\partial_v\{[(v - V_R) + g(v - V_E)]\,\rho\} + \partial_g\{(g/\sigma)\,\rho\} + \nu_0(t)\,[\rho(v,\,g - f/\sigma,\,t) - \rho(v,g,t)]$

$\qquad +\ N\,m(t)\,[\rho(v,\,g - S^a/(N\sigma),\,t) - \rho(v,g,t)]$,

$\nu_0(t)$ = modulated rate of the incoming Poisson spike train;

$m(t)$ = average firing rate of the neurons in the CG cell $= \int J^{(v)}(v,g;\rho)\big|_{v=1}\,dg$,

and where $J^{(v)}(v,g;\rho) = -\tau^{-1}\{[(v - V_R) + g(v - V_E)]\,\rho\}$

$\partial_t\rho = \tau^{-1}\partial_v\{[(v - V_R) + g(v - V_E)]\,\rho\} + \partial_g\{(g/\sigma)\,\rho\} + \nu_0(t)\,[\rho(v,\,g - f/\sigma,\,t) - \rho(v,g,t)]$

$\qquad +\ N\,m(t)\,[\rho(v,\,g - S^a/(N\sigma),\,t) - \rho(v,g,t)]$,

$N \gg 1$; $f \ll 1$; $\nu_0 f = O(1)$;

$\partial_t\rho = \tau^{-1}\partial_v\{[(v - V_R) + g(v - V_E)]\,\rho\} + \partial_g\{([g - G(t)]/\sigma)\,\rho\} + (\sigma_g^2/\sigma)\,\partial_{gg}\rho + \dots$

where $\sigma_g^2 = \nu_0(t)\,f^2/(2\sigma) + m(t)\,(S^a)^2/(2N\sigma)$

$G(t) = \nu_0(t)\,f + m(t)\,S^a$
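The step from the jump (master) equation to this drift-diffusion form is a second-order Taylor expansion of the jump terms; it is implicit on the slides, so it is written out here:

$\nu_0(t)\,[\rho(v,\,g - f/\sigma,\,t) - \rho(v,g,t)] \approx -\dfrac{\nu_0 f}{\sigma}\,\partial_g\rho + \dfrac{\nu_0 f^2}{2\sigma^2}\,\partial_{gg}\rho$

$N m(t)\,[\rho(v,\,g - S^a/(N\sigma),\,t) - \rho(v,g,t)] \approx -\dfrac{m S^a}{\sigma}\,\partial_g\rho + \dfrac{m (S^a)^2}{2N\sigma^2}\,\partial_{gg}\rho$

Collecting the first-order terms with $\partial_g\{(g/\sigma)\,\rho\}$ gives the drift $\partial_g\{([g - G(t)]/\sigma)\,\rho\}$ with $G(t) = \nu_0 f + m S^a$; the second-order terms give $(\sigma_g^2/\sigma)\,\partial_{gg}\rho$ with $\sigma_g^2 = \nu_0 f^2/(2\sigma) + m (S^a)^2/(2N\sigma)$.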

Kinetic Theory begins from moments of $\rho(g,v,t)$:

$\rho^{(g)}(g,t) = \int \rho(g,v,t)\,dv$, $\quad \rho^{(v)}(v,t) = \int \rho(g,v,t)\,dg$,

$\mu_1^{(v)}(v,t) = \int g\,\rho(g,t\,|\,v)\,dg$,

where $\rho(g,v,t) = \rho(g,t\,|\,v)\,\rho^{(v)}(v,t)$.

$\partial_t\rho = \tau^{-1}\partial_v\{[(v - V_R) + g(v - V_E)]\,\rho\} + \partial_g\{([g - G(t)]/\sigma)\,\rho\} + (\sigma_g^2/\sigma)\,\partial_{gg}\rho + \dots$

First, integrating the $\rho(g,v,t)$ equation over $v$ yields:

$\partial_t\rho^{(g)} = \partial_g\{([g - G(t)]/\sigma)\,\rho^{(g)}\} + (\sigma_g^2/\sigma)\,\partial_{gg}\rho^{(g)}$

Fluctuations in $g$ are Gaussian.

Integrating the $\rho(g,v,t)$ equation over $g$ yields:

$\partial_t\rho^{(v)} = \tau^{-1}\partial_v[(v - V_R)\,\rho^{(v)} + \mu_1^{(v)}(v)\,(v - V_E)\,\rho^{(v)}]$

Integrating the $[g\,\rho(g,v,t)]$ equation over $g$ yields an equation for

$\mu_1^{(v)}(v,t) = \int g\,\rho(g,t\,|\,v)\,dg$, where $\rho(g,v,t) = \rho(g,t\,|\,v)\,\rho^{(v)}(v,t)$:

$\partial_t\mu_1^{(v)} = -\sigma^{-1}[\mu_1^{(v)} - G(t)]$

$\qquad +\ \tau^{-1}\{[(v - V_R) + \mu_1^{(v)}(v - V_E)]\,\partial_v\mu_1^{(v)}\}$

$\qquad +\ \Sigma^2(v)/(\tau\,\rho^{(v)})\,\partial_v[(v - V_E)\,\rho^{(v)}] + \tau^{-1}(v - V_E)\,\partial_v\Sigma^2(v)$,

where $\Sigma^2(v) = \mu_2^{(v)} - (\mu_1^{(v)})^2$.
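As a hypothetical numerical check of these conditional moments, $\mu_1^{(v)}(v) = \mathrm{E}[g\,|\,v]$ and $\Sigma^2(v) = \mathrm{Var}[g\,|\,v]$ can be estimated from Monte Carlo samples of $(v_i, g_i)$ (for example, those collected in the sketch above) by binning in $v$; the bin count is an arbitrary choice.

```python
import numpy as np

def conditional_moments(v_samples, g_samples, n_bins=30):
    """Estimate mu_1(v) = E[g | v] and Sigma^2(v) = Var[g | v] by binning in v.

    Sketch only: v_samples and g_samples are matched 1-D arrays of (v_i, g_i) pairs.
    """
    edges = np.linspace(v_samples.min(), v_samples.max(), n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    idx = np.clip(np.digitize(v_samples, edges) - 1, 0, n_bins - 1)
    mu1 = np.full(n_bins, np.nan)
    sigma2 = np.full(n_bins, np.nan)
    for b in range(n_bins):
        g_in_bin = g_samples[idx == b]
        if g_in_bin.size > 1:
            mu1[b] = g_in_bin.mean()        # conditional mean of g given v in this bin
            sigma2[b] = g_in_bin.var()      # conditional variance of g given v in this bin
    return centers, mu1, sigma2

# Demonstration with synthetic matched samples (replace with v_all, g_all from above):
rng = np.random.default_rng(2)
v_demo = rng.uniform(0.0, 1.0, 10_000)
g_demo = 0.3 + 0.1 * rng.standard_normal(10_000)
centers, mu1, sigma2 = conditional_moments(v_demo, g_demo)
print("mu_1(v) in first bins:", np.round(mu1[:3], 3), " Sigma^2(v):", np.round(sigma2[:3], 4))
```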

Closure: (i) $\partial_v\Sigma^2(v) = 0$; (ii) $\Sigma^2(v) = \sigma_g^2$

One obtains:

$\partial_t\rho^{(v)} = \tau^{-1}\partial_v[(v - V_R)\,\rho^{(v)} + \mu_1^{(v)}\,(v - V_E)\,\rho^{(v)}]$

$\partial_t\mu_1^{(v)} = -\sigma^{-1}[\mu_1^{(v)} - G(t)]$

$\qquad +\ \tau^{-1}\{[(v - V_R) + \mu_1^{(v)}(v - V_E)]\,\partial_v\mu_1^{(v)}\}$

$\qquad +\ \sigma_g^2/(\tau\,\rho^{(v)})\,\partial_v[(v - V_E)\,\rho^{(v)}]$

Together with a diffusion equation for $\rho^{(g)}(g,t)$:

$\partial_t\rho^{(g)} = \partial_g\{([g - G(t)]/\sigma)\,\rho^{(g)}\} + (\sigma_g^2/\sigma)\,\partial_{gg}\rho^{(g)}$
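A minimal explicit finite-difference sketch of this drift-diffusion equation for $\rho^{(g)}(g,t)$, with constant $G$ and $\sigma_g^2$ and crude boundary handling at the ends of the $g$ grid; the grid, time step, and parameter values are illustrative assumptions.

```python
import numpy as np

# Sketch: explicit finite differences for
#   d(rho)/dt = d/dg{ [(g - G)/sigma] * rho } + (sigma_g^2 / sigma) * d^2(rho)/dg^2
# with illustrative constant G and sigma_g^2 and crude no-flux boundaries.
sigma = 0.004                       # synaptic time scale (s), assumed
G, sigma_g2 = 0.3, 0.01             # drift target and conductance variance, assumed

n_g = 200
g = np.linspace(0.0, 1.0, n_g)
dg = g[1] - g[0]
dt = 0.2 * dg**2 / (sigma_g2 / sigma)      # conservative explicit stability bound

# initial condition: a narrow Gaussian bump in g, normalized to unit probability
rho = np.exp(-((g - 0.1) ** 2) / (2 * 0.02**2))
rho /= rho.sum() * dg

for step in range(5000):
    drift_flux = (g - G) / sigma * rho            # [(g - G)/sigma] * rho
    d_drift = np.gradient(drift_flux, dg)         # d/dg of the drift flux
    lap = np.zeros_like(rho)
    lap[1:-1] = (rho[2:] - 2 * rho[1:-1] + rho[:-2]) / dg**2
    rho = rho + dt * (d_drift + (sigma_g2 / sigma) * lap)
    rho[0], rho[-1] = rho[1], rho[-2]             # crude boundary handling
    rho = np.clip(rho, 0.0, None)
    rho /= rho.sum() * dg                         # renormalize total probability to 1

mean_g = (g * rho).sum() * dg
print("mean g after relaxation ~", round(mean_g, 3), "(drifting toward G =", G, ")")
```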

[Figure: PDF of v – kinetic theory vs. I&F simulation (solid), with Fokker-Planck comparison; mean-driven limit: hard thresholding]

Fluctuation-Driven Dynamics

[Figure: firing rate (Hz) vs. N; N = 75, σ = 5 msec, S = 0.05, f = 0.01]

Mean Driven: Bistability and Hysteresis (network of simple, excitatory-only cells)

Fluctuation Driven: N = 16

Relatively Strong Cortical Coupling: N = 16

Mean Driven (N = 16): Bistability and Hysteresis (network of simple, excitatory-only cells), relatively strong cortical coupling

Computational Efficiency

• For statistical accuracy in these CG patch settings, Kinetic Theory is $10^3$ -- $10^5$ times more efficient than I&F;

Realistic Extensions

Extensions to coarse-grained local patches, to excitatory and inhibitory neurons, and to neurons of different types (simple & complex). The PDF then takes the form

$\rho^{\alpha,\theta}(v,g;\,x,t)$, where $x$ is the coarse-grained label, $\alpha = E, I$, and $\theta$ labels cell type

Three Dynamic Regimes of Cortical Amplification:

1) Weak Cortical Amplification (no Bistability/Hysteresis)
2) Near Critical Cortical Amplification
3) Strong Cortical Amplification (Bistability/Hysteresis)

Excitatory Cells Shown

Firing rate vs. input conductance for 4 networks with varying pN: 25 (blue), 50 (magenta), 100 (black), 200 (red). Hysteresis occurs for pN = 100 and 200. Fixed synaptic coupling S_exc/pN.

Summary

• Kinetic Theory is a numerically efficient ($10^3$ -- $10^5$ times more efficient than I&F), and remarkably accurate, method for “scale-up”. Ref: PNAS, pp. 7757-7762 (2004)

• Kinetic Theory introduces no new free parameters into the model, and has a large dynamic range, from the rapid-firing “mean-driven” regime to a fluctuation-driven regime.

• Sub-networks of point neurons can be embedded within kinetic theory to capture spike timing statistics, with a range from test neurons to fully interacting sub-networks.

Ref: Tao, Cai, McLaughlin, PNAS, (2004)

Too good to be true? What’s missing?

• First, the zeroth moment is more accurate than the first moment, as in many moment closures

Too good to be true? What’s missing?

• Second, again as in many moment closures, existence can fail (Tranchina et al., 2006).

• That is, at low but realistic firing rates, the equations are too rigid to have steady-state solutions that satisfy the boundary conditions.

• Diffusion (in v) fixes this existence problem – by introducing boundary layers

Too good to be true? What’s missing?

• But a far more serious problem: Kinetic Theory does not capture detailed “spike-timing” information

Why does the kinetic theory (Boltzmann-type approach in general) not work?

Note: Ensemble Average (Network Mechanism)

Network Mechanism (Ensemble Average)


Too good to be true? What’s missing?

• But a far more serious problem: Kinetic Theory does not capture detailed “spike-timing” statistics

• And most likely the cortex works, on very short time scales, through neurons correlated by detailed spike timing.

• Take, for example, the line-motion illusion

Line-Motion-Illusion

LMI

[Figure: model voltage and model NMDA, space (0–128) vs. time; 40% ‘coarse’ vs. 0% ‘coarse’ trials]

Direct ‘naïve’ coarse graining may not suffice:

• Priming mechanism relies on Recruitment

• Recruitment relies on locally correlated cortical firing events

• Naïve ensemble average destroys locally correlated events

Stimulus

Conclusion

• Kinetic Theory is numerically efficient ($10^3$ -- $10^5$ times more efficient than I&F), and remarkably accurate.

• Kinetic Theory accurately captures firing rates in fluctuation dominated systems

• Kinetic Theory does not capture detailed spike-time correlations – which may be how the cortex works, as it has no time to average.

• So we’ve returned to integrate & fire networks, and have developed fast “multipole” algorithms for integrate & fire systems (Cai and Rangan, 2005).
