Workshop 1: Mathematical Challenges in Neural Network Dynamics

(October 1, 2012 - October 5, 2012)

Organizers


Nicolas Brunel
Statistics and Neurobiology, University of Chicago
John Rinzel
Center for Neural Science & Courant Institute, New York University
Eric Shea-Brown
Applied Mathematics, University of Washington
Sara Solla
Department of Physiology, Northwestern University Medical School

We will focus on dynamics and information processing in large, nonlinear networks. The aim is to highlight a set of mathematical questions that recur across neuroscience, and to discuss both recent progress and outstanding problems. The final day will feature a series of retrospective talks on the interplay of mathematics and neuroscience, leading into moderated discussions of future prospects. These sessions will align with the major themes of the workshop, which are proposed to be as follows.

Linking large-scale network structure and dynamics: The heterogeneous components and vast scale and connectivity of neural systems make for an overwhelming range of possible networks. However, network architectures are constrained by key principles - for example, each cell produces connections of only one sign, leading to non-normal connectivity matrices (Murphy and Miller). What are the consequences of such features for general properties of network dynamics? We will focus on this and other systematic departures from random connectivity, including small-world structures, localized connection "neighborhoods," feedforward inhibition circuits, and the impact of highly recurrent, and hence bistable, network components. In addition, we will cover the latest results on how basic assumptions about single-neuron properties do and do not impact network-wide dynamics.

Bridging scales -- mean field models: What mathematical tools can bridge scales from networks of spiking cells to averaged statistical variables that usefully summarize the activity of large networks? Mean-field techniques have yielded major advances in mathematical neuroscience for decades, but many developments remain to be realized, especially for networks with non-sparse connections and hence partial synchrony among spikes.

Information and coding in large spiking networks: Information-theoretic studies have shown that certain patterns of correlated, or partially synchronized, spiking across large networks enhance the fidelity with which those networks can transmit information. But what network dynamics lead to such patterns? We will highlight general mathematical results that connect architecture and information processing.

Plasticity and learning in network connections: Perhaps the most fascinating aspect of neural dynamics is how network activity drives network architecture to evolve over time. We will focus on mathematical tools for understanding the consequences of such plasticity rules, both in terms of the general connectivity structures that they produce and in terms of network function - e.g., encoding and releasing "memories" of past inputs. A related theme is robustness and variability in neural circuits - for example, how widely can connection strengths and intrinsic properties vary while preserving basic features of a network's dynamics? (Please note that this last theme might instead be covered in a separate short workshop on control theory in neuroscience.)

Accepted Speakers

Paul Bressloff
Department of Mathematics, University of Utah
Dean Buonomano
Neurobiology, University of California, Los Angeles
Dan Butts
Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland
Carson Chow
Laboratory of Biological Modeling, National Institutes of Health
Claudia Clopath
Center for Theoretical Neuroscience, Columbia University
Sophie Deneve
Group for Neural Theory, École Normale Supérieure
Brent Doiron
Mathematics, University of Pittsburgh
Ila Fiete
The Center for Learning and Memory, University of Texas
Ken Miller
Center for Theoretical Neuroscience and Dept. of Neuroscience, Columbia University
Astrid Prinz
Biology, Emory University
Kanaka Rajan
Biophysics, Princeton University
Misha Tsodyks
Neurobiology, Weizmann Institute of Science
Carl van Vreeswijk
Neurophysics and Physiology, Paris Descartes University
Fred Wolf
Nonlinear Dynamics, Max-Planck Institute for Dynamics and Self-Organization
Lai-Sang Young
Mathematics, New York University
Monday, October 1, 2012
Time Session
08:00 AM

Shuttle to MBI

08:15 AM
08:45 AM

Breakfast

08:45 AM
09:00 AM

Welcome, overview, introductions: Marty Golubitsky

09:00 AM
09:45 AM
09:45 AM
10:30 AM
Lai-Sang Young - Emergent dynamics in a model of visual cortex
I will report on recent work which proposes that the network dynamics of the mammalian visual cortex are neither homogeneous nor synchronous but highly structured and strongly shaped by temporally localized barrages of excitatory and inhibitory firing we call "multiple-firing events" (MFEs). Our proposal is based on careful study of a network of spiking neurons built to reflect the coarse physiology of a small patch of layer 2/3 of V1. When appropriately benchmarked, this network is capable of reproducing the qualitative features of a range of phenomena observed in the real visual cortex, including orientation tuning, spontaneous background patterns, surround suppression and gamma-band oscillations. Detailed investigation into the relevant regimes reveals causal relationships among dynamical events driven by a strong competition between the excitatory and inhibitory populations. Testable predictions are proposed; challenges for mathematical neuroscience will also be discussed. This is joint work with Aaditya Rangan.
10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
Dean Buonomano - Complexity without chaos: Plasticity within random recurrent networks generates locally stable neural trajectories
A number of related neurocomputational models have proposed that brain computations rely on the evolving dynamics of recurrent neural networks. In contrast to conventional attractor models, in this framework (which includes state-dependent networks, liquid-state machines, and echo-state machines) computations arise from the voyage through state space rather than the arrival at a given location. To date, however, these models have been limited by two facts. First, the regimes that are potentially the most powerful from a computational perspective are generally chaotic. Second, while synapses in cortical networks are plastic, incorporating robust forms of plasticity in simulated recurrent networks (that exhibit self-perpetuating dynamics) has proved very challenging. We address both these problems by demonstrating how random recurrent networks (RRNs) that initially exhibit self-perpetuating and chaotic dynamics can be tuned through a supervised learning rule to generate locally stable neural patterns of activity. The outcome is a novel neural network regime that exhibits both transiently stable and chaotic trajectories. We further show that the recurrent learning rule dramatically increases the ability of RRNs to generate complex spatiotemporal motor patterns, and accounts for recent experimental data showing a decrease in neural variability in response to stimulus onset.
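
To make the "chaotic regime" concrete, the following short Python sketch (illustrative only, not the speaker's training procedure) simulates a standard random rate network with gain g > 1 and shows that two trajectories started from nearly identical states diverge - the sensitivity that a recurrent learning rule must tame to obtain locally stable trajectories. All parameter values are arbitrary choices for the demonstration.

import numpy as np

rng = np.random.default_rng(0)
N, g, dt, T = 500, 1.5, 0.1, 3000                    # units, gain, time step, steps
J = g * rng.standard_normal((N, N)) / np.sqrt(N)     # random recurrent weights

def simulate(x0):
    x, traj = x0.copy(), []
    for _ in range(T):
        x = x + dt * (-x + J @ np.tanh(x))           # rate dynamics: dx/dt = -x + J tanh(x)
        traj.append(x.copy())
    return np.array(traj)

x0 = rng.standard_normal(N)
a = simulate(x0)
b = simulate(x0 + 1e-6 * rng.standard_normal(N))     # tiny perturbation of the initial state
dist = np.linalg.norm(a - b, axis=1)
print("separation after 0%, 50%, 100% of the run:", dist[0], dist[T // 2], dist[-1])
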
11:45 AM
12:30 PM
Ken Miller - The stabilized supralinear network: A unifying circuit motif underlying multi-input integration in sensory cortex
Neurons in sensory cortex integrate multiple influences to parse objects and support perception. For weak stimuli, responses to multiple driving stimuli can add supralinearly and modulatory contextual influences can facilitate. Stronger stimuli yield sublinear response summation ("normalization"), which also shapes attentional influences, and contextual suppression. Understanding the circuit operations underlying these diverse phenomena is critical to understanding cortical function and disease. I will present a simple, general theory, showing that a wealth of integrative properties -- including the above, certain spatially periodic behaviors, and stimulus-evoked noise suppression -- arise robustly from dynamics induced by three properties of cortical circuitry: (1) short-range inhibitory and longer-range excitatory connections; (2) strong feedback inhibition; (3) supralinear neuronal input/output functions. The supralinear input/output function quite generally creates a transition from supralinear response summation for weak stimuli to sublinear summation for stronger stimuli, as the subnetwork of excitatory neurons becomes increasingly unstable for stronger stimuli but is dynamically stabilized by feedback inhibition. In new recordings in visual cortex we have confirmed key model predictions.
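
A minimal sketch of this kind of circuit (illustrative parameters loosely based on published stabilized-supralinear-network examples, not necessarily those used in the talk): a two-population excitatory/inhibitory rate model with a supralinear power-law input/output function. Doubling a weak input roughly quadruples the response (supralinear summation), while doubling a strong input yields less than a doubling (sublinear, normalization-like summation).

import numpy as np

# connection weights: rows are (E, I) targets, columns are (E, I) sources
W = np.array([[1.25, -0.65],
              [1.20, -0.50]])
k, n = 0.04, 2.0                          # power-law i/o function: r = k * [input]_+^n
tau = np.array([0.020, 0.010])            # time constants (s)

def steady_rates(h, dt=1e-4, steps=20000):
    r = np.zeros(2)
    for _ in range(steps):
        drive = W @ r + h
        r = r + dt / tau * (-r + k * np.maximum(drive, 0.0) ** n)
    return r

for h in (1.0, 2.0, 5.0, 10.0, 20.0, 40.0):
    r1 = steady_rates(np.array([h, h]))
    r2 = steady_rates(np.array([2.0 * h, 2.0 * h]))
    ratio = r2[0] / (2.0 * r1[0])
    print(f"input {h:5.1f}: E rate {r1[0]:8.2f}, summation ratio r(2h)/(2 r(h)) = {ratio:.2f}")
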
12:30 PM
02:00 PM

Lunch Break

02:00 PM
02:45 PM
02:45 PM
05:30 PM

Informal Discussions

05:30 PM
07:00 PM

Reception and poster session in MBI Lounge

07:00 PM

Shuttle pick-up from MBI

Tuesday, October 2, 2012
Time Session
08:30 AM

Shuttle to MBI

08:45 AM
09:00 AM

Breakfast

09:00 AM
09:45 AM
Bard Ermentrout - Wandering bumps in stochastic neural fields
We study the effects of noise on stationary pulse solutions (bumps) in spatially extended neural fields. The dynamics of a neural field is described by an integrodifferential equation whose integral term characterizes synaptic interactions between neurons in different spatial locations of the network. Translationally symmetric neural fields support a continuum of stationary bump solutions, which may be centered at any spatial location. Random fluctuations are introduced by modeling the system as a spatially extended Langevin equation whose noise term we take to be additive. For nonzero noise, bumps are shown to wander about the domain in a purely diffusive way. We can approximate the associated diffusion coefficient using a small-noise expansion. Upon breaking the (continuous) translation symmetry of the system using spatially heterogeneous inputs or synapses, bumps in the stochastic neural field can become temporarily pinned to a finite number of locations in the network. As a result, the effective diffusion of the bump is reduced in comparison to the homogeneous case. As the modulation frequency of this heterogeneity increases, the effective diffusion of bumps in the network approaches that of the network with spatially homogeneous weights. We end with some simulations of spiking models that show the same dynamics. (This is joint work with Zachary Kilpatrick, UH.)
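
The following is a minimal sketch (illustrative parameters, not those of the talk) of the basic phenomenon: a bump solution of a ring neural field u_t = -u + w * f(u) + noise, with a cosine coupling kernel and a steep sigmoid firing-rate function, whose center wanders diffusively once weak additive noise is included.

import numpy as np

rng = np.random.default_rng(1)
n, dt, steps, sigma = 256, 0.01, 20000, 0.05
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
A, theta, beta = 5.0, 0.5, 20.0                        # kernel strength, threshold, gain
f = lambda u: 1.0 / (1.0 + np.exp(-beta * (u - theta)))

u = 1.5 * np.cos(x)                                    # start near a bump solution
center = np.empty(steps)
for t in range(steps):
    r = f(u)
    # convolution with w(y) = A cos(y): only the first Fourier mode survives
    a1 = (A / n) * np.sum(r * np.cos(x))
    b1 = (A / n) * np.sum(r * np.sin(x))
    conv = a1 * np.cos(x) + b1 * np.sin(x)
    u = u + dt * (-u + conv) + sigma * np.sqrt(dt) * rng.standard_normal(n)
    center[t] = np.angle(np.sum(u * np.exp(1j * x)))   # bump position on the ring

drift = np.unwrap(center)
lag = 200                                              # compare positions 2 time units apart
disp = drift[lag:] - drift[:-lag]
print("total drift of the bump center:", drift[-1] - drift[0])
print("crude diffusion estimate D ~", np.var(disp) / (2.0 * lag * dt))
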
09:45 AM
10:30 AM
Carson Chow - Finite-size effects in neural networks
The dynamics of neural networks have traditionally been analyzed for small systems or in the infinite size mean field limit. While both of these approaches have made great strides in understanding these systems, large but finite-sized networks have not been explored as much analytically. Here, I will show how the dynamical behavior of finite-sized systems can be inferred by expanding in the inverse system-size around the mean field solution. The approach can also be used to solve the inverse problem of inferring the effective dynamics of a single neuron embedded in a large network where only incomplete information is available. The formalism I will outline can be generalized to any high dimensional dynamical system.
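
A minimal illustration of the regime in question (a generic toy model, not the formalism of the talk): an all-to-all network of stochastic binary neurons whose population activity fluctuates about the mean-field fixed point, with a variance that scales roughly as 1/N - the kind of correction that a system-size expansion around mean field is designed to capture.

import numpy as np

rng = np.random.default_rng(2)
J, h = 1.5, -0.4                              # illustrative coupling and bias
phi = lambda x: 1.0 / (1.0 + np.exp(-x))

m = 0.5                                       # mean-field fixed point: m = phi(J m + h)
for _ in range(200):
    m = phi(J * m + h)

for N in (100, 1000, 10000):
    s = (rng.random(N) < m).astype(float)     # initialize near the fixed point
    trace = []
    for t in range(2000):
        p_on = phi(J * s.mean() + h)          # every neuron sees the mean activity
        s = (rng.random(N) < p_on).astype(float)
        trace.append(s.mean())
    trace = np.array(trace[500:])             # discard transient
    print(f"N={N:6d}  mean activity {trace.mean():.3f} (mean field {m:.3f}),"
          f"  N * variance {N * trace.var():.3f}")
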
10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
Brent Doiron - Slow dynamics and high variability in balanced cortical networks with clustered excitatory connections
Anatomical studies show that excitatory connections in cortex are not uniformly distributed across a network but instead exhibit clustering into groups of highly connected neurons. The implications of clustering for cortical activity are unclear. We study the effect of clustered excitatory connections on the dynamics of neuronal networks that exhibit high spike time variability due to a balance between excitation and inhibition. Even modest clustering substantially changes the behavior of these networks, introducing slow dynamics where clusters of neurons transiently increase or decrease their firing rate. Consequently, neurons exhibit both short timescale spiking variability and long timescale firing rate fluctuations. We show that stimuli bias networks toward particular activity states, suppressing the mechanisms underlying slow timescale dynamics. This thereby reduces firing rate variability in evoked compared to spontaneous states, as observed experimentally in many cortical systems. Our model thus relates cortical architecture to the reported variability in spontaneous and evoked spiking activity.
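
A minimal sketch of the connectivity ingredient (illustrative sizes and probabilities, not those of the study): an excitatory adjacency matrix in which within-cluster connections are several times more likely than between-cluster connections, with the overall connection probability held fixed.

import numpy as np

rng = np.random.default_rng(3)
Ne, n_clusters, p_mean, ratio = 800, 8, 0.2, 3.0        # ratio = p_in / p_out
labels = np.repeat(np.arange(n_clusters), Ne // n_clusters)

# choose p_in, p_out so the average connection probability stays p_mean
p_out = p_mean / ((ratio + (n_clusters - 1)) / n_clusters)
p_in = ratio * p_out

same = labels[:, None] == labels[None, :]
P = np.where(same, p_in, p_out)
W = (rng.random((Ne, Ne)) < P).astype(float)            # 1 = connection present
np.fill_diagonal(W, 0.0)

print("within-cluster connection fraction :", W[same].mean())
print("between-cluster connection fraction:", W[~same].mean())
print("overall connection fraction        :", W.mean())
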
11:45 AM
12:30 PM
12:30 PM
02:30 PM

Lunch Break

02:30 PM
03:00 PM
03:00 PM
03:30 PM
03:30 PM
05:30 PM

Informal Discussion

05:30 PM

Shuttle pick-up from MBI

06:30 PM
07:00 PM

Cash Bar

07:00 PM
07:00 PM

Banquet in the Fusion Room @ Crowne Plaza Hotel

Wednesday, October 3, 2012
Time Session
08:15 AM

Shuttle to MBI

08:45 AM
09:00 AM

Breakfast

09:00 AM
09:45 AM
- A Mathematical Theory of Semantic Development
A wide array of psychology experiments have revealed remarkable regularities in the developmental time course of infant semantic cognition, as well as its progressive disintegration in adult dementia. For example, infants tend to acquire the ability to make broad categorical distinctions between concepts before they can make finer scale distinctions, and this process is reversed in dementia, where finer scale categorical distinctions are lost before broad distinctions. We develop a phenomenological, mathematical theory of this process through an analysis of the learning dynamics of multilayer networks exposed to hierarchically structured data. We find new exact solutions to the nonlinear dynamics of error corrective learning in deep, 3 layer networks. These solutions reveal that networks learn input-output covariation structure on a time scale that is inversely proportional to their statistical strength. We further analyze the covariance structure of hierarchical generative models, and show how data generated from such models yield a hierarchy of input-output modes, leading to a hierarchy of time-scales over which such modes are learned. When combined, these results provide a unified, phenomenological account of the time-course of acquisition and disintegration of semantic knowledge.

Joint work with: Andrew Saxe and Jay McClelland.
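
The flavor of the result can be seen in a small numerical sketch (a generic deep linear network, not the authors' exact setup): gradient descent in a 3-layer linear network trained on a target map with two modes of different strengths learns the strong mode first, on a time scale roughly inversely proportional to its singular value.

import numpy as np

rng = np.random.default_rng(4)
d, hidden, lr = 4, 4, 0.005
U = np.linalg.qr(rng.standard_normal((d, d)))[0]   # orthogonal output modes
V = np.linalg.qr(rng.standard_normal((d, d)))[0]   # orthogonal input modes
S = np.diag([3.0, 1.0, 0.0, 0.0])                  # mode strengths (singular values)
target = U @ S @ V.T

W1 = 0.01 * rng.standard_normal((hidden, d))       # small random initialization
W2 = 0.01 * rng.standard_normal((d, hidden))

for step in range(2001):
    err = W2 @ W1 - target                         # squared-error loss with whitened inputs
    W2 -= lr * err @ W1.T                          # gradient descent on both layers
    W1 -= lr * W2.T @ err
    if step % 200 == 0:
        learned = U.T @ (W2 @ W1) @ V              # strength of each learned mode
        print(f"step {step:5d}: strong mode {learned[0, 0]:5.2f} / 3.0,"
              f"  weak mode {learned[1, 1]:5.2f} / 1.0")
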
09:45 AM
10:30 AM
Dan Butts - Detecting the many roles of inhibition in shaping sensory processing
Inhibition is a component of nearly every neural system, and an increasingly prevalent component in theoretical network models. However, its role in sensory processing is often difficult to directly measure and/or infer. Using a nonlinear modeling framework that can infer the presence and stimulus tuning of inhibition from extracellular and intracellular recordings, I will describe different forms of inferred inhibition (subtractive and multiplicative) and suggest multiple roles in sensory processing. I will primarily refer to studies in the retina, where inhibition likely contributes to contrast adaptation, to the generation of precise spike timing, and to the diversity of computation among different retinal ganglion cell types. I will also describe its roles in shaping sensory processing in other areas, including auditory areas and visual cortex. Understanding the role of inhibition in neural processing can both inform a richer view of how single-neuron processing contributes to network behavior and provide tools to validate network models using neural data.
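
As a toy illustration of the distinction (not the speaker's estimation framework), the sketch below contrasts subtractive and divisive (multiplicative) inhibition acting on the same rectified excitatory drive: one shifts the response threshold, the other scales the response gain.

import numpy as np

relu = lambda x: np.maximum(x, 0.0)

drive = np.linspace(0.0, 4.0, 9)            # excitatory drive levels
inh = 1.0                                   # strength of the inhibitory input

subtractive = relu(drive - inh)             # inhibition shifts the threshold
divisive = relu(drive) / (1.0 + inh)        # inhibition divides the gain

for d, rs, rd in zip(drive, subtractive, divisive):
    print(f"drive {d:4.1f}:  subtractive {rs:5.2f}   divisive {rd:5.2f}")
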
10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
11:45 AM
12:30 PM
12:30 PM
02:30 PM

Lunch Break

02:30 PM
03:00 PM
03:00 PM
03:30 PM
03:30 PM
05:30 PM

Informal Discussions

05:30 PM

Shuttle pick-up from MBI

Thursday, October 4, 2012
Time Session
08:15 AM

Shuttle to MBI

08:45 AM
09:00 AM

Breakfast

09:00 AM
09:30 AM
Duane Nykamp - Capturing effective neuronal dynamics in random networks with complex topologies
We introduce a random network model in which one can prescribe the frequency of second-order edge motifs. We derive effective equations for the activity of spiking neuron models coupled via such networks. A key consequence of the motif-induced edge correlations is that one cannot derive closed equations for the average activity of the nodes (the average firing rates of the neurons) but instead must develop the equations in terms of the average activity of the edges (the synaptic drives). As a result, the network topology increases the dimension of the effective dynamics and allows for a larger repertoire of behavior. We demonstrate this behavior through simulations of spiking neuronal networks.
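
For concreteness, the sketch below (generic code, not the speaker's model) counts the four second-order edge motifs - reciprocal, convergent, divergent, and chain - in a directed random graph and compares them with the independent-edge (Erdos-Renyi) expectation; a model of the kind described prescribes precisely these frequencies.

import numpy as np

rng = np.random.default_rng(6)
N, p = 400, 0.05
A = (rng.random((N, N)) < p).astype(float)           # A[i, j] = 1 means an edge i -> j
np.fill_diagonal(A, 0.0)

phat = A.sum() / (N * (N - 1))
indeg, outdeg = A.sum(axis=0), A.sum(axis=1)

recip = np.sum(A * A.T) / 2                          # i -> j and j -> i
conv = np.sum(indeg * (indeg - 1)) / 2               # two edges sharing a target
div = np.sum(outdeg * (outdeg - 1)) / 2              # two edges sharing a source
chain = np.sum(indeg * outdeg) - np.sum(A * A.T)     # i -> j -> k with i != k

exp_recip = phat**2 * N * (N - 1) / 2
exp_pair = phat**2 * N * (N - 1) * (N - 2) / 2
for name, obs, exp in (("reciprocal", recip, exp_recip),
                       ("convergent", conv, exp_pair),
                       ("divergent", div, exp_pair),
                       ("chain", chain, 2 * exp_pair)):
    print(f"{name:10s} observed {obs:9.0f}   chance {exp:9.0f}")
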
09:30 AM
10:00 AM
Peter Thomas - Redefining Asymptotic Phase for Stochastic Conductance-Based Neural Models with Statistical Limit Cycles
In deterministic dynamics, a stable limit cycle is a closed, isolated periodic orbit that attracts nearby trajectories. Points in its basin of attraction may be disambiguated by their asymptotic phase. In stochastic systems with approximately periodic trajectories, asymptotic phase is no longer well defined, because all initial densities typically converge to the same stationary measure. We explore circumstances under which one may nevertheless define an analog of the "asymptotic phase". In particular, we consider jump Markov process models incorporating ion channel noise, and study a stochastic version of the classical Morris-Lecar system in this framework. We show that the stochastic asymptotic phase can be defined even for some systems in which no underlying deterministic limit cycle exists, such as an attracting heteroclinic cycle.
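
One concrete way to see the idea (an illustrative construction in the spirit of the abstract, not necessarily the speaker's): model a noisy oscillator as a continuous-time Markov chain on a ring of states and assign each state the complex argument of the eigenfunction of the generator whose eigenvalue has the least-negative real part and a nonzero imaginary part.

import numpy as np

M = 40                                                     # states around the cycle
k = np.arange(M)
cw = 2.0 + 0.5 * np.cos(2.0 * np.pi * k / M)               # forward (clockwise) rates
ccw = 0.2 * np.ones(M)                                     # backward rates (the "noise")

Q = np.zeros((M, M))                                       # backward generator
for i in range(M):
    Q[i, (i + 1) % M] = cw[i]
    Q[i, (i - 1) % M] = ccw[i]
    Q[i, i] = -(cw[i] + ccw[i])                            # rows sum to zero

vals, vecs = np.linalg.eig(Q)
osc = np.where(np.abs(vals.imag) > 1e-9)[0]                # oscillatory eigenvalues
lead = osc[np.argmax(vals.real[osc])]                      # slowest-decaying oscillatory mode
phase = np.angle(vecs[:, lead])                            # candidate asymptotic phase

print("leading oscillatory eigenvalue:", np.round(vals[lead], 3))
print("phase of the first five states:", np.round(phase[:5], 3))
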
10:00 AM
10:30 AM

Jon Rubin

10:30 AM
11:00 AM

Break

11:00 AM
11:30 AM

Marty Golubitsky

11:30 AM
12:00 PM

David Terman

12:00 PM
12:30 PM
12:30 PM
02:30 PM

Lunch Break

02:30 PM
05:30 PM

Late-breaking topics and small group discussions

05:30 PM

Shuttle pick-up from MBI

Friday, October 5, 2012
Time Session
08:15 AM

Shuttle to MBI

08:45 AM
09:00 AM

Breakfast

09:00 AM
09:45 AM
Fred Wolf - Entropy production and phase space structure in models of cortical circuits
09:45 AM
10:30 AM
10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
11:45 AM
12:30 PM

Final informal discussion

12:30 PM

Shuttle pick-up from MBI

Participants

Name Email Affiliation
Amro, Rami ramiamromath233_1@yahoo.com Physics and Astronomy, Ohio University
Anderson, David anderson@math.wisc.edu Mathematics, University of Wisconsin
Barreiro, Andrea abarreiro@smu.edu Mathematics, Southern Methodist University
Belykh, Igor ibelykh@gsu.edu Mathematics and Statistics, Georgia State University
Billock, Vincent vincent.billock.ctr@wpafb.af.mil National Research Council, U.S. Air Force Research Laboratory
Borisyuk, Alla borisyuk@math.utah.edu Mathematics, University of Utah
Bressloff, Paul bressloff@math.utah.edu Department of Mathematics, University of Utah
Brunel, Nicolas nicolas.brunel@univ-paris5.fr Statistics and Neurobiology, University of Chicago
Buonomano, Dean dbuono@ucla.edu Neurobiology, University of California, Los Angeles
Butts, Daniel dab@umd.edu Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland
Cayco Gajic, Natasha caycogajic@gmail.com Applied Mathematics, University of Washington
Chow, Carson carsonc@mail.nih.gov Laboratory of Biological Modeling, National Institutes of Health
Clopath, Claudia cc3450@columbia.edu Center for Theoretical Neuroscience, Columbia University
Curto, Carina ccurto2@math.unl.edu Mathematics, University of Nebraska
Curtu, Rodica Rodica-Curtu@uiowa.edu Mathematics, University of Iowa
Deneve, Sophie sophie.deneve@ens.fr Group for Neural Theory, École Normale Supérieure
DeWoskin, Daniel dewoskin@umich.edu Mathematics, University of Michigan
Doiron, Brent bdoiron@pitt.edu Mathematics, University of Pittsburgh
Engelken, Rainer rainer@nld.ds.mpg.de Nonlinear Dynamics, Max-Planck Institute for Dynamics and Self-Organization
Ermentrout, Bard bard@pitt.edu Department of Mathematics, University of Pittsburgh
Farkhooi, Farzad farzad@zedat.fu-berlin.de Theoretical Neuroscience & Neuroinformatics, Freie Universität Berlin
Fiete, Ila fiete@mail.clm.utexas.edu The Center for Learning and Memory, University of Texas
Franci, Alessio afranci@ulg.ac.be Systems and modeling group, University of Liege
Giusti, Chad cgiusti2@unl.edu Mathematics, University of Nebraska
Goldwyn, Joshua jhg262@nyu.edu Applied Mathematics, University of Washington
Greenwood, Priscilla pgreenw@math.asu.edu Department of Mathematics , University of British Columbia
Hillar, Christopher chillar@msri.org Redwood Center for Theoretical Neuroscience, MSRI / Berkeley
Hirschauer, Thomas thomas.hirschauer@osumc.edu Neuroscience Graduate Studies Program, The Ohio State University
Holmes, William holmes@ohio.edu Biological Sciences, Ohio University
Hu, Yu huyu@uw.edu Applied Mathematics, University of Washington
Joo, Jaewook jjoo1@utk.edu Physics, University of Tennessee
Kilpatrick, Zachary zpkilpat@pitt.edu Mathematics, University of Houston
Kim, Jae Kyoung jaekkim@umich.edu Mathematics,
Kim, Chris cmkim@umn.edu School of Mathematics, University of Minnesota
Kramer, Peter kramep@rpi.edu Mathematical Sciences, Rensselaer Polytechnic Institute
Kurtz, Thomas kurtz@math.wisc.edu Mathematics and Statistics, University of Wisconsin
Laing, Carlo c.r.laing@massey.ac.nz IIMS, Massey University
Lajoie, Guillaume glajoie@amath.washington.edu Applied Mathematics, University of Washington
Lim, Sukbin sblim@ucdavis.edu Center for neuroscience, University of California, Davis
Lin, Kevin klin@math.arizona.edu Department of Mathematics, University of Arizona
Miller, Ken ken@neurotheory.columbia.edu Center for Theoretical Neuroscience and Dept. of Neuroscience, Columbia University
Molkov, Yaroslav ymolkov@iupui.edu Department of Mathematical Sciences, Indiana University--Purdue University
Murillo, Anarina anarina.murillo@asu.edu Mathematical, Computational, and Modeling Sciences Center, Arizona State University
Njap, Felix fnjap2001@gmail.com signal processing, GS-CMLS
Nykamp, Duane nykamp@umn.edu School of Mathematics, University of Minnesota
Ostojic, Srdjan srdjan.ostojic@ens.fr Laboratoire de Neurosciences Cognitives, Ecole Normale Superieure
Patel, Mainak mainak@math.duke.edu Mathematics, Duke University
Prinz, Astrid astrid.prinz@emory.edu Biology, Emory University
Puelma Touzel, Maximilian mptouzel@nld.ds.mpg.de Nonlinear Dynamics, Max Planck Institute for Dynamics and Self-Organization
Radulescu, Anca radulesc@colorado.edu Mathematics, University of Colorado
Rajan, Kanaka krajan@princeton.edu Biophysics, Princeton University
Rinzel, John rinzel@cns.nyu.edu Center for Neural Science & Courant Institute, New York University
Rotstein, Horacio horacio@njit.edu Mathematical Sciences, New Jersey Institute of Technology
Rotter, Stefan stefan.rotter@biologie.uni-freiburg.de Bernstein Center Freiburg & Faculty of Biology, University of Freiburg
Sauer, Tim tsauer@gmu.edu Department of Mathematics, George Mason University
Schiff, Steven sjs49@engr.psu.edu Depts. Neurosurgery / Eng Science & Mechanics / Physics, Pennsylvania State University
Sharpee, Tatyana sharpee@salk.edu Computational Neurobiology Laboratory, The Salk Institute for Biological Studies
Shea-Brown, Eric etsb@washington.edu Applied Mathematics, University of Washington
Shiau, LieJune shiau@uhcl.edu Mathematics, University of Houston--Clear Lake
Shilnikov, Andrey ashilnikov@gsu.edu Neuroscience Institute, Georgia State University
Smith, Ruth pmxrs3@nottingham.ac.uk University of Nottingham
Stewart, Ian I.N.Stewart@warwick.ac.uk Dept. of Mathematics, University of Warwick
Thieullen, Michèle michele.thieullen@upmc.fr Laboratoire de Probabilités et Modèles Aléatoires, Université Pierre et Marie Curie
Thomas, Peter peter.j.thomas@case.edu Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University
Trenado, Carlos trenado@cdb-unit.de Comp. Diagnostics and Biocybernetics Unit, Saarland University Hospital
Tsodyks, Misha misha@weizmann.ac.il Neurobiology, Weizmann Institute of Science
Urban, Alexander alexanderdarius28@gmail.com Human Research and Engineering Division, Oak Ridge Associated Universities (ARL postdoc program)
van Vreeswijk, Carl carl.van-vreeswijk@biomedicale.univ-paris5.fr Neurophysics and Physiology, Paris Descartes University
Wang, Xiao-Jing xjwang@yale.edu Neuroscience, New York University
Wedgwood, Kyle kyz1024@gmail.com Mathematical Sciences, University of Nottingham
Wolf, Fred fred@nld.ds.mpg.de Nonlinear Dynamics, Max-Planck Institute for Dynamics and Self-Organization
Young, Lai-Sang lsy@cims.nyu.edu Mathematics, New York University

Abstracts

Neural field model of binocular rivalry waves in primary visual cortex
Neural fields model the large-scale dynamics of spatially structured cortical networks in terms of continuum integro-differential equations, whose associated integral kernels represent the spatial distribution of neuronal synaptic connections. The advantage of a continuum rather than a discrete representation of spatially structured networks is that various techniques from the analysis of PDEs can be adapted to study the nonlinear dynamics of cortical patterns, oscillations and waves. In this talk we consider a neural field model of binocular rivalry waves in primary visual cortex (V1), which are thought to be the neural correlate of the wave-like propagation of perceptual dominance during binocular rivalry. Binocular rivalry is the phenomenon where perception switches back and forth between different images presented to the two eyes. The resulting fluctuations in perceptual dominance and suppression provide a basis for non-invasive studies of the human visual system and the identification of possible neural mechanisms underlying conscious visual awareness. We derive an analytical expression for the speed of a binocular rivalry wave as a function of various neurophysiological parameters, and show how properties of the wave are consistent with the wave-like propagation of perceptual dominance observed in recent psychophysical experiments. In addition to providing an analytical framework for studying binocular rivalry waves, we show how neural field methods provide insights into the mechanisms underlying the generation of the waves. In particular, we highlight the important role of slow adaptation in providing a "symmetry breaking mechanism" that allows waves to propagate. We end by discussing recent extensions of the work that incorporate the effects of noise, and the detailed functional architecture of V1.
Complexity without chaos: Plasticity within random recurrent networks generates locally stable neural trajectories
A number of related neurocomputational models have proposed that brain computations rely on the evolving dynamics of recurrent neural networks. In contrast to conventional attractor models, in this framework (which includes state-dependent networks, liquid-state machines, and echo-state machines) computations arise from the voyage through state space rather than the arrival at a given location. To date, however, these models have been limited by two facts. First, the regimes that are potentially the most powerful from a computational perspective are generally chaotic. Second, while synapses in cortical networks are plastic, incorporating robust forms of plasticity in simulated recurrent networks (that exhibit self-perpetuating dynamics) has proved very challenging. We address both these problems by demonstrating how random recurrent networks (RRNs) that initially exhibit self-perpetuating and chaotic dynamics can be tuned through a supervised learning rule to generate locally stable neural patterns of activity. The outcome is a novel neural network regime that exhibits both transiently stable and chaotic trajectories. We further show that the recurrent learning rule dramatically increases the ability of RRNs to generate complex spatiotemporal motor patterns, and accounts for recent experimental data showing a decrease in neural variability in response to stimulus onset.
Detecting the many roles of inhibition in shaping sensory processing
Inhibition is a component of nearly every neural system, and an increasingly prevalent component in theoretical network models. However, its role in sensory processing is often difficult to directly measure and/or infer. Using a nonlinear modeling framework that can infer the presence and stimulus tuning of inhibition from extracellular and intracellular recordings, I will describe different forms of inferred inhibition (subtractive and multiplicative) and suggest multiple roles in sensory processing. I will primarily refer to studies in the retina, where inhibition likely contributes to contrast adaptation, to the generation of precise spike timing, and to the diversity of computation among different retinal ganglion cell types. I will also describe its roles in shaping sensory processing in other areas, including auditory areas and visual cortex. Understanding the role of inhibition in neural processing can both inform a richer view of how single-neuron processing contributes to network behavior and provide tools to validate network models using neural data.
Finite-size effects in neural networks
The dynamics of neural networks have traditionally been analyzed for small systems or in the infinite size mean field limit. While both of these approaches have made great strides in understanding these systems, large but finite-sized networks have not been explored as much analytically. Here, I will show how the dynamical behavior of finite-sized systems can be inferred by expanding in the inverse system-size around the mean field solution. The approach can also be used to solve the inverse problem of inferring the effective dynamics of a single neuron embedded in a large network where only incomplete information is available. The formalism I will outline can be generalized to any high dimensional dynamical system.
Cerebellar learning - A model of vestibulo-ocular reflex adaptation
Spatially structured networks from sequences
Spatially structured networks, such as bump attractor networks, have enjoyed considerable success in modeling a wide range of phenomena in cortical and hippocampal networks. A key question that arises in the case of hippocampal models, however, is how such a spatial organization of the synaptic connectivity matrix can arise in the absence of any a priori topographic structure of the network. Here we demonstrate a simple mechanism by which robust sequences of neuronal activation, such as those observed during hippocampal sharp waves, can lead to the formation of spatially structured networks that exhibit robust bump attractor dynamics.
Balanced spiking networks can implement dynamical systems with predictive coding
Neural networks can integrate sensory information and generate continuously varying outputs, even though individual neurons communicate only with spikes---all-or-none events. Here we show how this can be done efficiently if spikes communicate "prediction errors" between neurons. We focus on the implementation of linear dynamical systems and derive a spiking network model from a single optimization principle. Our model naturally accounts for two puzzling aspects of cortex. First, it provides a rationale for the tight balance and correlations between excitation and inhibition. Second, it predicts asynchronous and irregular firing as a consequence of predictive population coding, even in the limit of vanishing noise. We show that our spiking networks have error-correcting properties that make them far more accurate and robust than comparable rate models. Our approach suggests spike times do matter when considering how the brain computes, and that the reliability of cortical representations could have been strongly under-estimated.
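
A minimal sketch of this flavor of coding (a toy 1D example with made-up parameters, not the derivation in the talk): each unit fires only when its spike would reduce the squared error between the signal and the decaying readout, so spikes literally transmit prediction errors.

import numpy as np

N, dt, tau = 20, 0.001, 0.05
w = np.where(np.arange(N) < N // 2, 0.1, -0.1)       # decoding weights, half +, half -
t = np.arange(0.0, 2.0, dt)
x = np.sin(2.0 * np.pi * 1.5 * t) + 0.5              # signal to be tracked

xhat = 0.0
readout = np.zeros(len(t))
n_spikes = 0
for i in range(len(t)):
    xhat += dt * (-xhat / tau)                       # readout decays between spikes
    err = x[i] - xhat
    gain = 2.0 * err * w - w**2                      # drop in squared error if unit j fires
    j = np.argmax(gain)
    if gain[j] > 0.0:                                # fire only if it improves the estimate
        xhat += w[j]
        n_spikes += 1
    readout[i] = xhat

print("tracking RMS error:", np.sqrt(np.mean((x - readout) ** 2)))
print("total spikes used :", n_spikes)
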
Slow dynamics and high variability in balanced cortical networks with clustered excitatory connections
Anatomical studies show that excitatory connections in cortex are not uniformly distributed across a network but instead exhibit clustering into groups of highly connected neurons. The implications of clustering for cortical activity are unclear. We study the effect of clustered excitatory connections on the dynamics of neuronal networks that exhibit high spike time variability due to a balance between excitation and inhibition. Even modest clustering substantially changes the behavior of these networks, introducing slow dynamics where clusters of neurons transiently increase or decrease their firing rate. Consequently, neurons exhibit both short timescale spiking variability and long timescale firing rate fluctuations. We show that stimuli bias networks toward particular activity states, suppressing the mechanisms underlying slow timescale dynamics. This thereby reduces firing rate variability in evoked compared to spontaneous states, as observed experimentally in many cortical systems. Our model thus relates cortical architecture to the reported variability in spontaneous and evoked spiking activity.
Wandering bumps in stochastic neural fields
We study the effects of noise on stationary pulse solutions (bumps) in spatially extended neural fields. The dynamics of a neural field is described by an integrodifferential equation whose integral term characterizes synaptic interactions between neurons in different spatial locations of the network. Translationally symmetric neural fields support a continuum of stationary bump solutions, which may be centered at any spatial location. Random fluctuations are introduced by modeling the system as a spatially extended Langevin equation whose noise term we take to be additive. For nonzero noise, bumps are shown to wander about the domain in a purely diffusive way. We can approximate the associated diffusion coefficient using a small-noise expansion. Upon breaking the (continuous) translation symmetry of the system using spatially heterogeneous inputs or synapses, bumps in the stochastic neural field can become temporarily pinned to a finite number of locations in the network. As a result, the effective diffusion of the bump is reduced in comparison to the homogeneous case. As the modulation frequency of this heterogeneity increases, the effective diffusion of bumps in the network approaches that of the network with spatially homogeneous weights. We end with some simulations of spiking models that show the same dynamics. (This is joint work with Zachary Kilpatrick, UH.)
A novel mechanism for sparse and reliable stimulus coding in sensory system
Managing heterogeneity in the study of neural oscillator dynamics
We consider a coupled, heterogeneous population of relaxation oscillators used to model rhythmic oscillations in the pre-Botzinger complex. By choosing specific values of the parameter used to describe the heterogeneity, sampled from the probability distribution of the values of that parameter, we show how the effects of heterogeneity can be studied in a computationally efficient manner. When more than one parameter is heterogeneous, full or sparse tensor product grids are used to select appropriate parameter values. The method allows us to effectively reduce the dimensionality of the model, and it provides a means for systematically investigating the effects of heterogeneity in coupled systems, linking ideas from uncertainty quantification to those for the study of network dynamics.
Balanced cortical microcircuitry for maintaining short-term memory
Persistent neural activity in the absence of a stimulus has been identified as a neural correlate of working memory, but how such activity is maintained by neocortical circuits remains unknown. Here we show that the inhibitory and excitatory microcircuitry of neocortical memory-storing regions is sufficient to implement a corrective feedback mechanism that enables persistent activity to be maintained stably for prolonged durations. When recurrent excitatory and inhibitory inputs to memory neurons are balanced in strength, but offset in time, drifts in activity trigger a corrective signal that counteracts memory decay. Circuits containing this mechanism temporally integrate their inputs, generate the irregular neural firing observed during persistent activity, and are robust against common perturbations that severely disrupt previous models of short-term memory storage. This work reveals a mechanism for the accumulation and storage of memories in neocortical circuits based upon principles of corrective negative feedback widely used in engineering applications.
The stabilized supralinear network: A unifying circuit motif underlying multi-input integration in sensory cortex
Neurons in sensory cortex integrate multiple influences to parse objects and support perception. For weak stimuli, responses to multiple driving stimuli can add supralinearly and modulatory contextual influences can facilitate. Stronger stimuli yield sublinear response summation ("normalization"), which also shapes attentional influences, and contextual suppression. Understanding the circuit operations underlying these diverse phenomena is critical to understanding cortical function and disease. I will present a simple, general theory, showing that a wealth of integrative properties -- including the above, certain spatially periodic behaviors, and stimulus-evoked noise suppression -- arise robustly from dynamics induced by three properties of cortical circuitry: (1) short-range inhibitory and longer-range excitatory connections; (2) strong feedback inhibition; (3) supralinear neuronal input/output functions. The supralinear input/output function quite generally creates a transition from supralinear response summation for weak stimuli to sublinear summation for stronger stimuli, as the subnetwork of excitatory neurons becomes increasingly unstable for stronger stimuli but is dynamically stabilized by feedback inhibition. In new recordings in visual cortex we have confirmed key model predictions.
Capturing effective neuronal dynamics in random networks with complex topologies
We introduce a random network model in which one can prescribe the frequency of second-order edge motifs. We derive effective equations for the activity of spiking neuron models coupled via such networks. A key consequence of the motif-induced edge correlations is that one cannot derive closed equations for the average activity of the nodes (the average firing rates of the neurons) but instead must develop the equations in terms of the average activity of the edges (the synaptic drives). As a result, the network topology increases the dimension of the effective dynamics and allows for a larger repertoire of behavior. We demonstrate this behavior through simulations of spiking neuronal networks.
Amplification of high frequency oscillations by cell morphology and input segregation
Reliable and unreliable spike times in simple neural networks
Redefining Asymptotic Phase for Stochastic Conductance-Based Neural Models with Statistical Limit Cycles
In deterministic dynamics, a stable limit cycle is a closed, isolated periodic orbit that attracts nearby trajectories. Points in its basin of attraction may be disambiguated by their asymptotic phase. In stochastic systems with approximately periodic trajectories, asymptotic phase is no longer well defined, because all initial densities typically converge to the same stationary measure. We explore circumstances under which one may nevertheless define an analog of the "asymptotic phase". In particular, we consider jump Markov process models incorporating ion channel noise, and study a stochastic version of the classical Morris-Lecar system in this framework. We show that the stochastic asymptotic phase can be defined even for some systems in which no underlying deterministic limit cycle exists, such as an attracting heteroclinic cycle.
Computations with Population Spikes during Different Cortical States
Brain computational challenges vary between behavioral states. Engaged animals react according to incoming sensory information, while in relaxed and sleeping states consolidation of the learned information is believed to take place. Different states are characterized by different forms of cortical activity. We study a possible neuronal mechanism for generating these diverse dynamics and suggest their possible functional significance. Previous studies demonstrated that brief synchronized increases in neural firing (population spikes) can be generated in homogeneous recurrent neural networks with short-term synaptic depression. Here we consider more realistic networks with clustered architecture. We show that the level of synchronization in neural activity can be controlled smoothly by network parameters. The network shifts from asynchronous activity to a regime in which clusters synchronize separately; then the synchronization between the clusters increases gradually to a fully synchronized state. We examine the effects of different synchrony levels on the transmission of information by the network. We find that the regime of intermediate synchronization is preferential for the flow of information between sparsely connected areas. Based on these results, we suggest that the regime of intermediate synchronization corresponds to the engaged behavioral state of the animal, while global synchronization is exhibited during relaxed and sleeping states.
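
The core ingredient can be sketched in a few lines (illustrative parameters, not those of the study): short-term synaptic depression in which each presynaptic spike consumes a fraction U of the available resources x, which then recover with time constant tau_rec, so a regular spike train produces strongly depressing responses - the nonlinearity that makes population spikes possible in recurrent networks.

import numpy as np

dt, tau_rec, U = 0.001, 0.8, 0.5                 # time step (s), recovery (s), release fraction
rate = 20.0                                      # presynaptic firing rate (Hz)
spike_times = np.arange(0.0, 2.0, 1.0 / rate)

x, si, efficacy = 1.0, 0, []
for k in range(int(2.0 / dt)):
    t = k * dt
    x += dt * (1.0 - x) / tau_rec                # resources recover between spikes
    if si < len(spike_times) and t >= spike_times[si]:
        efficacy.append(U * x)                   # strength of this spike's effect
        x -= U * x                               # resources consumed by the spike
        si += 1

print("efficacy of spikes 1, 5, 20:",
      round(efficacy[0], 3), round(efficacy[4], 3), round(efficacy[19], 3))
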
Orientation selectivity without orientation map
Neurons in primary visual cortex (V1) display substantial orientation selectivity even in species where V1 lacks an orientation map, such as mice and rats. The mechanism underlying orientation selectivity in V1 with such a salt-and-pepper organization is unknown; it is unclear whether connectivity that depends on feature similarity is required, or whether random connectivity suffices. Here we argue for the latter. We studied the response of a network model of layer 2/3, with random recurrent connectivity and feedforward input from layer 4 neurons with random preferred orientations, to a drifting grating. We show that even though the total feedforward and total recurrent excitatory and inhibitory inputs all have very weak orientation selectivity, strong selectivity emerges in the neuronal spike responses if the network operates in the balanced excitation/inhibition regime. This is because in this regime the (large) untuned components in the excitatory and inhibitory contributions approximately cancel. As a result, the untuned part of the input into a neuron, as well as its modulation with orientation and time, all have a size comparable to the neuronal threshold. However, the tunings of the F0 and F1 components are uncorrelated, and the high-frequency fluctuations are not tuned. This is reflected in the subthreshold voltage response. Remarkably, due to the nonlinear voltage-to-firing-rate transfer function, the preferred orientations of the F0 and F1 components of the spike response are highly correlated.
Entropy production and phase space structure in models of cortical circuits
Emergent dynamics in a model of visual cortex
I will report on recent work which proposes that the network dynamics of the mammalian visual cortex are neither homogeneous nor synchronous but highly structured and strongly shaped by temporally localized barrages of excitatory and inhibitory firing we call "multiple-firing events" (MFEs). Our proposal is based on careful study of a network of spiking neurons built to reflect the coarse physiology of a small patch of layer 2/3 of V1. When appropriately benchmarked, this network is capable of reproducing the qualitative features of a range of phenomena observed in the real visual cortex, including orientation tuning, spontaneous background patterns, surround suppression and gamma-band oscillations. Detailed investigation into the relevant regimes reveals causal relationships among dynamical events driven by a strong competition between the excitatory and inhibitory populations. Testable predictions are proposed; challenges for mathematical neuroscience will also be discussed. This is joint work with Aaditya Rangan.

Videos

Marty Golubitsky - Wilson's Rivalry Networks and Derived Patterns
Paul Bressloff - Neural field model of binocular rivalry waves in primary visual cortex
Sukbin Lim - Balanced cortical microcircuitry for maintaining short-term memory
Lai-Sang Young - Emergent dynamics in a model of visual cortex
Carson Chow - Finite-size effects in neural networks
Bard Ermentrout - Wandering bumps in stochastic neural fields
Daniel Butts - Detecting the many roles of inhibition in shaping sensory processing
Duane Nykamp - Capturing effective neuronal dynamics in random networks with complex topologies
Sophie Deneve - Balanced spiking networks can implement dynamical systems with predictive coding