### Overview

We will focus on dynamics and information processing in large, nonlinear networks. The aim is to highlight a set of mathematical questions that recur across neuroscience, and to discuss both recent progress and outstanding problems. The final day will feature a series of retrospective talks on the interplay of mathematics and neuroscience, leading into moderated discussions of future prospects. These sessions will align with the major themes of the workshop, which are proposed to be as follows:

**Linking large-scale network structure and dynamics.** The heterogeneous components and vast scale and connectivity of neural systems make for an overwhelming range of possible networks. However, network architectures are constrained by key principles - for example, each cell produces connections of only one sign, leading to non-normal connectivity matrices (Murphy and Miller). What are the consequences of such features for general properties of network dynamics? We will focus on this and other systematic departures from random connectivity, including small-world structures, localized connection "neighborhoods," feedforward inhibition circuits, and the impact of highly recurrent, and hence bistable, network components. In addition, we will cover the latest results on how basic assumptions about single-neuron properties do and do not impact network-wide dynamics.

**Bridging scales: mean-field models.** What mathematical tools can bridge scales from networks of spiking cells to averaged statistical variables that usefully summarize the activity of large networks? Mean-field techniques have yielded major advances in mathematical neuroscience for decades, but many developments remain to be realized, especially for networks with non-sparse connections and hence partial synchrony among spikes.

**Information and coding in large spiking networks.** Information-theoretic studies have shown that certain patterns of correlated, or partially synchronized, spiking across large networks enhance the fidelity with which they can transmit information. But what network dynamics lead to such patterns? We will highlight general mathematical results that connect architecture and information processing.

**Plasticity and learning in network connections.** Perhaps the most fascinating aspect of neural dynamics is how network activity drives network architecture to evolve over time. We will focus on mathematical tools for understanding the consequences of such plasticity rules, both in terms of the general connectivity structures that they produce and in terms of network function - e.g., encoding and releasing "memories" of past inputs. A related theme is robustness and variability in neural circuits - for example, how widely can connection strengths and intrinsic properties vary while preserving the basic features of a network's dynamics? (Note that this last topic may instead be covered in a separate short workshop on control theory in neuroscience.)
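
The sign-constraint point above can be seen in a few lines: a connectivity matrix obeying Dale's law (each presynaptic neuron's outgoing weights share one sign) is generically non-normal. A minimal sketch, with invented toy network sizes and random weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network obeying Dale's law (sizes and weights are invented):
# each presynaptic neuron makes connections of a single sign, so with
# W[i, j] = weight from neuron j onto neuron i, every *column* of W
# is all-positive (excitatory cell) or all-negative (inhibitory cell).
n_exc, n_inh = 8, 4
W = np.abs(rng.normal(size=(n_exc + n_inh, n_exc + n_inh)))
W[:, n_exc:] *= -1.0  # columns of the inhibitory cells

# A matrix is normal iff it commutes with its transpose; the column
# sign structure generically breaks this symmetry, so W is non-normal.
commutator = W @ W.T - W.T @ W
print("non-normal:", np.linalg.norm(commutator) > 1e-6)
```

Non-normality matters dynamically because it permits large transient amplification even when all eigenvalues are stable.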

### Schedule

#### Monday, October 1, 2012

Time | Session |

08:00 AM | Shuttle to MBI |

08:15 AM - 08:45 AM | Breakfast |

08:45 AM - 09:00 AM | Welcome, overview, introductions: Marty Golubitsky |

09:00 AM - 09:45 AM | |

09:45 AM - 10:30 AM | Lai-Sang Young - Emergent dynamics in a model of visual cortex. I will report on recent work which proposes that the network dynamics of the mammalian visual cortex are neither homogeneous nor synchronous but highly structured and strongly shaped by temporally localized barrages of excitatory and inhibitory firing we call "multiple-firing events" (MFEs). Our proposal is based on careful study of a network of spiking neurons built to reflect the coarse physiology of a small patch of layer 2/3 of V1. When appropriately benchmarked, this network is capable of reproducing the qualitative features of a range of phenomena observed in the real visual cortex, including orientation tuning, spontaneous background patterns, surround suppression, and gamma-band oscillations. Detailed investigation into the relevant regimes reveals causal relationships among dynamical events driven by a strong competition between the excitatory and inhibitory populations. Testable predictions are proposed; challenges for mathematical neuroscience will also be discussed. This is joint work with Aaditya Rangan. |

10:30 AM - 11:00 AM | Break |

11:00 AM - 11:45 AM | Dean Buonomano - Complexity without chaos: Plasticity within random recurrent networks generates locally stable neural trajectories. A number of related neurocomputational models have proposed that brain computations rely on the evolving dynamics of recurrent neural networks. In contrast to conventional attractor models, in this framework (which includes state-dependent networks, liquid-state machines, and echo-state machines) computations arise from the voyage through state space rather than the arrival at a given location. To date, however, these models have been limited by two facts. First, the regimes that are potentially the most powerful from a computational perspective are generally chaotic. Second, while synapses in cortical networks are plastic, incorporating robust forms of plasticity into simulated recurrent networks (that exhibit self-perpetuating dynamics) has proved very challenging. We address both problems by demonstrating how random recurrent networks (RRNs) that initially exhibit self-perpetuating and chaotic dynamics can be tuned through a supervised learning rule to generate locally stable patterns of neural activity. The outcome is a novel neural network regime that exhibits both transiently stable and chaotic trajectories. We further show that the recurrent learning rule dramatically increases the ability of RRNs to generate complex spatiotemporal motor patterns, and accounts for recent experimental data showing a decrease in neural variability in response to stimulus onset. |

11:45 AM - 12:30 PM | Ken Miller - The stabilized supralinear network: A unifying circuit motif underlying multi-input integration in sensory cortex. Neurons in sensory cortex integrate multiple influences to parse objects and support perception. For weak stimuli, responses to multiple driving stimuli can add supralinearly and modulatory contextual influences can facilitate. Stronger stimuli yield sublinear response summation ("normalization"), which also shapes attentional influences, and contextual suppression. Understanding the circuit operations underlying these diverse phenomena is critical to understanding cortical function and disease. I will present a simple, general theory, showing that a wealth of integrative properties -- including the above, certain spatially periodic behaviors, and stimulus-evoked noise suppression -- arise robustly from dynamics induced by three properties of cortical circuitry: (1) short-range inhibitory and longer-range excitatory connections; (2) strong feedback inhibition; (3) supralinear neuronal input/output functions. The supralinear input/output function quite generally creates a transition from supralinear response summation for weak stimuli to sublinear summation for stronger stimuli, as the subnetwork of excitatory neurons becomes increasingly unstable for stronger stimuli but is dynamically stabilized by feedback inhibition. In new recordings in visual cortex we have confirmed key model predictions. |
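
The supralinear-to-sublinear transition can be reproduced in a two-population rate sketch with power-law input/output functions; all parameter values below are illustrative choices of mine, not those of the talk.

```python
import numpy as np

# Two-population (E, I) rate sketch of a stabilized supralinear network.
# Both populations have the power-law input/output function r = k*[z]_+^n.
k, n = 0.04, 2.0
W = np.array([[1.0, -0.4],    # onto E: from E, from I
              [1.5, -0.5]])   # onto I: from E, from I
tau = np.array([20.0, 10.0])  # membrane time constants (ms)
dt = 0.05

def steady_E_rate(c, steps=40000):
    """Steady-state excitatory rate for equal feedforward input c to E and I."""
    r = np.zeros(2)
    h = np.array([c, c])
    for _ in range(steps):  # forward Euler to the fixed point
        z = W @ r + h
        r += dt / tau * (-r + k * np.maximum(z, 0.0) ** n)
    return r[0]

ratios = {}
for c in (1.0, 100.0):
    ratios[c] = steady_E_rate(2 * c) / steady_E_rate(c)
    print(f"input {c:5.1f}: doubling the input multiplies the E rate by {ratios[c]:.2f}")
```

For the weak input the response roughly quadruples (supralinear summation, set by the power law), while for the strong input the inhibition-stabilized regime yields roughly a doubling or less (sublinear, normalization-like summation).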

12:30 PM - 02:00 PM | Lunch Break |

02:00 PM - 02:45 PM | |

02:45 PM - 05:30 PM | Informal Discussions |

05:30 PM - 07:00 PM | Reception and poster session in MBI Lounge |

07:00 PM | Shuttle pick-up from MBI |

#### Tuesday, October 2, 2012

Time | Session |

08:30 AM | Shuttle to MBI |

08:45 AM - 09:00 AM | Breakfast |

09:00 AM - 09:45 AM | Bard Ermentrout - Wandering bumps in stochastic neural fields. We study the effects of noise on stationary pulse solutions (bumps) in spatially extended neural fields. The dynamics of a neural field are described by an integrodifferential equation whose integral term characterizes synaptic interactions between neurons in different spatial locations of the network. Translationally symmetric neural fields support a continuum of stationary bump solutions, which may be centered at any spatial location. Random fluctuations are introduced by modeling the system as a spatially extended Langevin equation whose noise term we take to be additive. For nonzero noise, bumps are shown to wander about the domain in a purely diffusive way. We can approximate the associated diffusion coefficient using a small-noise expansion. Upon breaking the (continuous) translation symmetry of the system using spatially heterogeneous inputs or synapses, bumps in the stochastic neural field can become temporarily pinned to a finite number of locations in the network. As a result, the effective diffusion of the bump is reduced in comparison to the homogeneous case. As the modulation frequency of this heterogeneity increases, the effective diffusion of bumps in the network approaches that of the network with spatially homogeneous weights. We end with some simulations of spiking models which show the same dynamics. (This is joint work with Zachary Kilpatrick, UH.) |
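
As a rough illustration of such wandering, a minimal Amari-type ring model with a Heaviside firing rate and weak additive noise (all parameters below are my own choices, not the talk's model) supports a bump that persists but whose center drifts:

```python
import numpy as np

rng = np.random.default_rng(2)

# Amari-type neural field on a ring:
#   du/dt = -u + \int cos(x - y) H(u(y) - theta) dy + additive noise,
# with Heaviside firing H. The deterministic bump u(x) = 2 sin(a) cos(x),
# with sin(2a) = theta, persists under weak noise but its center wanders.
N = 128
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
theta, dt, sigma = 0.5, 0.05, 0.05

a = 0.5 * (np.pi - np.arcsin(theta))   # half-width of the stable (wide) bump
u = 2 * np.sin(a) * np.cos(x)          # start on the deterministic bump

def centroid(u):
    """Circular center of mass of the positive part of the profile."""
    return np.angle(np.sum(np.maximum(u, 0.0) * np.exp(1j * x)))

start = centroid(u)
for _ in range(4000):  # Euler-Maruyama steps
    f = (u > theta).astype(float)
    # the cos(x - y) kernel is rank two, so the convolution is two inner products
    conv = np.cos(x) * np.sum(np.cos(x) * f) * dx + np.sin(x) * np.sum(np.sin(x) * f) * dx
    u += dt * (-u + conv) + sigma * np.sqrt(dt) * rng.normal(size=N)

survived = u.max() > theta
drift = abs(np.angle(np.exp(1j * (centroid(u) - start))))
print("bump survives:", survived)
print(f"centroid drifted by {drift:.3f} radians")
```

Repeating the run over many noise realizations and plotting the variance of the drift against time would recover the effectively diffusive motion whose coefficient the talk's small-noise expansion approximates.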

09:45 AM - 10:30 AM | Carson Chow - Finite-size effects in neural networks. The dynamics of neural networks have traditionally been analyzed for small systems or in the infinite size mean field limit. While both of these approaches have made great strides in understanding these systems, large but finite-sized networks have not been explored as much analytically. Here, I will show how the dynamical behavior of finite-sized systems can be inferred by expanding in the inverse system-size around the mean field solution. The approach can also be used to solve the inverse problem of inferring the effective dynamics of a single neuron embedded in a large network where only incomplete information is available. The formalism I will outline can be generalized to any high dimensional dynamical system. |
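
The basic scaling behind such finite-size corrections, namely that fluctuations of the population-averaged activity shrink like 1/N around the mean-field fixed point, can be checked in a toy stochastic binary network (an illustrative model of mine, not the talk's formalism):

```python
import numpy as np

rng = np.random.default_rng(3)

def activity_variance(N, steps=4000, burn=500):
    """Variance of the population-averaged activity of a toy network.

    N binary neurons each fire on a time step with probability
    sigmoid(J*m + h), where m is the current population mean; the
    mean-field fixed point here is m* = 0.5.
    """
    J, h = 2.0, -1.0
    m = 0.5
    samples = []
    for t in range(steps):
        p = 1.0 / (1.0 + np.exp(-(J * m + h)))
        m = rng.binomial(N, p) / N      # finite-size sampling noise
        if t >= burn:
            samples.append(m)
    return np.var(samples)

# Fluctuation variance scales like 1/N, so a 16-fold increase in N
# should shrink the variance roughly 16-fold.
v_small, v_large = activity_variance(200), activity_variance(3200)
print(f"variance ratio (N=200 over N=3200): {v_small / v_large:.1f}")
```

A systematic inverse-system-size expansion goes well beyond this linear-noise picture, but the leading-order scaling is already visible in the toy model.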

10:30 AM - 11:00 AM | Break |

11:00 AM - 11:45 AM | Brent Doiron - Slow dynamics and high variability in balanced cortical networks with clustered excitatory connections. Anatomical studies show that excitatory connections in cortex are not uniformly distributed across a network but instead exhibit clustering into groups of highly connected neurons. The implications of clustering for cortical activity are unclear. We study the effect of clustered excitatory connections on the dynamics of neuronal networks that exhibit high spike time variability due to a balance between excitation and inhibition. Even modest clustering substantially changes the behavior of these networks, introducing slow dynamics where clusters of neurons transiently increase or decrease their firing rate. Consequently, neurons exhibit both short-timescale spiking variability and long-timescale firing rate fluctuations. We show that stimuli bias networks toward particular activity states, suppressing the mechanisms underlying slow-timescale dynamics and thereby reducing firing rate variability in evoked states compared to spontaneous states, as observed experimentally in many cortical systems. Our model thus relates cortical architecture to the reported variability in spontaneous and evoked spiking activity. |

11:45 AM - 12:30 PM | |

12:30 PM - 02:30 PM | Lunch Break |

02:30 PM - 03:00 PM | |

03:00 PM - 03:30 PM | |

03:30 PM - 05:30 PM | Informal Discussion |

05:30 PM | Shuttle pick-up from MBI |

06:30 PM - 07:00 PM | Cash Bar |

07:00 PM | Banquet in the Fusion Room @ Crowne Plaza Hotel |

#### Wednesday, October 3, 2012

Time | Session |

08:15 AM | Shuttle to MBI |

08:45 AM - 09:00 AM | Breakfast |

09:00 AM - 09:45 AM | - A Mathematical Theory of Semantic Development. A wide array of psychology experiments have revealed remarkable regularities in the developmental time course of infant semantic cognition, as well as its progressive disintegration in adult dementia. For example, infants tend to acquire the ability to make broad categorical distinctions between concepts before they can make finer scale distinctions, and this process is reversed in dementia, where finer scale categorical distinctions are lost before broad distinctions. We develop a phenomenological, mathematical theory of this process through an analysis of the learning dynamics of multilayer networks exposed to hierarchically structured data. We find new exact solutions to the nonlinear dynamics of error-corrective learning in deep, three-layer networks. These solutions reveal that networks learn input-output covariation structure on a time scale that is inversely proportional to their statistical strength. We further analyze the covariance structure of hierarchical generative models, and show how data generated from such models yield a hierarchy of input-output modes, leading to a hierarchy of time scales over which such modes are learned. When combined, these results provide a unified, phenomenological account of the time course of acquisition and disintegration of semantic knowledge. Joint work with Andrew Saxe and Jay McClelland. |

09:45 AM - 10:30 AM | Dan Butts - Detecting the many roles of inhibition in shaping sensory processing. Inhibition is a component of nearly every neural system, and an increasingly prevalent component in theoretical network models. However, its role in sensory processing is often difficult to measure or infer directly. Using a nonlinear modeling framework that can infer the presence and stimulus tuning of inhibition from extracellular and intracellular recordings, I will describe different forms of inferred inhibition (subtractive and multiplicative) and suggest multiple roles in sensory processing. I will primarily refer to studies in the retina, where inhibition likely contributes to contrast adaptation, the generation of precise timing, and the diversity of computation among different retinal ganglion cell types. I will also describe its roles in shaping sensory processing in other areas, including auditory areas and visual cortex. Understanding the role of inhibition in neural processing can both inform a richer view of how single-neuron processing contributes to network behavior and provide tools for validating network models against neural data. |

10:30 AM - 11:00 AM | Break |

11:00 AM - 11:45 AM | |

11:45 AM - 12:30 PM | |

12:30 PM - 02:30 PM | Lunch Break |

02:30 PM - 03:00 PM | |

03:00 PM - 03:30 PM | |

03:30 PM - 05:30 PM | Informal Discussions |

05:30 PM | Shuttle pick-up from MBI |

#### Thursday, October 4, 2012

Time | Session |

08:15 AM | Shuttle to MBI |

08:45 AM - 09:00 AM | Breakfast |

09:00 AM - 09:30 AM | Duane Nykamp - Capturing effective neuronal dynamics in random networks with complex topologies. We introduce a random network model in which one can prescribe the frequency of second-order edge motifs. We derive effective equations for the activity of spiking neuron models coupled via such networks. A key consequence of the motif-induced edge correlations is that one cannot derive closed equations for the average activity of the nodes (the average firing rates of neurons); instead, one must develop the equations in terms of the average activity of the edges (the synaptic drives). As a result, the network topology increases the dimension of the effective dynamics and allows for a larger repertoire of behavior. We demonstrate this behavior through simulations of spiking neuronal networks. |
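
For orientation, the four second-order edge motifs (reciprocal, convergent, divergent, and chain) can be counted and compared against chance in a plain Erdős–Rényi graph, the baseline from which a prescribed-motif ensemble departs; this toy count is mine, not the talk's construction:

```python
import numpy as np

rng = np.random.default_rng(5)

# Directed Erdos-Renyi graph: A[i, j] = 1 means an edge from neuron i to
# neuron j, each present independently with probability p (no self-loops).
N, p = 400, 0.05
A = (rng.random((N, N)) < p).astype(float)
np.fill_diagonal(A, 0)

indeg, outdeg = A.sum(axis=0), A.sum(axis=1)
motifs = {
    # motif: (observed count, expected count under independent edges)
    "reciprocal": (np.sum(A * A.T) / 2,               N * (N - 1) / 2 * p**2),
    "convergent": (np.sum(indeg * (indeg - 1)) / 2,   N * (N - 1) * (N - 2) / 2 * p**2),
    "divergent":  (np.sum(outdeg * (outdeg - 1)) / 2, N * (N - 1) * (N - 2) / 2 * p**2),
    "chain":      (np.sum(A @ A) - np.trace(A @ A),   N * (N - 1) * (N - 2) * p**2),
}
for name, (count, chance) in motifs.items():
    print(f"{name:10s}: {count:9.0f} observed vs {chance:9.0f} expected by chance")
```

In the Erdős–Rényi case all four counts sit at chance; the talk's model prescribes each motif frequency independently, and it is exactly these deviations that force the effective equations onto edge (synaptic-drive) variables rather than node rates.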

09:30 AM - 10:00 AM | Peter Thomas - Redefining Asymptotic Phase for Stochastic Conductance-Based Neural Models with Statistical Limit Cycles. In deterministic dynamics, a stable limit cycle is a closed, isolated periodic orbit that attracts nearby trajectories. Points in its basin of attraction may be disambiguated by their asymptotic phase. In stochastic systems with approximately periodic trajectories, asymptotic phase is no longer well defined, because all initial densities typically converge to the same stationary measure. We explore circumstances under which one may nevertheless define an analog of the "asymptotic phase". In particular, we consider jump Markov process models incorporating ion channel noise, and study a stochastic version of the classical Morris-Lecar system in this framework. We also show that the stochastic asymptotic phase can be defined even for some systems in which no underlying deterministic limit cycle exists, such as an attracting heteroclinic cycle. |

10:00 AM - 10:30 AM | Jon Rubin |

10:30 AM - 11:00 AM | Break |

11:00 AM - 11:30 AM | Marty Golubitsky |

11:30 AM - 12:00 PM | David Terman |

12:00 PM - 12:30 PM | |

12:30 PM - 02:30 PM | Lunch Break |

02:30 PM - 05:30 PM | Late-breaking topics and small group discussions |

05:30 PM | Shuttle pick-up from MBI |

#### Friday, October 5, 2012

Time | Session |

08:15 AM | Shuttle to MBI |

08:45 AM - 09:00 AM | Breakfast |

09:00 AM - 09:45 AM | Fred Wolf - Entropy production and phase space structure in models of cortical circuits |

09:45 AM - 10:30 AM | |

10:30 AM - 11:00 AM | Break |

11:00 AM - 11:45 AM | |

11:45 AM - 12:30 PM | Final informal discussion |

12:30 PM | Shuttle pick-up from MBI |

### Participants

Name | Email | Affiliation

Amro, Rami | ramiamromath233_1@yahoo.com | Physics and Astronomy, Ohio University |

Anderson, David | anderson@math.wisc.edu | Mathematics, University of Wisconsin |

Barreiro, Andrea | abarreiro@smu.edu | Mathematics, Southern Methodist University |

Belykh, Igor | ibelykh@gsu.edu | Mathematics and Statistics, Georgia State University |

Billock, Vincent | vincent.billock.ctr@wpafb.af.mil | National Research Council, U.S. Air Force Research Laboratory |

Borisyuk, Alla | borisyuk@math.utah.edu | Mathematics, University of Utah |

Bressloff, Paul | bressloff@math.utah.edu | Department of Mathematics, University of Utah |

Brunel, Nicolas | nicolas.brunel@univ-paris5.fr | Statistics and Neurobiology, University of Chicago |

Buonomano, Dean | dbuono@ucla.edu | Neurobiology, University of California, Los Angeles |

Butts, Daniel | dab@umd.edu | Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland |

Cayco Gajic, Natasha | caycogajic@gmail.com | Applied Mathematics, University of Washington |

Chow, Carson | carsonc@mail.nih.gov | Laboratory of Biological Modeling, National Institutes of Health |

Clopath, Claudia | cc3450@columbia.edu | Center for Theoretical Neuroscience, Columbia University |

Curto, Carina | ccurto2@math.unl.edu | Mathematics, University of Nebraska |

Curtu, Rodica | Rodica-Curtu@uiowa.edu | Mathematics, University of Iowa |

Deneve, Sophie | sophie.deneve@ens.fr | Group for Neural Theory, École Normale Supérieure

DeWoskin, Daniel | dewoskin@umich.edu | Mathematics, University of Michigan |

Doiron, Brent | bdoiron@pitt.edu | Mathematics, University of Pittsburgh |

Engelken, Rainer | rainer@nld.ds.mpg.de | Nonlinear Dynamics, Max-Planck Institute for Dynamics and Self-Organization |

Ermentrout, Bard | bard@pitt.edu | Department of Mathematics, University of Pittsburgh |

Farkhooi, Farzad | farzad@zedat.fu-berlin.de | Theoretical Neuroscience & Neuroinformatics, Freie Universität Berlin

Fiete, Ila | fiete@mail.clm.utexas.edu | The Center for Learning and Memory, University of Texas |

Franci, Alessio | afranci@ulg.ac.be | Systems and modeling group, University of Liege |

Giusti, Chad | cgiusti2@unl.edu | Mathematics, University of Nebraska |

Goldwyn, Joshua | jhg262@nyu.edu | Applied Mathematics, University of Washington |

Greenwood, Priscilla | pgreenw@math.asu.edu | Department of Mathematics, University of British Columbia

Hillar, Christopher | chillar@msri.org | Redwood Center for Theoretical Neuroscience, MSRI / Berkeley

Hirschauer, Thomas | thomas.hirschauer@osumc.edu | Neuroscience Graduate Studies Program, The Ohio State University |

Holmes, William | holmes@ohio.edu | Biological Sciences, Ohio University |

Hu, Yu | huyu@uw.edu | Applied Mathematics, University of Washington |

Joo, Jaewook | jjoo1@utk.edu | Physics, University of Tennessee |

Kilpatrick, Zachary | zpkilpat@pitt.edu | Mathematics, University of Houston |

Kim, Jae Kyoung | jaekkim@umich.edu | Mathematics, |

Kim, Chris | cmkim@umn.edu | School of Mathematics, University of Minnesota |

Kramer, Peter | kramep@rpi.edu | Mathematical Sciences, Rensselaer Polytechnic Institute |

Kurtz, Thomas | kurtz@math.wisc.edu | Mathematics and Statistics, University of Wisconsin |

Laing, Carlo | c.r.laing@massey.ac.nz | IIMS, Massey University |

Lajoie, Guillaume | glajoie@amath.washington.edu | Applied Mathematics, University of Washington |

Lim, Sukbin | sblim@ucdavis.edu | Center for Neuroscience, University of California, Davis

Lin, Kevin | klin@math.arizona.edu | Department of Mathematics, University of Arizona |

Miller, Ken | ken@neurotheory.columbia.edu | Center for Theoretical Neuroscience and Dept. of Neuroscience, Columbia University

Molkov, Yaroslav | ymolkov@iupui.edu | Department of Mathematical Sciences, Indiana University--Purdue University |

Murillo, Anarina | anarina.murillo@asu.edu | Mathematical, Computational, and Modeling Sciences Center, Arizona State University |

Njap, Felix | fnjap2001@gmail.com | Signal Processing, GS-CMLS

Nykamp, Duane | nykamp@umn.edu | School of Mathematics, University of Minnesota |

Ostojic, Srdjan | srdjan.ostojic@ens.fr | Laboratoire de Neurosciences Cognitives, Ecole Normale Superieure |

Patel, Mainak | mainak@math.duke.edu | Mathematics, Duke University |

Prinz, Astrid | astrid.prinz@emory.edu | Biology, Emory University |

Puelma Touzel, Maximilian | mptouzel@nld.ds.mpg.de | Nonlinear Dynamics, Max Planck Institute for Dynamics and Selforganization |

Radulescu, Anca | radulesc@colorado.edu | Mathematics, University of Colorado |

Rajan, Kanaka | krajan@princeton.edu | Biophysics, Princeton University |

Rinzel, John | rinzel@cns.nyu.edu | Center for Neural Science & Courant Institute, New York University |

Rotstein, Horacio | horacio@njit.edu | Mathematical Sciences, New Jersey Institute of Technology |

Rotter, Stefan | stefan.rotter@biologie.uni-freiburg.de | Bernstein Center Freiburg & Faculty of Biology, University of Freiburg |

Sauer, Tim | tsauer@gmu.edu | Department of Mathematics, George Mason University |

Schiff, Steven | sjs49@engr.psu.edu | Depts. Neurosurgery / Eng Science & Mechanics / Physics, Pennsylvania State University |

Sharpee, Tatyana | sharpee@salk.edu | Computational Neurobiology Laboratory, The Salk Institute for Biological Studies |

Shea-Brown, Eric | etsb@washington.edu | Applied Mathematics, University of Washington |

Shiau, LieJune | shiau@uhcl.edu | Mathematics, University of Houston--Clear Lake |

Shilnikov, Andrey | ashilnikov@gsu.edu | Neuroscience Institute, Georgia State University |

Smith, Ruth | pmxrs3@nottingham.ac.uk | University of Nottingham |

Stewart, Ian | I.N.Stewart@warwick.ac.uk | Dept. of Mathematics, University of Warwick |

Thieullen, Michèle | michele.thieullen@upmc.fr | Laboratoire de Probabilités et Modèles Aléatoires, Université Pierre et Marie Curie

Thomas, Peter | peter.j.thomas@case.edu | Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University |

Trenado, Carlos | trenado@cdb-unit.de | Institute for Clinical Neuroscience, Heinrich-Heine-Universität Düsseldorf

Tsodyks, Misha | misha@weizmann.ac.il | Neurobiology, Weizmann Institute of Science |

Urban, Alexander | alexanderdarius28@gmail.com | Human Research and Engineering Division, Oak Ridge Associated Universities (ARL postdoc program) |

van Vreeswijk, Carl | carl.van-vreeswijk@biomedicale.univ-paris5.fr | Neurophysics and Physiology, Paris Descartes University |

Wang, Xiao-Jing | xjwang@yale.edu | Neuroscience, New York University |

Wedgwood, Kyle | kyz1024@gmail.com | Mathematical Sciences, University of Nottingham |

Wolf, Fred | fred@nld.ds.mpg.de | Nonlinear Dynamics, Max-Planck Institute for Dynamics and Self-Organization |

Young, Lai-Sang | lsy@cims.nyu.edu | Mathematics, New York University |

### Abstracts

**Wilson's Rivalry Networks and Derived Patterns**

Marty Golubitsky

**Neural field model of binocular rivalry waves in primary visual cortex**

Paul Bressloff Neural fields model the large-scale dynamics of spatially structured cortical networks in terms of continuum integro-differential equations, whose associated integral kernels represent the spatial distribution of neuronal synaptic connections. The ad

**Balanced cortical microcircuitry for maintaining short-term memory**

Sukbin Lim Persistent neural activity in the absence of a stimulus has been identified as a neural correlate of working memory, but how such activity is maintained by neocortical circuits remains unknown. Here we show that the inhibitory and excitatory microcir

**Balanced spiking networks can implement dynamical systems with predictive coding**

Sophie Deneve Neural networks can integrate sensory information and generate continuously varying outputs, even though individual neurons communicate only with spikes---all-or-none events. Here we show how this can be done efficiently if spikes communicate "p