## Workshop 3: Dynamical Systems and Data Analysis in Neuroscience: Bridging the Gap

### Organizers

Casey Diekman
Department of Mathematical Sciences, New Jersey Institute of Technology
Uri Eden
Mathematics and Statistics, Boston University
Leslie Kay
Dept. of Psychology, University of Chicago
Mark Kramer
Math and Stats, Boston University
Horacio Rotstein
Mathematical Sciences, New Jersey Institute of Technology

Powerful technologies exist - and continue to emerge - for large-scale recording and manipulation of neuronal activity. However, even as neuronal data become increasingly rich and imaging procedures increasingly sophisticated, only a small fraction of neuronal activity (e.g., a neuron's voltage or calcium flux) can be observed. A fundamental challenge in neuroscience is to link these observed activities to the unobserved biophysical mechanisms that produce them. In some cases, experimental manipulations permit detailed assessment of these mechanisms. However, as emerging technologies facilitate high-throughput experiments and increase the number of neurons observed simultaneously, targeted experimental manipulations become infeasible or intractable. The challenge of rigorously connecting observed neuronal activity to underlying biophysical mechanisms is therefore becoming increasingly critical to addressing fundamental questions in neuroscience.

One powerful strategy for linking data to mechanisms is the development of computational models in which unobserved biological mechanisms can be expressed and controlled. These computational models typically possess many variables and parameters, and rigorously 'matching' the model dynamics to the observed neuronal data remains difficult. In this workshop, we will assemble an interdisciplinary group of researchers who combine expertise in complex neuronal data, computational modeling, and statistical methods, with a focus on methodologies that constrain the diverse biophysical mechanisms supporting neuronal activity. In doing so, a primary goal will be to develop and share resources that facilitate a coherent synthesis of information collected from observational data and expressed in computational models, and that provide rigorous approaches for uncertainty management and model validation.
Critical to the development of this framework are interactions among statistical neuroscientists, mathematical neuroscientists, and experimental / clinical neuroscientists. However, these interactions are typically limited, and the methodologies applied to a given data set may therefore be quite distinct. Among the reasons for this segregation is the lack of a forum for reconciling, in a pedagogical manner, differences in the language and approaches typical of each discipline. For example, the notion of a 'model' for a statistical neuroscientist (e.g., a Poisson process), a mathematical neuroscientist (e.g., the Hodgkin-Huxley equations), and a clinical neuroscientist (e.g., an in vitro preparation) may be quite different. This workshop will bring together a diverse group of researchers interested in principled statistical techniques that link complex neuronal data with mathematical models of neuronal activity. Researchers will present a variety of data assimilation techniques, including estimation of model parameters and of latent dynamical variables, model identification, goodness-of-fit assessment, and a broader characterization of the space of dynamical models consistent with complex neuronal data. Additionally, we recognize that other fields, notably climate modeling and weather prediction, have made progress assimilating complex data into dynamical models. Representative researchers from these fields will be invited to share their insights and experiences, and to explore the utility of their techniques on physiological data. The workshop will include research talks and informal tutorials, and will be appropriate both for researchers in the field and for those interested in methods that connect statistical, mathematical, and experimental neuroscience through principled data assimilation.

The workshop will run for five days: a two-day tutorial followed by three days of research talks, with ample time for interactions among the participants. Students and postdocs will be encouraged to participate in the poster session. The tutorials will include not only introductory talks but also hands-on work with realistic data, which will be provided to participants in advance of the workshop.

### Accepted Speakers

Henry Abarbanel
Department of Physical Sciences, University of California, San Diego
Asohan Amarasingham
Department of Mathematics, City College of New York
Alla Borisyuk
Mathematics, University of Utah
Nicolas Brunel
Statistics and Neurobiology, University of Chicago
Carina Curto
Mathematics, Pennsylvania State University
Susanne Ditlevsen
Department of Mathematical Sciences, University of Copenhagen
Fernando Fernandez
Biomedical Engineering, Boston University
Joshua Goldwyn
Mathematics, The Ohio State University
Robert Kass
Department of Statistics and Center for the Neural Basis of Cognition, Carnegie-Mellon University
Don Katz
Psychology/Neuroscience, Brandeis University
Kyle Lepage
Modeling and Theory, Allen Institute for Brain Science
Tay Netoff
Biomedical Engineering, University of Minnesota
Montreal Neurological Institute, McGill University
Jonathan Pillow
Psychology & Neurobiology, University of Texas
Mason Porter
Mathematical Institute, University of Oxford
Alex Roxin
Computational Neuroscience Group, Centre de Recerca Matemàtica
Leonid Rubchinsky
Department of Mathematical Sciences and Stark Neurosciences Research Institute at the Indiana University School of Medicine, Indiana University-Purdue University
Sridevi Sarma
Biomedical Engineering, Johns Hopkins University
Steven Schiff
Depts. Neurosurgery / Eng Science & Mechanics / Physics, Pennsylvania State University
Frances Skinner
Krembil Research Institute, University Health Network
Sara Solla
Department of Physiology, Northwestern University Medical School
Esteban Tabak
Courant Institute of Mathematical Sciences, New York University
Peter Thomas
Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University
Monday, October 17, 2016
Time Session
07:45 AM

Shuttle to MBI

08:00 AM
08:45 AM

Breakfast

08:45 AM
09:00 AM

MBI Welcome

09:00 AM
09:30 AM
Leslie Kay - Introduction: Interplay of data and models

TBD

09:30 AM
10:45 AM
Uri Eden, Mark Kramer - Statistical neuronal models: an overview
10:45 AM
11:15 AM

Break

11:15 AM
12:30 PM
Horacio Rotstein, Casey Diekman - Dynamic (deterministic) neuronal models: an overview
12:30 PM
02:00 PM

Lunch Break

02:00 PM
03:30 PM

Interplay of dynamic and statistical models: open discussion and dataset challenge (organizers)

03:30 PM
04:00 PM

Break

04:00 PM
05:00 PM
Robert Kass - Some Statistical Issues in Spike Train Analysis

A now-standard framework for statistical analysis of spike train data, based on point process regression using the modern methodology of generalized linear models (GLMs), is closely related to the leaky integrate-and-fire conception. I will review some of the work I've been involved with in developing point process regression, and will then spend some time discussing what I consider to be the biggest outstanding problem we face in analyzing spike train data.
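As a minimal illustration of the point-process GLM framework (a sketch, not Prof. Kass's own code; the covariate and coefficients are invented), one can simulate binned spike counts with a log-linear rate and recover the coefficients by Newton's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate binned spike counts from an inhomogeneous Poisson process
# whose log-rate depends linearly on a covariate (hypothetical stimulus).
n_bins = 5000
x = rng.normal(size=n_bins)          # covariate value in each time bin
beta_true = np.array([0.5, 0.8])     # intercept and stimulus weight
counts = rng.poisson(np.exp(beta_true[0] + beta_true[1] * x))

# Fit the Poisson GLM by Newton's method on the log-likelihood.
X = np.column_stack([np.ones(n_bins), x])
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)                 # conditional intensity per bin
    grad = X.T @ (counts - mu)            # score
    hess = -(X * mu[:, None]).T @ X       # Hessian of the log-likelihood
    beta -= np.linalg.solve(hess, grad)   # Newton ascent step

print(beta)  # close to beta_true
```

In practice, spike-history terms, smoothing penalties, and goodness-of-fit diagnostics (e.g., the time-rescaling theorem) would be layered on top of this skeleton.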

05:00 PM
07:00 PM

Reception and Poster session

07:00 PM

Shuttle pick-up from MBI

Tuesday, October 18, 2016
Time Session
08:00 AM

Shuttle to MBI

08:15 AM
09:00 AM

Breakfast

09:00 AM
09:45 AM
Henry Abarbanel - Data Assimilation for Verifying Models of Neurons and Networks

Using data assimilation tools built via methods of statistical physics, we discuss ideas and applications to experimental data for estimating properties of Hodgkin-Huxley models of individual neuron biophysics. The experimental data come primarily from nucleus HVC of the avian song production system.

We then discuss new tools for the analysis of networks of neurons using extracellular recordings and results from the theory of electrostatics.

09:45 AM
10:30 AM
Frances Skinner - Building bridges between model and experiment to obtain an essence of theta rhythm generation

Oscillatory activities are hallmarks of brain output that are linked to normal and pathological functioning. Thus, determining mechanisms for how brain oscillations are generated is essential. However, the multi-scale, nonlinear nature of our brains makes them highly challenging to understand. In particular, theta oscillations (3-12 Hz) were discovered almost 80 years ago and are one of the most robust oscillations in the brain, including the hippocampus where they are associated with exploration. Although several cellular-based network models of varying levels of complexity have been developed, it is still unclear how theta oscillations are generated in the hippocampus. In this talk, I will describe the development of our cellular-based network models where we have taken advantage of an *in vitro* whole hippocampus preparation that spontaneously generates theta rhythms. Using theoretical insights and biological constraints, our developed models can produce theta rhythms, thus suggesting the underlying essence of their generation.

10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
Sridevi Sarma - Fragility in the human decision making system: when irrationality hijacks logic

Decision-making links cognition to behavior and is a key driver of human personality, fundamental for survival, and essential for our ability to learn and adapt. It has been well established that humans make logical decisions where they maximize an expected reward, but this rationality is influenced by their internal biases (e.g. emotional state, preferences). Psychiatric patients who have dysfunctional cognitive and emotional circuitry frequently have severe alterations in decision-making. Unfortunately, the function of relevant neural circuits in humans is largely uncharted at fine temporal scales, severely limiting the understanding of changes underlying disruption associated with age or psychiatric diseases. In this study, we localize neural populations, circuits, and their temporal patterns on a millisecond scale that are critically involved in human decision-making.

Twelve human subjects, implanted with multiple depth electrodes for clinical purposes, performed a gambling task while we recorded local field potential neural activity from deep and peripheral brain structures. We propose a dynamical systems model to explain the individual variability in decision making. We then identify neural correlates of the model variables. Our models suggest a spectrum of decision-makers that ranges from irrational to logical, and analyses of the neural data show that specific oscillations in brain structures, including the anterior insula, amygdala, and cingulate cortex, profoundly influence betting behavior (what you bet and how quickly you make the bet). These findings provide new insight into how humans link their internal biases (e.g., emotions) to decisions.

11:45 AM
12:30 PM
Don Katz - Sequences of attractor-like states and the prediction of consumption behavior in single trials

Neural responses to taste administration are highly dynamic: single-neuron responses reflect first taste presence, then taste identity, and then taste palatability (an experience-dependent variable that is intimately tied to consumption/rejection decisions), all within the first second following administration. Our ensemble analysis reveals the firing-rate transitions between these response "epochs" to be precipitous, near-instantaneous, and coherent across populations of simultaneously recorded cortical neural ensembles; the sequence of attractor-like population states is highly reliable, but the timing of any particular transition varies from trial to trial. Furthermore, the onset of the late, palatability-related state provides a high-quality prediction of decision-related behavior (the latency of which also varies widely from trial to trial). Together, our data reveal a dynamical characterization of the taste system in action.

12:30 PM
02:00 PM

Lunch Break

02:00 PM
02:45 PM
Jonathan Pillow - Inferring synaptic conductances from spikes using a biophysically inspired extension of the Poisson generalized linear model
02:45 PM
03:45 PM
Steven Schiff - Model-based Observation and Control for the Brain: From Control of Seizures and Migraines, to Reducing Infant Brain Infections in Africa

Since the 1950s, we have developed mature theories of modern control and computational neuroscience with little interaction between these disciplines. With the advent of computationally efficient nonlinear Kalman filtering techniques (developed in robotics and weather prediction), along with improved neuroscience models that provide increasingly accurate reconstruction of dynamics in a variety of normal and disease states in the brain, the prospects for synergistic interaction between these fields are now strong. I will show recent examples of the use of nonlinear control theory for the assimilation and control of single-neuron and network dynamics, a control framework for Parkinson's disease, and the potential for unification in control of spreading depression and seizures. Recent results help explain why the subtle and deep intersection of symmetry, in brains and models, is important to take into account in this transdisciplinary fusion of computational models of the computational brain with real-time control. Lastly, I will describe how such symmetries apply to network optimization and control for the prevention of infant brain infections in Africa.

03:45 PM
04:15 PM

Break

04:15 PM
05:00 PM
Kyle Lepage - Allen Brain Observatory: 2P Calcium Imaging Inference

As a portion of the Allen Brain Observatory, a major initiative at the Allen Institute for Brain Science, head-fixed adult mice are shown both artificial and natural visual stimuli while high-throughput two-photon calcium imaging data are recorded from the mouse visual cortex, allowing assessment of stimulus processing across cortical layers, cortical areas, and Cre lines. In support of this effort, to assess the accuracy of statistical inference, cell-attached electrophysiological (ephys) data are collected simultaneously with two-photon calcium imaging (ophys). This latter experiment reveals, particularly when imaging at the lower magnification required to cover the mouse cortex, that a large number of spikes are not represented by the fluorescence signal, and conversely, that an upward transient in the fluorescence signal does not always correspond with the occurrence of an action potential. Despite considerable methodological effort, it remains a challenge to associate the fluorescence signal with neural spiking. In this presentation, I will describe a three-part approach to estimate neural receptive fields and filters: (i) generalized linear models of spiking are estimated directly from the fluorescence signal without direct knowledge of spike times, (ii) methods of estimation make explicit use of the calcium moments to facilitate (iii) the incorporation of calcium models derived from the joint ephys/ophys experiment.

05:00 PM

Shuttle pick-up from MBI

Wednesday, October 19, 2016
Time Session
08:00 AM

Shuttle to MBI

08:15 AM
09:00 AM

Breakfast

09:00 AM
09:45 AM
Joshua Goldwyn - Inferring dynamics of neurons from field potentials in the auditory brainstem

Membrane potential (Vm) is the standard representation of neural activity at the single-cell level. Vm represents the difference between intracellular voltage and extracellular voltage (Vm = Vi-Ve), where extracellular voltage (Ve) is generated by the combined activity of many neurons. I will explore the connection between these cell-level and population-level signals. I use a combination of mathematical modeling and data analysis to draw insights regarding the dynamics of neurons and synaptic currents in the auditory brainstem. In addition, I use simulations to show that extracellular voltage -- often thought of as a byproduct of neural activity -- may in fact modulate neural activity and influence neural computations.

This work is in collaboration with John Rinzel (New York University) and the Laboratory of Auditory Neurophysiology at the University of Leuven (director: Philip Joris)
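A common modeling assumption behind such cell-to-population links is the point-source approximation for Ve in a homogeneous volume conductor. The sketch below is illustrative only; the conductivity, currents, and positions are invented, not values from the talk:

```python
import numpy as np

# Point-source approximation for the extracellular potential generated by
# transmembrane currents in a homogeneous volume conductor:
#     Ve(r) = (1 / (4*pi*sigma)) * sum_k I_k / |r - r_k|
sigma = 0.3  # extracellular conductivity in S/m (illustrative value)

def ve(point, source_pos, source_current):
    """Extracellular potential (volts) at `point` from point-current sources."""
    point = np.asarray(point, dtype=float)
    r = np.linalg.norm(np.asarray(source_pos, dtype=float) - point, axis=1)
    return np.sum(np.asarray(source_current) / r) / (4 * np.pi * sigma)

# A 1 nA source and a 1 nA sink (a current dipole) 100 um apart:
pos = [[0.0, 0.0, 0.0], [100e-6, 0.0, 0.0]]
I = [1e-9, -1e-9]
print(ve([50e-6, 50e-6, 0.0], pos, I))  # 0 by symmetry on the midplane
```

Summing such contributions over many neurons gives the population-level Ve that, per the abstract, can in turn feed back onto Vm = Vi - Ve.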

09:45 AM
10:30 AM
Leonid Rubchinsky - Temporal patterns of intermittent neural synchronization

Synchronization of neural activity in the brain is involved in a variety of brain functions including perception, cognition, memory, and motor behavior. Excessively strong, weak, or otherwise improperly organized patterns of synchronous oscillatory activity appear to contribute to the generation of symptoms of different neurological and psychiatric diseases. However, neuronal synchrony is frequently not perfect, but rather exhibits intermittent dynamics, so the same synchrony strength may be achieved with markedly different temporal patterns of activity (roughly speaking, oscillations may leave the synchronous state during many short episodes or during a few long ones). I will discuss this situation from two perspectives: the phase-space perspective of dynamical systems theory, and the time-series analysis perspective. I will then proceed to the application of this analysis to neurophysiological data from the healthy brain, Parkinson's disease, and drug addiction disorders.

10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
Peter Thomas - Challenges in Parameter Estimation for Conductance-Based Models

Stochastic effects, arising from the random gating of ion channels, complicate efforts to estimate conductance-based model parameters, such as channel conductances and kinetics, from electrophysiological data. Channel noise is not always harmful, however. Some parameters that are not identifiable in a deterministic model can be estimated within a stochastic model. For example, channel noise may facilitate estimation of the number of channels in a given cell. On the other hand, unlike their deterministic idealizations, neurons with stochastic conductances do not produce periodic orbits, thus presenting a moving target for trajectory-based parameter estimation. We will discuss these and related challenges in estimating parameters of conductance-based models in the presence of channel noise.
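One concrete example of noise aiding identifiability is fluctuation analysis of channel number, sketched here in its simplest stationary binomial form (the channel count and open probability are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# With N independent channels, each open with probability p, the open count
# K is Binomial(N, p): mean(K) = N*p and var(K) = N*p*(1-p). Hence
#     p_hat = 1 - var(K)/mean(K),   N_hat = mean(K)/p_hat,
# so the fluctuations themselves reveal N, which a deterministic
# (noise-free) model could never disentangle from the unitary conductance.
N_true, p_true = 1000, 0.3
K = rng.binomial(N_true, p_true, size=20000)  # simulated open-channel counts

m, v = K.mean(), K.var()
p_hat = 1 - v / m
N_hat = m / p_hat
print(round(N_hat))  # close to N_true = 1000
```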

11:45 AM
12:30 PM
Susanne Ditlevsen - Neuronal responses to stimulus pairs as probability mixtures of responses to single stimuli

A fundamental question concerning the way the visual world is represented in our brain is how a cortical cell responds when its classical receptive field contains more than a single stimulus object. It is a statistically challenging problem how to infer such behavior and distinguish between different explanatory models from neurobiological data. Particular challenges are that data are partially observed, highly noisy and autocorrelated. A standard way to deal with noisy data is to average over trials. In this talk I will argue that this might blur or entirely remove essential characteristics and mechanisms, which are fundamental for understanding brain function.

Two opposing models have been proposed in the literature. In the response-averaging model [1], the firing rate of the cell to a pair of stimulus objects is a weighted average of the firing rates to the individual objects. By contrast, in the probability-mixing model [2], the cell responds to the pair of objects as if only one of the objects was present in any given trial. Here we compare the abilities of the two models to account for spike trains recorded from single cells in the middle temporal visual area (MT) of rhesus monkeys, using point process techniques. The results support the probability-mixing model [3].

References:
[1] Reynolds, J. H., Chelazzi, L. & Desimone, R. Competitive mechanisms subserve attention in macaque areas V2 and V4. Journal of Neuroscience 19, 1736-1753 (1999).
[2] Bundesen, C., Habekost, T. & Kyllingsbæk, S. A neural theory of visual attention: bridging cognition and neurophysiology. Psychological Review 112, 291 (2005).
[3] Li, K. , Kozyrev, V., Kyllingsbæk, S., Treue, S., Ditlevsen, S. and Bundesen, C.: Neurons in primate visual cortex alternate between responses to multiple stimuli in their receptive field. Submitted.
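The variance signature separating the two models can be sketched in a toy simulation (the rates and mixing weight below are illustrative, not values from the MT data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Trial-by-trial spike counts to a stimulus pair under the two models
# (rA, rB are single-object rates in spikes per trial, chosen arbitrarily).
rA, rB, n_trials = 30.0, 10.0, 100_000

# Response averaging: every trial's rate is the (equal-weight) average.
avg_counts = rng.poisson((rA + rB) / 2, size=n_trials)

# Probability mixing: each trial responds as if only one object were present.
pick_A = rng.random(n_trials) < 0.5
mix_counts = rng.poisson(np.where(pick_A, rA, rB))

# Same mean rate, but mixing adds across-trial variance ((rA-rB)**2/4 here),
# which averaging over trials would blur away entirely.
print(avg_counts.mean(), mix_counts.mean())  # both near 20
print(avg_counts.var(), mix_counts.var())    # near 20 vs near 120
```

This is precisely why, as the abstract argues, trial averaging can erase the characteristics that distinguish the models; point-process analysis of single trials preserves them.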

12:30 PM
02:00 PM

Lunch Break

02:00 PM
02:45 PM
Asohan Amarasingham - Inference of connectivity from extracellular data

A problem of fundamental importance in neuroscience is to understand how properties of synaptic connectivity vary across behavior. A possible, but challenging, approach is to interpret fine (millisecond) timescale correlations among populations of spike trains, which can be recorded extracellularly, in such terms. There are many technical barriers to such an approach, including a paucity of ground truth data available for interpretative calibration. Here, we approach the question indirectly from the perspective of biophysical models. Working with simple models of monosynaptic transmission and varieties of background noise, we sought conditions for precise pre- and post-synaptic spike time relationships. A primary conclusion is that, in addition to connectivity parameters, low variability of subthreshold potentials appears to be an important ingredient. We discuss the plausibility of this hypothesis, and its implications for measuring input variability experimentally as well as for nonparametric statistical inference of connectivity from extracellular data.

02:45 PM
03:30 PM
Alex Roxin - A computational model of spatial learning in rodent hippocampus

Place cells of the rodent hippocampus fire action potentials when the animal traverses a particular spatial location in a given environment. Therefore, for any given trajectory one will observe a repeatable sequence of place cell activations as the animal explores. Interestingly, when the animal is quiescent or sleeping, one can observe similar sequences of activation, although at a highly compressed rate, known as "replays". It is hypothesized that this replay underlies the process of memory consolidation whereby memories are "transferred" from hippocampus to cortex. However, it remains unclear how the memory of a particular environment is actually encoded in the place cell activity and what the mechanism for replay is.

Here we study how spike-timing dependent plasticity (STDP) during spatial exploration shapes the patterns of synaptic connectivity in model networks of place cells. We show how an STDP rule can lead to the formation of attracting manifolds, essentially patterns of activity which represent the spatial environment learned. These states become spontaneously active when the animal is quiescent, reproducing the phenomenology of replays. Interestingly, the attractors are formed most rapidly when place cell activity is modulated by an ongoing oscillation. The optimal oscillation frequency can be calculated analytically, is directly related to the STDP rule, and for experimentally determined values of the STDP window in rodent slices gives values in the theta range.
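The pair-based STDP window that such models build on can be sketched as follows (the amplitudes and time constants are illustrative, not the experimentally fitted values referenced in the talk):

```python
import numpy as np

# Pair-based STDP: potentiation when the presynaptic spike precedes the
# postsynaptic spike, depression otherwise, each decaying exponentially
# with the spike-time difference.
A_plus, A_minus = 0.01, 0.012     # weight-change amplitudes (illustrative)
tau_plus, tau_minus = 20.0, 20.0  # decay time constants in ms (illustrative)

def stdp(dt):
    """Weight change for post-minus-pre spike-time difference dt (ms)."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    A_plus * np.exp(-dt / tau_plus),
                    -A_minus * np.exp(dt / tau_minus))

print(stdp(10.0))   # pre before post -> potentiation (positive)
print(stdp(-10.0))  # post before pre -> depression (negative)
```

Oscillatory modulation of place-cell firing biases which spike pairs fall in the potentiating versus depressing lobes of this window, which is the intuition behind the analytically optimal frequency mentioned above.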

A major prediction of this model is that the structure of replay during sharp-wave/ripples should undergo a transition during exploration. Specifically, at a critical time the correlation of replay with trajectories in the currently explored environment should increase. We look for this increase by examining the activity of hundreds of simultaneously recorded hippocampal cells, recorded in the laboratory of Eva Pastalkova, in rats exploring a novel environment.

03:30 PM
04:00 PM

Break

04:00 PM
05:30 PM

Informal discussions

05:30 PM

Shuttle pick-up from MBI

Thursday, October 20, 2016
Time Session
08:00 AM

Shuttle to MBI

08:15 AM
09:00 AM

Breakfast

09:00 AM
09:45 AM
Carina Curto - Emergent dynamics from network connectivity: a minimal model

Many networks in the brain display internally-generated patterns of activity -- that is, they exhibit emergent dynamics that are shaped by intrinsic properties of the network rather than inherited from an external input. While a common feature of these networks is an abundance of inhibition, the role of network connectivity in pattern generation remains unclear.

In this talk I will introduce Combinatorial Threshold-Linear Networks (CTLNs), which are simple "toy models" of recurrent networks consisting of threshold-linear neurons with binary inhibitory interactions. The dynamics of CTLNs are controlled solely by the structure of an underlying directed graph. By varying the graph, we observe a rich variety of emergent patterns including: multistability, neuronal sequences, and complex rhythms. These patterns are reminiscent of population activity in cortex, hippocampus, and central pattern generators for locomotion. I will present some theorems about CTLNs, and explain how they allow us to predict features of the dynamics by examining properties of the underlying graph. Finally, I'll show examples illustrating how these mathematical results guide us to engineer complex networks with prescribed dynamic patterns.
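A minimal CTLN simulation on the 3-cycle graph, following the standard construction (the parameter values are the usual CTLN convention; the forward-Euler integration is a plain sketch):

```python
import numpy as np

# Combinatorial threshold-linear network (CTLN):
#     dx/dt = -x + [W x + b]_+
# with W_ii = 0, W_ij = -1 + eps if there is an edge j -> i in the graph,
# and W_ij = -1 - delta otherwise.
eps, delta, b = 0.25, 0.5, 1.0
n = 3
W = np.full((n, n), -1.0 - delta)
np.fill_diagonal(W, 0.0)
for j, i in [(0, 1), (1, 2), (2, 0)]:  # edges j -> i forming a 3-cycle
    W[i, j] = -1.0 + eps

x = np.array([0.2, 0.1, 0.0])
dt = 0.01
for _ in range(5000):
    x = x + dt * (-x + np.maximum(W @ x + b, 0.0))  # forward-Euler step

print(x)  # activity stays nonnegative and bounded
```

Varying only the edge set of the graph, with eps, delta, and b held fixed, is what produces the multistability, sequences, and rhythms described in the abstract; the 3-cycle is the simplest sequence-generating example.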

09:45 AM
10:30 AM
Sara Solla - Neural manifolds for the control of movement

The fundamental question of how the dynamics of networks of neurons implement neural computations and information processing remains unanswered. The problem is formidable, as neural activity in any specific area not only reflects its intrinsic population dynamics, but must also represent both inputs to and outputs from that area. The analysis of neural dynamics in sensory cortices in response to controlled stimuli, or in motor cortices in relation to controlled movements, has consistently revealed the existence of low dimensional manifolds, spanned by latent variables that capture a significant fraction of neural variability. We have focused on motor cortex and explored the relation between manifolds associated with several motor tasks. We have identified remarkable similarities across such manifolds, and investigated how these geometric similarities affect the low dimensional dynamics constrained to these manifolds, and the manner in which information related to task conditions and muscle activity is encoded by these dynamics.

10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
Tay Netoff - Optimizing deep brain stimulation for treatment of epilepsy and Parkinson's disease

Deep brain stimulation is used for treating Parkinson's disease, essential tremor, and epilepsy, and there is interest in using it for many other neurological diseases. Deep brain stimulation devices have many parameters, such as voltage and stimulation frequency, that can be adjusted by the clinician. Adjusting the parameters is a lengthy and somewhat haphazard process. New devices allow us to monitor neural activity from deep brain stimulation leads not being used for stimulation. This provides us with the opportunity to use machine learning algorithms to optimize stimulation parameters based on biomarkers. In this talk I will discuss how we have been using closed-loop control algorithms to develop stimulation waveforms that suppress neural activity in epilepsy models, and to tune stimulation parameters for Parkinson's disease.

11:45 AM
12:30 PM
Nicolas Brunel - Inferring Synaptic Plasticity Rules From the Statistics of Neuronal Responses to Sets of Novel and Familiar Stimuli

Abstract not submitted.

12:30 PM
02:00 PM

Lunch Break

02:00 PM
02:45 PM

The head-direction (HD) system functions as a compass, with member neurons robustly increasing their firing rates when the animal's head points in a specific direction. HD neurons may be driven by peripheral sensors or, as computational models postulate, internally generated (attractor) mechanisms. We addressed the contributions of stimulus-driven and internally generated activity by recording ensembles of HD neurons in the antero-dorsal thalamic nucleus and the post-subiculum of mice and comparing their activity in various brain states. The temporal correlation structure of HD neurons was preserved during sleep, characterized by a 60°-wide correlated neuronal firing (activity packet), both within and across these two brain structures. During rapid eye movement sleep, the spontaneous drift of the activity packet was similar to that observed during waking and accelerated tenfold during slow-wave sleep. These findings demonstrate that peripheral inputs impinge on an internally organized network, which provides amplification and enhanced precision of the HD signal.

02:45 PM
03:30 PM
Alla Borisyuk - Diversity of Evoked Astrocyte Calcium Responses: Mathematical Modeling

Evidence suggests that astrocytes play a key role in neuronal function through their calcium signaling. I will present a mathematical model of astrocyte calcium signaling that provides a tool to study the underlying mechanisms of these signals.

Using experimental data from my collaborator's lab and model simulations, we categorize astrocyte calcium responses, evoked by focal, brief (<250 ms) ATP applications, into four types: Single-Peak, Multi-Peak, Plateau, and Long-Lasting responses. Applying this categorization, we find experimentally that as we move from the soma to the large and, finally, small processes, the occurrence of Single-Peak responses decreases, while the occurrence of Multi-Peak responses increases.

We use our model to provide insight into the possible sources of calcium response variability: (1) temporal dynamics of IP3, and (2) relative flux rates through calcium channels and pumps such as store-operated calcium (SOC) channels, the SERCA pump, etc. Further, our model generates predictions about the effects of blocking calcium channels/pumps; for instance, blocking SOC channels is expected to eliminate Plateau and Long-Lasting responses. Finally, using the model, we hypothesize that differences in the response type distribution between astrocyte subcompartments can be attributed to differences in IP3 rise durations (between all three subcompartments) and flux rates through calcium pumps (between somata and large processes).

03:30 PM
04:00 PM

Break

04:00 PM
05:30 PM

Informal discussions

05:30 PM

Shuttle pick-up from MBI

06:30 PM
07:00 PM

Cash Bar

07:00 PM
09:00 PM

Banquet in the Fusion Room @ Crowne Plaza Hotel

Friday, October 21, 2016
Time Session
08:00 AM

Shuttle to MBI

08:15 AM
09:00 AM

Breakfast

09:00 AM
09:45 AM
Jiawei Zhang - Stochastic Vesicle Release in Synaptic Transmission

Noise is not only a source of disturbance, but it also can be beneficial for neuronal information processing. The release of neurotransmitter vesicles in synapses is an unreliable process, especially in the central nervous system. Here we show that the probabilistic nature of neurotransmitter release directly influences the functional role of a synapse, and that a small probability of release per docked vesicle helps reduce the error in the reconstruction of desired signals from the time series of vesicle release events.
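The binomial picture of release behind this argument can be sketched as follows (the vesicle counts and release probabilities are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Binomial model of vesicle release: each of n_docked vesicles releases
# independently with probability p_release on each presynaptic spike.
def release_counts(n_docked, p_release, n_spikes, rng):
    """Number of vesicles released on each of n_spikes presynaptic spikes."""
    return rng.binomial(n_docked, p_release, size=n_spikes)

low_p = release_counts(10, 0.1, 50_000, rng)   # unreliable, CNS-like synapse
high_p = release_counts(10, 0.9, 50_000, rng)  # highly reliable synapse

# Same release machinery, very different reliability:
print((low_p == 0).mean())   # frequent complete failures, near 0.9**10 ~ 0.35
print((high_p == 0).mean())  # essentially no failures
```

The abstract's claim is that the low-p regime, despite (indeed because of) these failures, can reduce signal-reconstruction error per released vesicle.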

09:45 AM
10:30 AM
Mason Porter - Mesoscale Structures in Functional Neuronal Networks

I will discuss mesoscale structures in functional neuronal networks. My talk will include ideas from community structure, core-periphery structure, and persistent homology. I will introduce these ideas and illustrate how they can be used to obtain insights from neuroimaging data.

10:30 AM
11:00 AM

Break

11:00 AM
11:45 AM
Esteban Tabak - Explanation of variability in data through optimal transport

A methodology based on the theory of optimal transport is developed to attribute variability in data sets to known and unknown factors and to remove such attributable components of the variability from the data. Denoting by $x$ the quantities of interest and by $z$ the explanatory factors, the procedure transforms $x$ into filtered variables $y$ through a $z$-dependent map, so that the conditional probability distributions $\rho(x|z)$ are pushed forward into a target distribution $\mu(y)$, independent of $z$. Among all maps and target distributions that achieve this goal, the procedure selects the one that minimally distorts the original data: the barycenter of the $\rho(x|z)$.

We will discuss the relevance of this methodology to medicine and biology, including the amalgamation of data sets and removal of batch effects, the analysis of time series, the analysis of dependence among variables and the discovery of previously unknown variability factors.

11:45 AM
12:30 PM

Workshop wrap up

12:30 PM

Shuttle pick-up from MBI (one to hotel, one to airport)

Name Email Affiliation
Abarbanel, Henry habarbanel@ucsd.edu Department of Physical Sciences, University of California, San Diego
Amarasingham, Asohan aamarasingham@ccny.cuny.edu Department of Mathematics, City College of New York
Armstrong, Eve earmstrong@physics.ucsd.edu Physics, BioCircuits, University of California, San Diego
Bel, Andrea andreabelnqn@gmail.com Departamento de Matematica, Universidad Nacional del Sur
Borisyuk, Alla borisyuk@math.utah.edu Mathematics, University of Utah
Breen, Daniel dnlbreen@gmail.com Physics, University of California, San Diego
Brunel, Nicolas nbrunel@uchicago.edu Statistics and Neurobiology, University of Chicago
Curto, Carina ccurto@psu.edu Mathematics, Pennsylvania State University
De Pittà, Maurizio mdepitta@uchicago.edu Department of Neurobiology, University of Chicago
Diekman, Casey casey.o.diekman@njit.edu Department of Mathematical Sciences, New Jersey Institute of Technology
Ditlevsen, Susanne susanne@math.ku.dk Department of Mathematical Sciences, University of Copenhagen
Eden, Uri tzvi@bu.edu Mathematics and Statistics, Boston University
Fernandez, Fernando fernrf@bu.edu Biomedical Engineering, Boston University
Fletcher, Patrick patrick.allen.fletcher@gmail.com Laboratory of Biological Modeling, NIDDK, National Institutes of Health
Goldwyn, Joshua jhgoldwyn@gmail.com Mathematics, The Ohio State University
Govinder, Kesh govinder@ukzn.ac.za Mathematics, Statistics and Computer Science, University of KwaZulu-Natal
Handy, Gregory handy@math.utah.edu Mathematics, University of Utah
Hesse, Janina janina.hesse@bccn-berlin.de Biology - Institute for Theoretical Biology, Humboldt-Universität
Ito, Takuya taku.ito1@gmail.com Center for Molecular and Behavioral Neuroscience, Rutgers University
Kalies, William wkalies@fau.edu Mathematical Sciences, Florida Atlantic University
Kass, Robert kass@stat.cmu.edu Department of Statistics and Center for the Neural Basis of Cognition, Carnegie-Mellon University
Katz, Don dbkatz@brandeis.edu Psychology/Neuroscience, Brandeis University
Kay, Leslie lkay@uchicago.edu Dept. of Psychology, University of Chicago
Keeley, Stephen sk3931@nyu.edu Center for Neural Science, New York University
Khaledi Nasab, Ali ak705714@ohio.edu Department of Physics and Astronomy, Ohio University
Khamesian, Mahvand mk341511@ohio.edu Physics and Astronomy, Ohio University
Kramer, Mark mak@bu.edu Math and Stats, Boston University
Leiser, Randolph rjl22@njit.edu Mathematics, New Jersey Institute of Technology
Lepage, Kyle kylel@alleninstitute.org Modeling and Theory, Allen Institute for Brain Science
McKenna, Joseph joepatmckenna@gmail.com Mathematics, Florida State University
Mirzakhalili, Ehsan mirzakh@umich.edu Mechanical Engineering, University of Michigan
Morrison, Katherine katherine.morrison@unco.edu School of Mathematical Sciences, University of Northern Colorado
Moye, Matthew mjm83@njit.edu Mathematical Sciences, New Jersey Institute of Technology
Nagaraj, Vivek nagar030@umn.edu Neuroscience, University of Minnesota
Netoff, Tay tnetoff@umn.edu Biomedical Engineering, University of Minnesota
Ostergaard, Jacob Mathematical Sciences, University of Copenhagen
Pillow, Jonathan pillow@princeton.edu Psychology & Neurobiology, Princeton University
Platkiewicz, Jonathan jplatkiewicz@ccny.cuny.edu Mathematics, City College, CUNY
Porter, Mason porterm@maths.ox.ac.uk Mathematical Institute, University of Oxford
Pu, Shusen sxp600@case.edu Department of Mathematics, Applied Mathematics and Statistics, Case Western Reserve University
Rotstein, Horacio horacio@njit.edu Mathematical Sciences, New Jersey Institute of Technology
Roxin, Alex aroxin@crm.cat Computational Neuroscience Group, Centre de Recerca Matemàtica
Rubchinsky, Leonid leo@math.iupui.edu Department of Mathematical Sciences and Stark Neurosciences Research Institute at the Indiana University School of Medicine, Indiana University--Purdue University
Sarma, Sridevi ssarma2@jhu.edu Biomedical Engineering, Johns Hopkins University
Schiff, Steven sjs49@engr.psu.edu Depts. Neurosurgery / Eng Science & Mechanics / Physics, Pennsylvania State University
Schleimer, Jan-Hendrik jh.schleimer@hu-berlin.de Biology, Humboldt-Universität
Sederberg, Audrey audrey.sederberg@gatech.edu Biomedical Engineering, Georgia Institute of Technology
Shirman, Aleksandra ashirman@physics.ucsd.edu Physics, University of California, San Diego
Skinner, Frances frances.skinner@utoronto.ca Krembil Research Institute, Krembil Research Institute, University Health Network
Solla, Sara sasolla@gmail.com Department of Physiology, Northwestern University Medical School
Spencer, Elizabeth erss@bu.edu Neuroscience, Boston University
Tabak, Esteban tabak@cims.nyu.edu Courant Institute of Mathematical Sciences, New York University
Thomas, Peter pjthomas@case.edu Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University
Walch, Olivia ojwalch@umich.edu Mathematics, University of Michigan
Yang, Luojun luoyang@ucdavis.edu School of Life Science, Nanjing University
Zhang, Jiawei (Calvin) calvinz@cims.nyu.edu Mathematics, University of Arizona
Data Assimilation for Verifying Models of Neurons and Networks

Using data assimilation tools built via methods of statistical physics, we discuss ideas and applications to experimental data for estimating properties of Hodgkin-Huxley models of individual neuron biophysics. The experimental data come primarily from nucleus HVC of the avian song production system.

We then discuss new tools for the analysis of networks of neurons using extracellular recordings and results from the theory of electrostatics.

Inference of connectivity from extracellular data

A problem of fundamental importance in neuroscience is to understand how properties of synaptic connectivity vary across behavior. A possible, but challenging, approach is to interpret fine (millisecond) timescale correlations among populations of spike trains, which can be recorded extracellularly, in such terms. There are many technical barriers to such an approach, including a paucity of ground truth data available for interpretative calibration. Here, we approach the question indirectly from the perspective of biophysical models. Working with simple models of monosynaptic transmission and varieties of background noise, we sought conditions for precise pre- and post-synaptic spike time relationships. A primary conclusion is that, in addition to connectivity parameters, low variability of subthreshold potentials appears to be an important ingredient. We discuss the plausibility of this hypothesis, and its implications for measuring input variability experimentally as well as for nonparametric statistical inference of connectivity from extracellular data.

Diversity of Evoked Astrocyte Calcium Responses: Mathematical Modeling

Evidence suggests that astrocytes play a key role in neuronal function through their calcium signaling. I will present a mathematical model of astrocyte calcium signaling that provides a tool to study the underlying mechanisms of these signals.

Using experimental data from my collaborator's lab and model simulations, we categorize astrocyte calcium responses, evoked by focal, brief (<250 ms) ATP applications, into four types: Single-Peak, Multi-Peak, Plateau, and Long-Lasting responses. Applying this categorization, we find experimentally that as we move from the soma to the large and, finally, small processes, the occurrence of Single-Peak responses decreases, while the occurrence of Multi-Peak responses increases.

We use our model to provide insight into the possible sources of calcium response variability: (1) temporal dynamics of IP3, and (2) relative flux rates through calcium channels and pumps such as store-operated calcium (SOC) channels, SERCA pump, etc. Further, our model generates predictions about the effects of blocking calcium channels/pumps; for instance, blocking SOC channels is expected to eliminate Plateau and Long-Lasting responses. Finally, using the model, we hypothesize that differences in the response type distribution between astrocyte subcompartments can be attributed to differences in IP3 rise durations (between all three subcompartments) and flux rates through calcium pumps (between somata and large processes).

Inferring Synaptic Plasticity Rules From the Statistics of Neuronal Responses to Sets of Novel and Familiar Stimuli

Abstract not submitted.

Emergent dynamics from network connectivity: a minimal model

Many networks in the brain display internally-generated patterns of activity -- that is, they exhibit emergent dynamics that are shaped by intrinsic properties of the network rather than inherited from an external input. While a common feature of these networks is an abundance of inhibition, the role of network connectivity in pattern generation remains unclear.

In this talk I will introduce Combinatorial Threshold-Linear Networks (CTLNs), which are simple "toy models" of recurrent networks consisting of threshold-linear neurons with binary inhibitory interactions. The dynamics of CTLNs are controlled solely by the structure of an underlying directed graph. By varying the graph, we observe a rich variety of emergent patterns including: multistability, neuronal sequences, and complex rhythms. These patterns are reminiscent of population activity in cortex, hippocampus, and central pattern generators for locomotion. I will present some theorems about CTLNs, and explain how they allow us to predict features of the dynamics by examining properties of the underlying graph. Finally, I'll show examples illustrating how these mathematical results guide us to engineer complex networks with prescribed dynamic patterns.
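
A CTLN of this kind takes only a few lines to simulate. The sketch below uses the weight convention and parameter values commonly reported for CTLNs (W_ij = -1+eps if j -> i, -1-delta otherwise, with eps = 0.25, delta = 0.5, theta = 1 — assumptions here, not taken from the talk), applied to a directed 3-cycle:

```python
import numpy as np

def ctln_weights(adj, eps=0.25, delta=0.5):
    """CTLN weight matrix from a binary directed graph, with adj[i, j] = 1
    meaning an edge i -> j:  W[i, j] = -1 + eps if j -> i,
    -1 - delta otherwise, and 0 on the diagonal."""
    W = np.where(adj.T == 1, -1.0 + eps, -1.0 - delta)
    np.fill_diagonal(W, 0.0)
    return W

def simulate(W, theta=1.0, T=200.0, dt=0.01, x0=None):
    """Euler integration of the threshold-linear dynamics
    dx/dt = -x + [W x + theta]_+ ."""
    x = np.zeros(W.shape[0]) if x0 is None else np.array(x0, float)
    traj = np.empty((int(T / dt), W.shape[0]))
    for k in range(traj.shape[0]):
        x = x + dt * (-x + np.maximum(0.0, W @ x + theta))
        traj[k] = x
    return traj

# Directed 3-cycle 0 -> 1 -> 2 -> 0: a minimal graph with an emergent rhythm
adj = np.array([[0, 1, 0],
                [0, 0, 1],
                [1, 0, 0]])
traj = simulate(ctln_weights(adj), x0=[0.1, 0.0, 0.0])
```

Swapping in a different `adj` is all it takes to explore how graph structure shapes the emergent dynamics.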

Dynamic (deterministic) neuronal models: an overview
Neuronal responses to stimulus pairs as probability mixtures of responses to single stimuli

A fundamental question concerning the way the visual world is represented in our brain is how a cortical cell responds when its classical receptive field contains more than a single stimulus object. Inferring such behavior, and distinguishing between different explanatory models, from neurobiological data is statistically challenging: the data are partially observed, highly noisy, and autocorrelated. A standard way to deal with noisy data is to average over trials. In this talk I will argue that this might blur or entirely remove essential characteristics and mechanisms which are fundamental for understanding brain function.

Two opposing models have been proposed in the literature. In the response-averaging model [1], the firing rate of the cell to a pair of stimulus objects is a weighted average of the firing rates to the individual objects. By contrast, in the probability-mixing model [2], the cell responds to the pair of objects as if only one of the objects was present in any given trial. Here we compare the abilities of the two models to account for spike trains recorded from single cells in the middle temporal visual area (MT) of rhesus monkeys, using point process techniques. The results support the probability-mixing model [3].

References:
[1] Reynolds, J. H., Chelazzi, L. & Desimone, R. Competitive mechanisms subserve attention in macaque areas V2 and V4. Journal of Neuroscience 19, 1736–1753 (1999).
[2] Bundesen, C., Habekost, T. & Kyllingsbæk, S. A neural theory of visual attention: bridging cognition and neurophysiology. Psychological Review 112, 291 (2005).
[3] Li, K., Kozyrev, V., Kyllingsbæk, S., Treue, S., Ditlevsen, S. and Bundesen, C.: Neurons in primate visual cortex alternate between responses to multiple stimuli in their receptive field. Submitted.

Statistical neuronal models: an overview
Spontaneous intracellular membrane voltage fluctuations in cortical pyramidal cells are inconsistent with balanced excitation and inhibition

Analyses of cortical neuron intracellular membrane voltage fluctuations resulting from sensory evoked activity, as well as up-down state transitions observed in anaesthetized rodents, indicate that these events are generated through a joint increase in excitatory and inhibitory currents. Accordingly, an increase in excitation is quickly followed and matched with an increase in inhibition. This form of balanced and correlated excitation and inhibition results from feedforward and feedback inhibition present in cortical circuits. It remains unclear, however, if spontaneous voltage fluctuations under awake conditions are also the product of balanced and temporally correlated excitatory and inhibitory synaptic activity. Unlike sensory evoked depolarizations or up-states, spontaneous voltage fluctuations lack an external reference point from which to measure changes in currents and calculate synaptic conductances. Nevertheless, if balanced and temporally correlated synaptic activity underlies spontaneous fluctuations, we would expect that mutual cancellation of excitatory and inhibitory events is maximal near the net synaptic reversal potential. For this reason, the relationship between the standard deviation of membrane voltage fluctuations and the mean holding voltage should be non-monotonic; fluctuations are smallest near the net synaptic reversal potential but grow larger at voltages above or below the net synaptic reversal value. To test this hypothesis, we carried out visually-guided intracellular patch-clamp recordings of layer II somatosensory neurons in awake, head-fixed mice using 2-photon microscopy. Using tdTomato expression in CaMK2-positive neurons, we targeted pyramidal cells and carried out recordings in both current- and voltage-clamp across a range of subthreshold voltage values.
Contrary to predictions from models of balanced and correlated synaptic activity, the standard deviation and power of fluctuations increase monotonically with depolarization from -90 to -40 mV. Further analysis indicates that spontaneous voltage fluctuations are largely the product of excitatory synaptic currents that are amplified by a voltage-dependent increase in pyramidal cell membrane input resistance.
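
The predicted V-shaped dependence follows directly from multiplicative conductance noise: the synaptic current scales as g(t)·(V − E_net), which vanishes at the net reversal potential. A minimal sketch of that prediction (all parameter values invented for illustration, not taken from the recordings):

```python
import numpy as np

rng = np.random.default_rng(4)

# If voltage fluctuations come from a fluctuating synaptic conductance g(t),
# the synaptic current is g(t) * (V - E_net): it vanishes at the net synaptic
# reversal potential E_net, so the std of the resulting voltage deflections
# should be V-shaped as a function of holding voltage.
def voltage_std(V_hold, E_net=-55.0, g_leak=10.0, n=100_000):
    g = np.abs(rng.normal(5.0, 2.0, n))     # fluctuating conductance (nS)
    dV = -g * (V_hold - E_net) / g_leak     # small-signal deflection (mV)
    return dV.std()

holds = np.arange(-90.0, -39.0, 5.0)        # holding voltages (mV)
stds = np.array([voltage_std(v) for v in holds])
# stds is smallest at V_hold = E_net and grows on either side
```

The monotonic increase reported in the abstract is what rules this picture out for the awake spontaneous fluctuations.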

Inferring dynamics of neurons from field potentials in the auditory brainstem

Membrane potential (Vm) is the standard representation of neural activity at the single-cell level. Vm represents the difference between intracellular voltage and extracellular voltage (Vm = Vi-Ve), where extracellular voltage (Ve) is generated by the combined activity of many neurons. I will explore the connection between these cell-level and population-level signals. I use a combination of mathematical modeling and data analysis to draw insights regarding the dynamics of neurons and synaptic currents in the auditory brainstem. In addition, I use simulations to show that extracellular voltage -- often thought of as a byproduct of neural activity -- may in fact modulate neural activity and influence neural computations.

This work is in collaboration with John Rinzel (New York University) and the Laboratory of Auditory Neurophysiology at the University of Leuven (director: Philip Joris)

Some Statistical Issues in Spike Train Analysis

A now-standard framework for statistical analysis of spike train data, based on point process regression using the modern methodology of generalized linear models (GLMs), is closely related to the leaky integrate and fire conception. I will review some of the work I've been involved with in developing point process regression, and will then spend some time discussing what I consider to be the biggest outstanding problem we face in analyzing spike train data.
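
A rough illustration of the point process GLM framework mentioned above (simulated spike train, spike-history regressors, plain gradient ascent on the Poisson log-likelihood; all parameter values are invented, and this is not the speaker's code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a discrete-time point process GLM with spike-history dependence:
# lambda_t = exp(b + h . recent-spike indicators), y_t ~ Bernoulli(lambda_t*dt)
T, dt, H = 5000, 0.001, 5
b_true = np.log(20.0)                               # ~20 Hz baseline
h_true = np.array([-3.0, -1.5, -0.5, 0.5, 0.3])     # refractoriness, rebound
y = np.zeros(T)
for i in range(H, T):
    lam = np.exp(b_true + h_true @ y[i - H:i][::-1])
    y[i] = float(rng.random() < lam * dt)

# Design matrix of lagged spike indicators; fit by maximizing the Poisson
# log-likelihood sum(y*log(lam*dt) - lam*dt) with plain gradient ascent
X = np.column_stack([np.ones(T - H)] +
                    [y[H - k:T - k] for k in range(1, H + 1)])
obs = y[H:]

def loglik(w):
    lam_dt = np.exp(X @ w) * dt
    return float(obs @ np.log(lam_dt + 1e-12) - lam_dt.sum())

w = np.zeros(H + 1)
ll0 = loglik(w)
for _ in range(2000):
    lam_dt = np.exp(X @ w) * dt
    w = w + 1e-3 * X.T @ (obs - lam_dt)    # gradient of the log-likelihood
ll1 = loglik(w)
```

The negative history weights play the role of the refractory/leaky reset in the integrate-and-fire picture, which is the connection the abstract alludes to.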

Sequences of attractor-like states and the prediction of consumption behavior in single trials

Neural responses to taste administration are highly dynamic: single-neuron responses reflect first taste presence, then taste identity, and then taste palatability (an experience-dependent variable that is intimately tied to consumption/rejection decisions), all within the first second following administration. Our ensemble analysis reveals the firing-rate transitions between these response "epochs" to be precipitous, near-instantaneous, and coherent across populations of simultaneously-recorded cortical neural ensembles; the sequence of attractor-like population states is highly reliable, but the timing of any particular transition varies from trial to trial. Furthermore, the onset of the late, palatability-related state provides a high-quality prediction of decision-related behavior (the latency of which also varies widely from trial to trial). Together, our data reveal a dynamical characterization of the taste system in action.

Introduction: Interplay of data and models

TBD

Statistical neuronal models: an overview
Allen Brain Observatory: 2P Calcium Imaging Inference

As a portion of the Allen Brain Observatory, a major initiative at the Allen Institute for Brain Science, head-fixed adult mice are shown both artificial and natural visual stimuli while high-throughput 2-photon calcium imaging data are recorded from the mouse visual cortex, allowing assessment of stimulus processing across cortical layers, cortical areas, and Cre lines. In support of this effort, to assess the accuracy of statistical inference, cell-attached electrophysiological (ephys) data are collected simultaneously with two-photon calcium imaging (ophys). This latter experiment reveals, particularly when imaging at the lower magnification required to cover the mouse cortex, that a large number of spikes are not represented by the fluorescence signal, and conversely, an upward transient in the fluorescence signal does not always correspond with the occurrence of a neuron action potential. Despite considerable methodological effort, it remains a challenge to associate fluorescence signal with neural spiking. In this presentation, I will describe a three-part approach to estimate neural receptive fields and filters: (i) generalized linear models of spiking are estimated directly from fluorescence signal without direct knowledge of spike times, (ii) methods of estimation make explicit use of the calcium moments to facilitate (iii) the incorporation of calcium models derived from the joint ephys/ophys experiment.

Optimizing deep brain stimulation for treatment of epilepsy and Parkinson's disease

Deep brain stimulation is used for treating Parkinson's disease, essential tremor and epilepsy, and there is interest in using it for many other neurological diseases. Deep brain stimulation devices have many parameters, such as voltage and stimulation frequency, that can be adjusted by the clinician. Adjusting the parameters is a lengthy and somewhat haphazard process. New devices allow us to monitor neural activity from deep brain stimulation leads not being used for stimulation. This provides us with the opportunity to use machine learning algorithms to optimize stimulation parameters based on biomarkers. In this talk I will discuss how we have been using closed-loop control algorithms to develop stimulation waveforms that suppress neural activity in epilepsy models, and tune stimulation parameters for Parkinson's disease.

Attractor dynamics in the navigation system

The head-direction (HD) system functions as a compass, with member neurons robustly increasing their firing rates when the animal's head points in a specific direction. HD neurons may be driven by peripheral sensors or, as computational models postulate, internally generated (attractor) mechanisms. We addressed the contributions of stimulus-driven and internally generated activity by recording ensembles of HD neurons in the antero-dorsal thalamic nucleus and the post-subiculum of mice and comparing their activity across brain states. The temporal correlation structure of HD neurons was preserved during sleep, characterized by a 60°-wide packet of correlated neuronal firing (activity packet), both within and across these two brain structures. During rapid eye movement sleep, the spontaneous drift of the activity packet was similar to that observed during waking and accelerated tenfold during slow-wave sleep. These findings demonstrate that peripheral inputs impinge on an internally organized network, which provides amplification and enhanced precision of the HD signal.

Inferring synaptic conductances from spikes using a biophysically inspired extension of the Poisson generalized linear model
Mesoscale Structures in Functional Neuronal Networks

I will discuss mesoscale structures in functional neuronal networks. My talk will include ideas from community structure, core-periphery structure, and persistent homology. I will introduce these ideas and illustrate how they can be used to obtain insights from neuroimaging data.

Dynamic (deterministic) neuronal models: an overview
A computational model of spatial learning in rodent hippocampus

Place cells of the rodent hippocampus fire action potentials when the animal traverses a particular spatial location in a given environment. Therefore, for any given trajectory one will observe a repeatable sequence of place cell activations as the animal explores. Interestingly, when the animal is quiescent or sleeping, one can observe similar sequences of activation, although at a highly compressed rate, known as "replays". It is hypothesized that this replay underlies the process of memory consolidation whereby memories are "transferred" from hippocampus to cortex. However, it remains unclear how the memory of a particular environment is actually encoded in the place cell activity and what the mechanism for replay is.

Here we study how spike-timing dependent plasticity (STDP) during spatial exploration shapes the patterns of synaptic connectivity in model networks of place cells. We show how an STDP rule can lead to the formation of attracting manifolds, essentially patterns of activity which represent the spatial environment learned. These states become spontaneously active when the animal is quiescent, reproducing the phenomenology of replays. Interestingly, the attractors are formed most rapidly when place cell activity is modulated by an ongoing oscillation. The optimal oscillation frequency can be calculated analytically, is directly related to the STDP rule, and for experimentally determined values of the STDP window in rodent slices lies in the theta range.

A major prediction of this model is that the structure of replay during sharp-wave/ripples should undergo a transition during exploration. Specifically, at a critical time the correlation of replay with trajectories in the currently explored environment should increase. We look for this increase by examining the activity of hundreds of simultaneously recorded hippocampal cells in rats exploring a novel environment, using data from the laboratory of Eva Pastalkova.

Temporal patterns of intermittent neural synchronization

Synchronization of neural activity in the brain is involved in a variety of brain functions including perception, cognition, memory, and motor behavior. Excessively strong, weak, or otherwise improperly organized patterns of synchronous oscillatory activity appear to contribute to the generation of symptoms of different neurological and psychiatric diseases. However, neuronal synchrony is frequently not perfect, but rather exhibits intermittent dynamics, so the same overall synchrony strength may be achieved with markedly different temporal patterns of activity (roughly speaking, oscillations may leave the synchronous state for many short episodes or for a few long ones). I will discuss this situation from two perspectives: the phase-space perspective, with the associated tools of dynamical systems theory, and the time-series analysis perspective. I will then proceed with the application of this analysis to neurophysiological data from the healthy brain, Parkinson's disease, and drug addiction disorders.

Fragility in the human decision making system: when irrationality hijacks logic

Decision-making links cognition to behavior and is a key driver of human personality, fundamental for survival, and essential for our ability to learn and adapt. It has been well established that humans make logical decisions where they maximize an expected reward, but this rationality is influenced by their internal biases (e.g. emotional state, preferences). Psychiatric patients who have dysfunctional cognitive and emotional circuitry frequently have severe alterations in decision-making. Unfortunately, the function of relevant neural circuits in humans is largely uncharted at fine temporal scales, severely limiting the understanding of changes underlying disruption associated with age or psychiatric diseases. In this study, we localize neural populations, circuits, and their temporal patterns on a millisecond scale that are critically involved in human decision-making.

Twelve human subjects, implanted with multiple depth electrodes for clinical purposes, performed a gambling task while we recorded local field potential neural activity from deep and peripheral brain structures. We propose a dynamical system model to explain the individual variability in decision making, and then identify neural correlates of the model variables. Our models suggest a spectrum of decision-makers ranging from irrational to logical, and analyses of the neural data suggest that specific oscillations in brain structures including the anterior insula, amygdala, and cingulate cortex profoundly influence betting behavior (what you bet and how quickly you make the bet). These findings provide new insight into how humans link their internal biases (e.g., emotions) to decisions.

Model-based Observation and Control for the Brain: From Control of Seizures and Migraines, to Reducing Infant Brain Infections in Africa

Since the 1950s, we have developed mature theories of modern control theory and computational neuroscience with little interaction between these disciplines. With the advent of computationally efficient nonlinear Kalman filtering techniques (developed in robotics and weather prediction), along with improved neuroscience models that provide increasingly accurate reconstruction of dynamics in a variety of normal and disease states in the brain, the prospects for synergistic interaction between these fields are now strong. I will show recent examples of the use of nonlinear control theory for the assimilation and control of single neuron and network dynamics, a control framework for Parkinson's disease, and the potential for unification in control of spreading depression and seizures. Recent results help explain why the subtle and deep intersection of symmetry, in brains and models, is important to take into account in this transdisciplinary fusion of computational models of the computational brain with real-time control. Lastly, I will describe how such symmetries apply to network optimization and control for the prevention of infant brain infections in Africa.

From fluctuations in ion channel gating to spike time precision with gating phase response curves

Neurons with finite numbers of ion channels are intrinsically noisy, which renders the timing of spikes variable, even if the input is not [1, 2]. Certain ion channels have been experimentally shown to influence this variability:

• A decrease in D-type K+ channels increases spike precision [3].
• Persistent Na+ and HCN channels modulate spike reliability [4].

Moreover, even densely expressed ion channels, which do not by themselves give rise to significant conductance fluctuations, can still influence the susceptibility of the spiking dynamics to noise from other types of ion channels. A comprehensive understanding of the relation between spike jitter and individual ion channel stochasticity requires tools from sensitivity analysis. Such an analysis is of interest in the following contexts:

1. It can reveal which ion channels need to be regulated to control spike jitter. So far, models of homeostatic plasticity mechanisms target only mean firing rates, but in principle the regulation also influences the spike variability in a predictable way.
2. The sensitivity analysis can be used to identify state transitions in ion channel Markov models that have no bearing on the spike statistics and can thus be omitted in simulations.

This poster describes a method to map ion channel noise to spike jitter based on the phase response curves (PRCs) of all different ion channel gates.
The relative scaling of the different gating PRCs is derived for neurons with a SNIC onset bifurcation, as well as close to the SNL bifurcation.

[1] R. H. Cudmore, E. Schneidman, B. Freedman and I. Segev. Neural Computation 10: 1679–1703, 1998.
[2] S. Schreiber, J-M. Fellous, P. Tiesinga and T. Sejnowski. J Neurophysiol 91: 184–205, 2004.
[3] R. H. Cudmore, L. Fronzaroli-Molinieres, P. Giraud, and D. Debanne. The Journal of Neuroscience 30 (38): 12885–12895, 2010.
[4] T. Kiss. Acta Biologica Hungarica 59 (Suppl.): 1–12, 2008.

Building bridges between model and experiment to obtain an essence of theta rhythm generation

Oscillatory activities are hallmarks of brain output that are linked to normal and pathological functioning. Thus, determining mechanisms for how brain oscillations are generated is essential. However, the multi-scale, nonlinear nature of our brains makes them highly challenging to understand. In particular, theta oscillations (3-12 Hz) were discovered almost 80 years ago and are one of the most robust oscillations in the brain, including the hippocampus where they are associated with exploration. Although several cellular-based network models of varying levels of complexity have been developed, it is still unclear how theta oscillations are generated in the hippocampus. In this talk, I will describe the development of our cellular-based network models where we have taken advantage of an *in vitro* whole hippocampus preparation that spontaneously generates theta rhythms. Using theoretical insights and biological constraints, our developed models can produce theta rhythms, thus suggesting the underlying essence of their generation.

Neural manifolds for the control of movement

The fundamental question of how the dynamics of networks of neurons implement neural computations and information processing remains unanswered. The problem is formidable, as neural activity in any specific area not only reflects its intrinsic population dynamics, but must also represent both inputs to and outputs from that area. The analysis of neural dynamics in sensory cortices in response to controlled stimuli, or in motor cortices in relation to controlled movements, has consistently revealed the existence of low dimensional manifolds, spanned by latent variables that capture a significant fraction of neural variability. We have focused on motor cortex and explored the relation between manifolds associated with several motor tasks. We have identified remarkable similarities across such manifolds, and investigated how these geometric similarities affect the low dimensional dynamics constrained to these manifolds, and the manner in which information related to task conditions and muscle activity is encoded by these dynamics.

Explanation of variability in data through optimal transport

A methodology based on the theory of optimal transport is developed to attribute variability in data sets to known and unknown factors and to remove such attributable components of the variability from the data. Denoting by $x$ the quantities of interest and by $z$ the explanatory factors, the procedure transforms $x$ into filtered variables $y$ through a $z$-dependent map, so that the conditional probability distributions $\rho(x|z)$ are pushed forward into a target distribution $\mu(y)$, independent of $z$. Among all maps and target distributions that achieve this goal, the procedure selects the one that minimally distorts the original data: the barycenter of the $\rho(x|z)$.

We will discuss the relevance of this methodology to medicine and biology, including the amalgamation of data sets and removal of batch effects, the analysis of time series, the analysis of dependence among variables and the discovery of previously unknown variability factors.
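
In the simplest setting (a scalar $x$, a discrete factor $z$, and roughly Gaussian conditionals) the optimal maps are affine and the barycenter has the average mean and average standard deviation, so the filtering step can be sketched in a few lines. The data below are synthetic and the setup is a deliberate simplification of the general method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scalar data x with a two-level factor z (e.g. a batch
# label); each batch has its own mean and scale.
z = rng.integers(0, 2, size=4000)
x = np.where(z == 0,
             1.0 + 0.5 * rng.standard_normal(4000),
             -1.0 + 2.0 * rng.standard_normal(4000))

# Under a Gaussian model the optimal transport maps between the
# conditionals rho(x|z) are affine, and their Wasserstein barycenter has
# the average mean and average standard deviation.
means = np.array([x[z == k].mean() for k in (0, 1)])
stds = np.array([x[z == k].std() for k in (0, 1)])
m_bar, s_bar = means.mean(), stds.mean()

# z-dependent affine map pushing each conditional onto the barycenter;
# among all such maps this choice minimally distorts the data.
y = m_bar + s_bar * (x - means[z]) / stds[z]

# After filtering, the first two moments of y no longer depend on z.
print(np.round([y[z == 0].mean() - y[z == 1].mean(),
                y[z == 0].std() - y[z == 1].std()], 3))
```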

Challenges in Parameter Estimation for Conductance-Based Models

Stochastic effects, arising from the random gating of ion channels, complicate efforts to estimate conductance-based model parameters, such as channel conductances and kinetics, from electrophysiological data. Channel noise is not always harmful, however: some parameters that are not identifiable in a deterministic model can be estimated within a stochastic model. For example, channel noise may facilitate estimation of the number of channels in a given cell. On the other hand, unlike their deterministic idealizations, neurons with stochastic conductances do not produce periodic orbits, presenting a moving target for trajectory-based parameter estimation. We will discuss these and related challenges in estimating parameters of conductance-based models in the presence of channel noise.
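
A classic illustration of noise aiding identifiability is fluctuation analysis: for $N$ independent two-state channels with single-channel current $i$ and open probability $p$, the total current satisfies $\mathrm{Var}(I) = i\,\mathbb{E}[I] - \mathbb{E}[I]^2/N$, so sweeping $p$ and fitting this parabola recovers both $i$ and $N$. A sketch on simulated (not experimental) data, with all parameter values hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated fluctuation analysis: N identical channels with single-
# channel current i; the open probability p is swept while the mean and
# variance of the total current are measured.
N_true, i_true, n_sweeps = 400, 0.5, 20000
p_values = np.linspace(0.05, 0.95, 19)
mean_I = np.empty_like(p_values)
var_I = np.empty_like(p_values)
for k, p in enumerate(p_values):
    I = i_true * rng.binomial(N_true, p, size=n_sweeps)
    mean_I[k], var_I[k] = I.mean(), I.var()

# Binomial gating gives Var(I) = i*E[I] - E[I]**2 / N, a parabola in the
# mean whose coefficients yield both i and N: the noise makes N
# identifiable even though the deterministic mean current alone cannot.
coeffs = np.polyfit(mean_I, var_I, 2)        # [-1/N, i, ~0]
N_hat, i_hat = -1.0 / coeffs[0], coeffs[1]
print(round(N_hat), round(i_hat, 2))
```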

### Posters

Effects of connectivity delay on the subthreshold resonance in neural networks

We consider a network of nodes connected via gap junctions (electrical synapses). The coupling may have a constant time delay representing the propagation time of the signal between nodes, for example, along dendrites and axons. We analyze the subthreshold response of this network to periodic inputs. In particular, we study subthreshold resonance in response to oscillatory inputs across a range of frequencies and representative delay times. We show that the time delay in the coupling may generate multiple subthreshold resonances in both amplitude and phase. In particular, for a two-node network we establish the existence of multiple resonances and show that the results agree with numerical computations.
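
The delay-induced resonances can be illustrated in a linear (subthreshold) caricature, where the network impedance has a closed form. The following sketch uses a two-node version with illustrative parameter values, not those of the poster:

```python
import numpy as np

# Two passive nodes coupled by a delayed gap junction; node 1 receives a
# sinusoidal current. Parameters are illustrative (ms, mS/cm^2, uF/cm^2).
C, gL, gc, tau = 1.0, 0.1, 0.1, 20.0

freqs = np.linspace(0.5, 100.0, 2000)        # Hz
w = 2 * np.pi * freqs / 1000.0               # rad/ms

# Frequency-domain solution of
#   C V1' = -gL V1 + gc (V2(t - tau) - V1) + I exp(i w t)
#   C V2' = -gL V2 + gc (V1(t - tau) - V2)
# giving the impedance Z(w) = V1 / I of the driven node.
A = 1j * w * C + gL + gc
Z = A / (A**2 - gc**2 * np.exp(-2j * w * tau))

amp = np.abs(Z)
# Count interior local maxima of the impedance amplitude: with a delayed
# coupling, more than one subthreshold resonance peak can appear.
peaks = np.sum((amp[1:-1] > amp[:-2]) & (amp[1:-1] > amp[2:]))
print(peaks)
```

Setting `tau = 0` in the same formula removes the extra peaks, which is the comparison the poster's analysis makes precise.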

Investigating Experimental Variations in Astrocytes with a Mathematical Model of Calcium Dynamics

Astrocytes are the most common glial cells in the brain, communicating via calcium transients, and possibly modulating neuronal signals. We explain experimental variability with a new open-cell mathematical model by studying the various fluxes through calcium channels. We show the surprising result that fluxes that do not play an active role during calcium transients still have the ability to affect the underlying phase space of the system, and thus the shape of the calcium transients.

Increased temperature facilitates neuronal synchronization by switching the firing-onset bifurcation

About 2 to 5% of young children experience seizures during high fever, so-called febrile seizures. While changes in pH seem to play a role in the induction of these convulsions, our modeling results suggest that the increase in temperature preceding a seizure could also directly contribute to seizure induction.

We investigated the influence of temperature on type I model neurons and identified a critical temperature range where even mild temperature increases (≈1°C) alter single-cell dynamics such that network synchronization increases strongly. This critical temperature range lies close to a codimension-two bifurcation: the saddle-node-loop bifurcation (SNL). At the SNL point, the dynamics of firing onset switch from a saddle-node bifurcation (typical for type I neurons) to a homoclinic orbit bifurcation and, importantly, the phase response curve (PRC) changes from the canonical, symmetric shape to an asymmetric one, strongly favoring synchronization. Such a drastic change in synchronization caused by small increases in temperature constitutes a possible mechanism for the induction of febrile seizures. Our model analysis provides several predictions including the temperature dependence of PRCs and spike shapes, which we test experimentally in recordings of hippocampal pyramidal cells.

We further analyze whether the proposed mechanism can explain genetic predispositions for strong febrile seizures: the prevalence of febrile seizures should increase if a genetic mutation brings a neuron closer to the SNL point. Exploring established models of febrile-seizure-related ion-channel mutations, we find that this is indeed the case: mutations lower the critical temperature of the SNL point, where synchronization and excitability increase sharply. We conclude that temperature-induced changes in neuronal dynamics could indeed contribute to the induction of febrile seizures. Because other parameters can also shift neuronal dynamics towards the SNL point, the proposed mechanism may be relevant to other systems marked by a sudden increase in synchronization, such as other forms of epilepsy.
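
The PRC shapes discussed above can be probed numerically by direct perturbation. As a point of reference, here is a sketch for the theta neuron, the normal form of the SNIC bifurcation typical of type I neurons, whose PRC is the canonical symmetric shape; the parameter values are illustrative and are not taken from the poster:

```python
import math
import numpy as np

# Theta neuron: theta' = 1 - cos(theta) + (1 + cos(theta)) * I, the
# normal form of the saddle-node-on-invariant-circle bifurcation.
I, dt, eps = 0.25, 1e-4, 1e-3

def period(kick_time=math.inf):
    """Time from spike (theta = -pi) to next spike (theta = +pi), with a
    brief current pulse of area eps applied once at kick_time."""
    theta, t, kicked = -math.pi, 0.0, False
    while theta < math.pi:
        theta += dt * (1 - math.cos(theta) + (1 + math.cos(theta)) * I)
        t += dt
        if not kicked and t >= kick_time:
            theta += eps * (1 + math.cos(theta))  # pulse enters via (1 + cos)
            kicked = True
    return t

T0 = period()
phases = np.linspace(0.05, 0.95, 19)
prc = np.array([(T0 - period(p * T0)) / eps for p in phases])

# Near the SNIC the PRC is non-negative and symmetric about mid-phase,
# matching the canonical 1 - cos(2 pi phase) shape.
print(np.round(prc / prc.max(), 2))
```

Applying the same direct-perturbation procedure to a model tuned past the SNL point would instead return the strongly asymmetric PRC described in the abstract.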

Examining spatial aspects of state-dependent information processing with voltage imaging

Both thalamus and cortex exhibit state-dependent dynamics that shape the nature of information represented and processed by the thalamocortical circuit. Here we look at the state-dependence of sensory information representation in primary somatosensory cortex using voltage imaging. We then explore the suitability of voltage imaging for measurement of spontaneous activity and cortical state.

Stochastic Vesicle Release in Synaptic Transmission

Noise is not only a source of disturbance; it can also be beneficial for neuronal information processing. The release of neurotransmitter vesicles at synapses is an unreliable process, especially in the central nervous system. Here we show that the probabilistic nature of neurotransmitter release directly influences the functional role of a synapse, and that a small probability of release per docked vesicle helps reduce the error in reconstructing desired signals from the time series of vesicle release events.
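
A minimal sketch of the standard binomial release model (numbers hypothetical, and a simplification of the poster's setting) illustrates the unreliability: with $n$ docked vesicles each released with probability $p$ per spike, many spikes release nothing at all.

```python
import numpy as np

rng = np.random.default_rng(3)

# Binomial model of release at a single synapse: n docked vesicles, each
# released independently with probability p when a spike arrives.
# Numbers are illustrative, not fitted to data.
n_docked, p, n_spikes = 10, 0.1, 200_000
released = rng.binomial(n_docked, p, size=n_spikes)

mean_quanta = released.mean()            # ~ n * p vesicles per spike
failure_rate = np.mean(released == 0)    # ~ (1 - p)**n complete failures
print(round(mean_quanta, 2), round(failure_rate, 3))
```

In this low-$p$ regime individual release events are nearly independent, which is the setting in which the abstract argues the reconstruction error from the release train is reduced.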

TBD

Using data assimilation tools built via methods of statistical physics, we discuss ideas and applications to experimental data for estimating properties of Hodgkin-Huxley models of individual neuron biophysics. The experimental data comes


Decision-making links cognition to behavior and is a key driver of human personality, fundamental for survival, and essential for our ability to learn and adapt. It has been well established that humans make logical decisions where they maxi

Neural responses to taste administration are highly dynamic: single-neuron responses reflect first taste presence, then taste identity, and then taste palatability (an experience-dependent variable that is intimately tied t

Since the 1950s, modern control theory and computational neuroscience have developed into mature fields with little interaction between the disciplines. With the advent of computationally efficient nonlinear Kalman filtering techniques (devel

As a portion of the Allen Brain Observatory, a major initiative at the Allen Institute for Brain Science, head-fixed adult mice are shown both artificial and natural visual stimuli while high-throughput 2-photon calcium imaging data are recorded

Membrane potential (Vm) is the standard representation of neural activity at the single-cell level. Vm represents the difference between intracellular voltage and extracellular voltage (Vm = Vi-Ve), where extracellular voltage (Ve) is genera

Temporal patterns of intermittent neural synchronization
Leonid Rubchinsky

Synchronization of neural activity in the brain is involved in a variety of brain functions including perception, cognition, memory, and motor behavior. Excessively strong, weak, or otherwise improperly organized patterns of synchronous osci

Attractor dynamics in the navigation system

The head-direction (HD) system functions as a compass, with member neurons robustly increasing their firing rates when the animal's head points in a specific direction. HD neurons may be driven by peripheral sensors or, as

