In human psychophysics, it is well known that after a brilliant desensitizing flash, cone flicker sensitivity first increases but then, paradoxically, decreases with a time course paralleling rod dark adaptation. This interaction between rods and cones is called suppressive rod-cone interaction (SRCI). Analogous physiological effects involving rod and cone signals occur in horizontal cells and bipolar cells. For example, in cat, dim backgrounds can enhance small-spot flicker responses of retinal horizontal cells. This is called background-induced flicker enhancement. We formulate a biophysically based model to simulate background-induced flicker enhancement and its spatial properties. In this model we assume that depolarized horizontal-cell dendritic terminals, in a feedback effect, decrease the entry of calcium into the cone terminal. Hyperpolarization of the horizontal cell reduces this effect, allowing calcium to enter the terminal, stimulating transmitter release by the cone presynaptic apparatus. The result is an increase in synaptic gain. This accounts for how peripheral rod-induced, horizontal-cell hyperpolarizations, conducted centripetally to the horizontal-cell dendritic terminals via gap junctions, can enhance postsynaptic cone responses. Background-induced flicker enhancement also depends on the size of the test stimulus. We explore this with a spatial model that includes the thousands of horizontal cell processes (dendritic spines) entering cone pedicles.
The role of the synapse as the center of learning and memory in neuronal systems is now widely appreciated. Synaptic plasticity occurs in both presynaptic and postsynaptic regions of the synapse, and takes place over a range of time scales. This plasticity can be viewed as a means for filtering information in the form of electrical signals. At the presynaptic terminal, synaptic depression and various forms of facilitation due to plasticity in the transmitter release mechanism are well-known sources of short-term plasticity. Another form of plasticity has been the focus of a great deal of experimental work over the past few years. This involves receptor-induced activation of G-proteins in the presynaptic terminal, which regulate calcium channels and thus the release of neurotransmitters. This form of plasticity has been observed in many nerve cells, and can be induced by a wide range of neurotransmitters and neurohormones. We will discuss the mechanism of G-protein regulation of transmitter release, and demonstrate how this mechanism can be used to filter information in a neuronal circuit. Using a minimal model suitable for neural network simulations, we will demonstrate that G-protein regulation can provide synaptic depression or facilitation, and can be the mechanism for network-based bursting oscillations.
Electrophysiological experiments have shown that activity-dependent synaptic modifications may depend on the precise timing of pre- and postsynaptic action potentials (spikes). Such spike timing-dependent plasticity (STDP) represents a quantitative extension of Hebb's rule and has profound implications for the development and function of neuronal circuits. This talk will summarize experimental studies on STDP, including the description of spike-timing windows in cell culture and other systems, as well as the most recent developments on the issue of STDP temporal integration.
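Spike-timing windows of the kind described above are often summarized by a pair-based rule with exponential lobes: pre-before-post pairings potentiate, post-before-pre pairings depress, with a magnitude that decays in the timing difference. A minimal sketch (the amplitudes and time constants here are illustrative, not values from any particular experiment):

```python
import math

def stdp_dw(delta_t, A_plus=0.01, A_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair under a pair-based
    exponential STDP window (illustrative parameters).
    delta_t = t_post - t_pre in ms: positive -> LTP, negative -> LTD."""
    if delta_t > 0:
        return A_plus * math.exp(-delta_t / tau_plus)
    if delta_t < 0:
        return -A_minus * math.exp(delta_t / tau_minus)
    return 0.0

def total_dw(pre_spikes, post_spikes):
    """All-to-all accumulation of the window over two spike trains."""
    return sum(stdp_dw(tpost - tpre)
               for tpre in pre_spikes for tpost in post_spikes)

# Pre leading post by 5 ms potentiates; the reverse ordering depresses.
ltp = total_dw([0.0, 100.0], [5.0, 105.0])
ltd = total_dw([5.0, 105.0], [0.0, 100.0])
```

Rules of this form are a common starting point; as the abstract notes, how multiple spike pairs integrate temporally is itself an experimental question.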
In many neuronal networks ranging in diversity from the crustacean STG to the CA3 region of the hippocampus, neurons are capable of maintaining phase relationships despite large changes in network frequency. In this talk, we show how short-term synaptic depression may act to promote phase maintenance. Using a simple model of an oscillator coupled to a follower neuron by a depressing inhibitory synapse, we show how the time to firing of the follower is a function of synaptic strength. For a depressing synapse, synaptic strength changes as a function of frequency. As a result, we obtain a network in which phase maintenance is roughly achieved over a 4-fold change in frequency. We will contrast the ability of our network to maintain phase against those with non-depressing synapses, and also those in which intrinsic currents play a prominent role.
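The frequency dependence of synaptic strength at a depressing synapse can be sketched with a standard resource model (a Tsodyks-Markram-style formulation with illustrative parameters, not the specific oscillator-follower synapse in the talk):

```python
import math

def efficacy_ss(rate_hz, U=0.5, tau_rec=800.0):
    """Steady-state per-spike efficacy of a depressing synapse driven
    periodically. Between spikes the resource r recovers toward 1 with
    time constant tau_rec (ms); each spike uses a fraction U of it.
    The fixed point of r_{n+1} = 1 + ((1-U)*r_n - 1)*exp(-T/tau_rec) is
        r* = (1 - e^{-T/tau_rec}) / (1 - (1-U) e^{-T/tau_rec}),
    and the transmitted strength per spike is U * r*."""
    T = 1000.0 / rate_hz            # interspike interval in ms
    e = math.exp(-T / tau_rec)
    return U * (1.0 - e) / (1.0 - (1.0 - U) * e)

# Efficacy falls steeply with presynaptic rate, which is what allows a
# depressing synapse to translate a frequency change into a strength
# change and thereby compensate phase.
low, mid, high = (efficacy_ss(r) for r in (1.0, 10.0, 40.0))
```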
One of the major simplifying assumptions in many large-scale models of cortical tissue is that the interactions between cell populations are invariant under the action of the Euclidean group of rigid body motions in the plane. Euclidean symmetry plays a key role in determining the types of activity patterns and waves that can be generated in these cortical networks. However, the assumptions of homogeneity and isotropy are no longer valid when the detailed microstructure of cortex is taken into account. In fact, cortex has a distinctly crystalline-like structure at the mm length-scale, as exemplified by the patchy nature of long-range horizontal (and feedback) connections in primary visual cortex. These patchy connections are correlated with a number of periodically repeating feature maps, in which local populations of neurons respond preferentially to stimuli with particular properties such as orientation, spatial frequency and left/right eye (ocular) dominance. In this talk we present some recent analytical results regarding the large-scale dynamics of cortex in the presence of periodically modulated long-range interactions.
How is the instantaneous firing rate modulated by inputs with sinusoidal components at arbitrary frequencies in the presence of realistic noise? This question can be addressed analytically in several spiking neuron models.
The firing rate modulation of the leaky integrate-and-fire neuron shows a resonant peak at the background firing rate of the neuron that disappears at high noise, and the high-frequency behavior is shown to depend strongly on the correlation time constant of the noise. The response decays as 1/sqrt(f) at high frequencies for white noise, while it stays finite for colored noise.
Several real neurons and several models based on the Hodgkin-Huxley formalism exhibit subthreshold resonance properties that are not present in integrate-and-fire neurons. To understand how this subthreshold resonance affects the firing rate modulation, a reduced two-variable generalized integrate-and-fire neuron which exhibits such subthreshold resonances is introduced. The computation of its firing-rate response shows that it has a resonant peak at the subthreshold preferred frequency only in the regime of strong noise.
Lastly, the influence of spike generation mechanisms can also be studied analytically using a nonlinear version of the leaky integrate-and-fire model. A simplified 'fast sodium current' with instantaneous kinetics and a nonlinear dependence on voltage is used. The model can be seen as a generalization of the 'quadratic' neuron. The high-frequency response of such models decays as 1/f to a power that depends on the nonlinearity of the 'active current': the quadratic model decays as 1/f^2, while a model with an exponential 'active current' decays as 1/f.
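The exponential 'active current' referred to above corresponds to the exponential integrate-and-fire model. A minimal time-domain sketch is below (arbitrary units, illustrative parameters; the frequency-response results themselves come from analytical calculations, not from this simulation):

```python
import math

def simulate_eif(I=2.0, delta_T=1.0, v_T=1.0, tau=10.0,
                 v_reset=-1.0, v_spike=20.0, dt=0.01, t_max=500.0):
    """Euler simulation of an exponential integrate-and-fire neuron,
        tau * dv/dt = -v + delta_T * exp((v - v_T)/delta_T) + I,
    with a spike registered when v crosses v_spike, then v -> v_reset.
    Units and parameter values are arbitrary/illustrative."""
    v = v_reset
    spike_times = []
    for step in range(int(t_max / dt)):
        dv = (-v + delta_T * math.exp((v - v_T) / delta_T) + I) / tau
        v += dt * dv
        if v >= v_spike:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# With suprathreshold drive the model fires tonically. The 'quadratic'
# neuron corresponds to replacing the exponential nonlinearity by a
# quadratic one, which slows the spike upswing and steepens the
# high-frequency decay of the rate response (1/f^2 instead of 1/f).
spikes = simulate_eif()
```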
Finally, the consequences of these results on the synchronization properties of large recurrent networks in the presence of noise are discussed briefly.
Experiments show that synapses can either increase their strength (LTP), decrease their strength (LTD) or do nothing at all, depending on the temporal relation between pre- and post-synaptic spiking activity. It remains a puzzle how neurons are able to discriminate spike timing so precisely, if indeed they do. Here, I will first discuss some constraints that must be satisfied by any biophysical model aiming to explain STDP. I will then present a model of neuronal ionic and molecular dynamics that seems to account for the experimental results.
There are two types of synapses in the nervous system: chemical synapses, which use diffusible extracellular molecules to transmit signals between one cell and another, and electrical synapses, which consist of intercytoplasmic channels that allow ionic current to flow between cells. Chemical synapses are ubiquitous in the mammalian brain. Electrical synapses had seemed to be quite rare, but new molecular and physiological data suggest that electrical synapses are far more widespread than suspected just a few years ago. Electrical synapses now seem to be a major feature of the neural circuitry in, among other regions, the neocortex, hippocampus, thalamus, striatum, cerebellum, retina, hypothalamus, brainstem, and spinal cord. I will describe studies of electrical synapses between four distinct sets of neurons in the neocortex, the thalamus, and the inferior olive of the brainstem. The molecular and biophysical characteristics of these four sets of electrical synapses are surprisingly similar. Now that we appreciate their presence and properties, the greatest challenge is to identify the functions of electrical synapses. I will discuss some of the possibilities, namely that they serve to coordinate the subthreshold and spiking activity of specific sets of neurons, and that they play a role in the generation and synchrony of neuronal rhythms.
The minimal integrate-and-fire-or-burst (IFB) neuron model reproduces the salient features of experimentally observed thalamocortical relay neuron response properties, including the temporal tuning of both tonic spiking (i.e., conventional action potentials) and post-inhibitory rebound bursting mediated by a low-threshold calcium current. I will talk about the observed stimulus dependence of burst versus tonic response of the periodically forced IFB neuron model using the language of Arnol'd tongues. I will also discuss a spatially structured network of IFB neurons, which may be interpreted as a model for excitatory thalamocortical neurons and inhibitory neurons in the thalamic reticular nucleus. A firing rate reduction of the spiking IFB system is used to elucidate the mechanisms for rhythm generation, the response to drifting gratings and wave propagation in such a network.
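A minimal sketch of the IFB mechanism in code, loosely following published IFB formulations (the parameter values are illustrative, not those used in the talk): a hyperpolarizing step de-inactivates the low-threshold calcium variable, so release from inhibition produces a rebound burst.

```python
def simulate_ifb(I_ext, t_max=600.0, dt=0.05):
    """Integrate-and-fire-or-burst neuron:
        C dV/dt = I_ext(t) - g_L*(V - V_L) + g_T*m_inf(V)*h*(V_T - V)
        m_inf(V) = 1 if V > V_h else 0           (instantaneous activation)
        dh/dt = -h/tau_m       if V > V_h        (inactivation)
              = (1 - h)/tau_p  otherwise         (de-inactivation)
    A spike is registered when V >= V_theta, then V -> V_reset."""
    C, g_L, V_L = 2.0, 0.035, -65.0
    g_T, V_T, V_h = 0.07, 120.0, -60.0
    tau_m, tau_p = 20.0, 100.0
    V_theta, V_reset = -35.0, -50.0
    V, h = -59.0, 0.0          # start in the depolarized resting state
    spikes = []
    for step in range(int(t_max / dt)):
        t = step * dt
        m = 1.0 if V > V_h else 0.0
        dV = (I_ext(t) - g_L * (V - V_L) + g_T * m * h * (V_T - V)) / C
        dh = -h / tau_m if V > V_h else (1.0 - h) / tau_p
        V += dt * dV
        h += dt * dh
        if V >= V_theta:
            spikes.append(t)
            V = V_reset
    return spikes

# A hyperpolarizing step (t = 100-300 ms) lets h recover; on release the
# rebound depolarization activates the T-current and triggers a burst,
# even though the steady background current alone produces no spikes.
def drive(t):
    return 0.21 - 0.5 if 100.0 <= t < 300.0 else 0.21

burst = simulate_ifb(drive)
```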
Plasticity in neural oscillators has not been explored in much detail. In this talk I describe two different types of plasticity and their role in facilitating the synchronization of coupled oscillators. In the first part of the talk, I discuss a problem motivated by the synchronization of certain species of fireflies. Pteroptyx malaccae is known to synchronize its flash to a strobe light in such a way that the phase-lag is eventually zero. It does this by altering its intrinsic frequency through a slow process. We show the consequences of this in a large locally coupled network of oscillators with a range of intrinsic frequencies and demonstrate how this leads to global synchrony. In the second part of the talk, we show that under rather general circumstances a spike-time dependent plasticity rule acting on the connection strengths of weakly coupled oscillators can lead to global synchrony.
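The firefly-style mechanism in the first part can be sketched with a single phase oscillator whose intrinsic frequency slowly adapts toward a periodic stimulus (a minimal sketch with illustrative parameters, not the locally coupled network of the talk):

```python
import math

def adapt_sync(omega0, Omega, K=0.5, eps=0.05, dt=0.01, t_max=300.0):
    """Phase oscillator with slow frequency adaptation:
        d(theta)/dt = omega + K*sin(Omega*t - theta)   (fast phase pull)
        d(omega)/dt = eps*sin(Omega*t - theta)         (slow adaptation)
    With eps > 0, omega converges to the stimulus frequency Omega and the
    steady-state lag goes to zero; with eps = 0 the oscillator locks with
    a residual lag arcsin((Omega - omega0)/K)."""
    theta, omega, t = 0.0, omega0, 0.0
    for _ in range(int(t_max / dt)):
        d = Omega * t - theta
        theta += dt * (omega + K * math.sin(d))
        omega += dt * (eps * math.sin(d))
        t += dt
    lag = ((Omega * t - theta + math.pi) % (2 * math.pi)) - math.pi
    return lag, omega

# With adaptation the lag vanishes; without it a residual lag remains.
lag, omega_final = adapt_sync(omega0=2*math.pi*0.5, Omega=2*math.pi*0.55)
lag_fixed, _ = adapt_sync(omega0=2*math.pi*0.5, Omega=2*math.pi*0.55, eps=0.0)
```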
In the mammalian retina, highly correlated activity is present weeks before vision in the form of spontaneous waves of action potentials recorded from retinal ganglion cells. This activity is required for the normal patterning of retinal ganglion cell axon arbors in the developing thalamus. Recordings from retinal cells participating in the waves demonstrate that wave generation requires synaptic activation, indicating that the developing network consists of various cell types connected through excitatory chemical synapses. Fluorescence imaging has revealed that the propagating activity consists of spatially restricted domains of activity that form a mosaic pattern over the entire retina. The spatial properties of waves are not determined by fixed structural units within the retina; rather, they are determined by the past history of wave activity. A biophysical model of the network based on known anatomical and physiological properties of the developing retina reproduces the spatiotemporal properties measured experimentally and indicates that these properties are determined by a single variable describing the local excitability of the network. Consistent with this hypothesis, pharmacological manipulations that alter local excitability also alter the spatiotemporal properties of waves. This approach to describing the developing retina provides unique insight into how the organization of a neural circuit can lead to generation of complex, correlated activity patterns required for the normal development of the nervous system.
We study the propagation of traveling solitary pulses in one-dimensional networks of excitatory and inhibitory neurons. Each neuron is represented by the integrate-and-fire model, and is allowed to fire only one spike. Two types of propagating pulses are observed. During fast pulses, inhibitory neurons fire a short time before or after the excitatory neurons. During slow pulses, inhibitory cells fire well before neighboring excitatory cells, and potentials of excitatory cells become negative and then positive before they fire. Fast pulses can propagate at low levels of inhibition, are affected by fast excitation but are almost unaffected by slow excitation, and are easily elicited by stimulating groups of neurons. In contrast, slow pulses can propagate at intermediate levels of inhibition, and are difficult to evoke. They can propagate without slow excitation, but slow excitation makes their propagation substantially more robust. We suggest that the fast and slow pulses observed in our model correspond to the fast and slow propagating activity observed in experiments on neocortical slices.
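A stripped-down, excitation-only sketch of single-spike pulse propagation in such a chain is below (it omits the inhibitory population and the slow excitation that distinguish fast from slow pulses in the model described above; all parameters are illustrative):

```python
import numpy as np

def chain_pulse(N=200, footprint=5, w=0.9, tau=10.0, v_th=1.0,
                dt=0.1, t_max=100.0):
    """1-D chain of leaky integrate-and-fire cells, each allowed to fire
    once. A spike delivers instantaneous voltage kicks w*exp(-d/2) to
    unfired cells at distance d <= footprint. Cells 0-2 are stimulated
    at t = 0, launching a pulse that travels down the chain."""
    v = np.zeros(N)
    spike_time = np.full(N, np.nan)
    spike_time[:3] = 0.0
    to_deliver = [0, 1, 2]
    for step in range(1, int(t_max / dt)):
        for c in to_deliver:                       # synaptic kicks
            for d in range(1, footprint + 1):
                for j in (c - d, c + d):
                    if 0 <= j < N and np.isnan(spike_time[j]):
                        v[j] += w * np.exp(-d / 2.0)
        v -= dt * v / tau                          # leak
        crossed = np.where(np.isnan(spike_time) & (v >= v_th))[0]
        spike_time[crossed] = step * dt
        to_deliver = crossed.tolist()
    return spike_time

times = chain_pulse()   # spike time of each cell; monotone along the chain
```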
The existence of electrical synapses (ES) has recently been established in many regions of the mammalian brain. It has also been found that the spikes fired by interneurons interconnected with ES may become tightly synchronized. Here we investigate theoretically the conditions for the emergence of synchronous activity in large networks of neurons coupled with ES. We consider two models. In the first, which is analytically tractable, the neurons are fully connected and are modeled with the "quadratic integrate-and-fire" dynamics, a good approximation for the subthreshold behavior of a large class of neurons. The second model consists of randomly connected conductance-based neurons in which the voltage time course and the shape of the linear response function of the neuron to small perturbations can be controlled by potassium currents and a persistent sodium current. We investigate analytically and numerically how the stability of the asynchronous state (AS) depends on the size of the action potentials fired by the neurons, on the after-hyperpolarization that follows them, and on the duration of the refractory period. We predict that potassium currents promote synchrony mediated by ES, whereas sodium currents oppose it.
The dendrites of hippocampal CA1 pyramidal neurons receive inputs from tens of thousands of excitatory and inhibitory synapses. The dendrites must coordinate and blend these inputs to produce an output in what is called synaptic integration. The dendrites also participate in the dynamic adjustment of the synaptic strengths of these inputs during synaptic plasticity. Dendrites were previously thought to be mostly passive structures that provided some form of algebraic summation of excitatory and inhibitory inputs. Using new techniques of dendritic patch clamp recordings and fluorescence imaging, a great deal of new information is now available concerning how dendrites perform synaptic integration and participate in synaptic plasticity. Using cell-attached patch recordings from dendrites of CA1 neurons, we have mapped the distribution and characterized the properties of voltage-gated Na+, Ca2+, and K+ channels along the apical dendrites. We found that the density of Na+ channels is approximately the same from the initial segment of the axon, through the soma, and up to at least the first 350 µm of the apical dendrites. The total density of voltage gated Ca2+ channels is also about the same from the soma up to 350 µm from the soma. There are at least 5 different types of Ca2+ channels, however, and we found that these were distributed differentially along the soma-dendritic axis. For example, the L- and N-types are at a higher density in the soma and proximal dendrites while the R- and T-types are at a higher density in the distal dendrites. We also studied dendritic K+ channels. We found that there is a fast, transient, A-type K+ channel in the dendrites. Surprisingly, the density of this channel increases dramatically with distance from the soma so that its density at 350 µm is about 5-fold higher than that in the soma. This channel activates rapidly and limits the amplitude of back-propagating, dendritic action potentials as well as synaptic potentials.
Recently, we found that this K+ channel is modulated by several protein kinases. PKA and PKC both shift the voltage range of activation of the channel to more positive potentials thereby reducing the activity of these K+ channels at any given membrane potential and increasing the amplitude of synaptic potentials and back-propagating action potentials. The actions of both of these kinases appear to be upstream of MAPK. We also found that the K+ channels can be inactivated by brief trains of synaptic input. Synaptic input can thus produce an increase in the amplitude of back-propagating action potentials on the specific branch receiving the input. If the synaptic input is appropriately timed with the dendritic action potential, long-term potentiation is induced. We thus hypothesize that these K+ channels play a role in spike-timing dependent LTP. Furthermore, we have found local increases in excitability following the induction of LTP, which may be partly responsible for the phenomenon of E-S potentiation, and we hypothesize that this increase in excitability is due to local decreases in K+ channel activity. In conclusion, dendrites are not passive structures, but contain a vast array of voltage-gated ion channels. These channels play important roles in synaptic integration and both the induction and expression of various forms of synaptic plasticity.
The sensory system of animals is of limited value without the participation of the elaborate motor apparatus that moves the sensors into useful positions. I will focus on behavioral and computational aspects of the vibrissa somatosensory system in rat, and review the experimental evidence for phase-sensitive detection as a model for discriminating contact with an object and as a means to control the position of the vibrissae. A theme of the talk is that principles from communication and control engineering provide a framework to guide experiments.
The nervous system produces many different rhythms associated with different behavioral contexts. This talk focuses on the different biophysical mechanisms associated with coherence of the different rhythms and transitions among them.
Fast-spiking interneurons in the cortex are connected by both inhibitory synapses and electrical synapses. We are only beginning to understand how intrinsic properties and two types of coupling interact to produce network dynamics. In this talk, I will consider oscillating pairs of leaky integrate-and-fire (LIF) cells that are connected by inhibition and electrical coupling, and I will describe how phase-locked states depend on intrinsic frequency and relative coupling strengths. The phase-locking results for the integrate-and-fire model will be compared to preliminary in vitro experiments on pairs of fast-spiking cells (from the laboratory of Dr. Barry Connors). Finally, I will discuss the possible implications of the results for the function of fast-spiking interneuronal networks.
When depressing synapses are embedded in a circuit composed of a pacemaker neuron and a neuron with no autorhythmic properties, the network can show two modes of oscillation. In one mode the synapses are mostly depressed and the oscillations are dominated by the properties of the oscillating neuron. In the other mode, the synapses recover from depression and the oscillations are largely controlled by the synapses. We demonstrate the two modes of oscillation in a hybrid circuit consisting of a biological pacemaker and a model neuron, reciprocally coupled via model depressing synapses. We show that across a wide range of parameter values this network shows robust bistability of the oscillation mode, and that it is possible to switch the network from one mode to the other by injection of a brief current pulse in either neuron. The underlying mechanism for bistability may be present in many types of circuits with reciprocal connections and synaptic depression.
Reconstructing the connectivity patterns of neural networks in higher organisms has been a formidable challenge. Most neurophysiology data consist only of spike times, and current analysis methods are unable to resolve the ambiguity in connectivity patterns that could lead to such data. We present a new method that can determine the presence of a connection between two visual neurons from the spike times of the neurons in response to spatiotemporal white noise. The method successfully distinguishes such a direct connection from common input originating from other, unmeasured neurons. Although the method is based on a highly idealized linear-nonlinear approximation of neural response, we demonstrate via simulation that the approach can work with a more realistic, integrate-and-fire neuron model. We propose that the approach exemplified by this analysis may yield viable tools for reconstructing visual neural networks from data gathered in neurophysiology experiments.
I will present a set of integrodifferential equations, derived from known biophysical properties of cerebral cortex, and with solutions that describe activity waves that occur under some pathological conditions. One approach for establishing the existence of wave solutions uses singular perturbation analysis, which assumes that the dynamics underlying each stage of the wave is approximately independent from the others. In the laboratory, this assumption becomes an explicit experimental prediction. I will present data demonstrating that real seizure-like activity waves, measured in vitro using cortical slices, consist of three stages - initiation, propagation, and termination - each governed by a distinct set of dynamics within the underlying neural circuitry. Examining the data more closely will reveal new possible avenues of investigation for understanding the dynamics of each stage individually. I will also present several intriguing experimental results suggesting other new directions for analysis of the original system of equations.
Many developing circuits show spontaneous oscillations. We study models for the slow episodic population rhythms (time scale: minutes) that are seen in chick embryonic spinal cord. We use mean-field models for the population firing rate in a recurrent network of excitatory cells. Geometric singular perturbation methods are used to analyze the models. The primary candidate for the slow negative feedback mechanism that sets the burst period is synaptic depression. The individual units have simple tonic firing properties. Specific predictions based on the model about how the rhythm is affected by brief stimuli that switch the system from the quiescent to the active phase have now been confirmed in experiments. A positive correlation was found between episode duration and the preceding inter-episode interval, but not with the following interval, suggesting that episode onset is stochastic while episode termination occurs deterministically, when network excitability falls to a fixed level. We also predicted, and confirmed experimentally, that during glutamatergic blockade the inter-episode interval increases and the network operates in a range of lessened depression, i.e., at increased network excitability. We also formulate and analyze a minimal model that demonstrates the plausibility of a specific mechanism for depression: the slow modulation of the synaptic reversal potential (for the GABA synapses, which are depolarizing at this stage of development). Preliminary results show that a cell-based network (integrate-and-fire units) with synaptic depression can also alternate between phases of active firing and quiescence. (with J Tabak, M O'Donovan, B Vladimirski)
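The core mechanism, fast recurrent excitation gated by a slow activity-dependent depression variable, can be sketched as a two-variable relaxation oscillator (a sketch in the spirit of such mean-field models; all parameters are illustrative, not fitted to the spinal cord data):

```python
import math

def episodic_rhythm(t_max=400.0, dt=0.02):
    """Mean-field rate model with slow synaptic depression:
        tau_a * da/dt = f(w*s*a) - a         fast population activity
        tau_s * ds/dt = (1 - s) - k*a*s      slow depression/recovery
        f(u) = 1 / (1 + exp(-(u - theta)/kf))
    During an episode (a high) the efficacy s runs down until the active
    state disappears; during quiescence s recovers until the network
    re-ignites. Returns the episode onset times."""
    tau_a, tau_s, w, k, theta, kf = 1.0, 50.0, 1.0, 4.0, 0.18, 0.05
    a, s = 0.0, 1.0
    onsets, prev_high = [], False
    for step in range(int(t_max / dt)):
        u = w * s * a
        f = 1.0 / (1.0 + math.exp(-(u - theta) / kf))
        a += dt * (f - a) / tau_a
        s += dt * ((1.0 - s) - k * a * s) / tau_s
        high = a > 0.5
        if high and not prev_high:
            onsets.append(step * dt)   # upward crossing = episode onset
        prev_high = high
    return onsets

onsets = episodic_rhythm()
```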
From experimental data, one can attempt to extract spike-timing-dependent plasticity (STDP) "rules" that operate on synapses in various systems. One can then apply these rules in models and derive information about the rules' consequences, such as asymptotic limits for synaptic weights and neuronal firing patterns in the systems. I will discuss how different rules lead to different consequences in some cases, and to similar consequences in others. I will use these examples to argue that biological details must be understood before the computational implications of STDP can be fully appreciated.
Neocortical neurons recorded in vivo are subject to considerable synaptic "noise", which reflects the activity of the network and may profoundly affect the integrative properties of these cells. We examined this issue by using models based on morphological reconstructions of neocortical pyramidal neurons and biophysical representations of synapses and voltage-dependent currents. Results from intracellular recordings during active states were used to constrain models of synaptic noise caused by presynaptic network activity. These experiments show that in vivo conditions are characterized by a stochastic intracellular activity which markedly shapes the neuronal dynamics. We analyze the integrative mode of the neurons in these conditions and examine issues such as the impact of dendritic structure on the efficiency of synaptic inputs, coincidence detection and the detection of correlations in the synaptic noise. We conclude that cortical neurons function in a radically different integrative mode in vivo, which may have profound consequences for the type of information processing taking place in neocortex.
The time evolution of visual responses of neurons in V1 cortex is revealing. V1 neurons studied individually exhibit time-dependent sensitivity and selectivity for orientation and spatial frequency. This implies an important role for inhibitory interactions in the production of selectivity. From a theory of the network dynamics, one finds that the V1 network causes neurons to be "overdamped" in a high conductance state during visual stimulation, making these neurons into coincidence detectors. Nevertheless, the time-averaged spike rate of a V1 neuron is an approximately linear function of its net synaptic input.
So-called Simple cells in the primary visual cortex (V1) respond to visual stimulation in a roughly linear way, while Complex cells do not. This longstanding classification -- the basis for the influential hierarchical model of Hubel & Wiesel -- is far from sharp; recent experiments show that most cortical cells lie somewhere on a continuum between Simple and Complex. My collaborators and I have posed and studied an "egalitarian" model of V1, based on the local architecture of a V1 hypercolumn, in which all cortical cells are coupled nonspecifically within the network. I show that requiring the total synaptic weight on each cell to be constant, though divided between geniculate and network couplings, leads to broad response distributions like those found in experiment, and rationalizes several aspects of the experimental data.
One of the first models for bursting electrical activity was developed by Chay and Keizer. It was based on the behavior of insulin-secreting pancreatic beta-cells but has been extended and modified to cover a number of neural systems, including pacemaker cells of the pre-Bötzinger complex (Butera et al.), thalamic neurons (Hindmarsh and Rose; Rush and Rinzel), pituitary somatotrophs (Li, Van Goor, Stojilkovic), and hippocampal pyramidal cells (Pinsky and Rinzel; Wang and Kepecs). The unifying feature of these models is hysteresis of steady states. However, one of the key predictions of the model, a slowly rising and falling intracellular calcium concentration, has not held up for the very slowly bursting beta-cells. We show how the spirit of the model can be retained, but with important differences in detail, by introducing one or more additional internal calcium compartments.
Experimental and modeling studies of the neural oscillator generating the rhythm of breathing in the mammalian brainstem are providing insights into cellular and network-level mechanisms generating rhythms in motor pattern generation networks. We have developed a hybrid pacemaker-network model of the respiratory oscillator that represents a synthesis of cellular and network mechanisms derived from experimental and modeling studies. This model incorporates a rhythm-generating neuronal kernel, located in the pre-Bötzinger complex of the ventrolateral medulla, consisting of a network of excitatory neurons with state (voltage)-dependent, oscillatory bursting/pacemaker-like properties. This kernel has been experimentally isolated in several in vitro preparations from neonatal rodents including thin brainstem slices with a functionally intact, active rhythm-generating network. We have exploited these in vitro systems for analysis of cellular biophysical mechanisms and population-level dynamics in the kernel by a combination of single-cell patch-clamp electrophysiological recording, activity-dependent neuron/population imaging and recording of population activity. Simulations with mathematical models of the pacemaker cell network are consistent with a number of features of measured cell and population rhythmic behavior that will be discussed in the talk, including the following. (1) Cellular biophysical mechanisms of oscillatory burst generation. Electrophysiological studies show that candidate rhythm-generating cells exhibit intrinsic voltage-dependent bursting behavior with burst frequencies spanning over an order of magnitude (0.05 to ~1 Hz), providing a mechanism for cellular-level frequency control.
This behavior is mimicked by our biophysically minimal models incorporating Hodgkin-Huxley-like membrane conductances, where bursting arises via fast activation-slow inactivation of a subthreshold voltage-activated persistent sodium current (INaP) that dynamically interacts with a potassium-dominated leak current. Our voltage-clamp measurements have demonstrated INaP in bursting cells, and dynamic clamp studies incorporating our modeled INaP in neurons confirm that this mechanism is sufficient for voltage-dependent oscillatory burst generation. (2) Synaptic coupling and burst synchronization. Electrophysiological and imaging studies indicate that cellular burst synchronization in the kernel arises from fast, glutamatergic excitatory synaptic coupling. Modeling studies of heterogeneous populations of synaptically-coupled bursting neurons (as described above) indicate that burst synchronization across the population is promoted by burst-generating currents and can occur to produce stable rhythms even when only a small fraction of the cells in the population are intrinsically bursting. Population bursting frequency is modulated by synaptic coupling strength. (3) Cellular/population frequency control and dynamic range. Experimentally, tonic excitation regulates single-cell and population bursting frequency; population bursting exhibits a wider dynamic range of frequency control by tonic excitation. Population model simulations mimic this and indicate that heterogeneity of cellular bursting parameters and excitatory coupling synergistically combine to determine dynamic range. (4) Multiple oscillatory modes and quasiperiodic dynamics. Measurements of population activity combined with nonlinear system dynamics analysis indicate that the kernel intrinsically exhibits multiple periodic states as frequency is driven experimentally by tonic excitation.
Stable periodic behavior occurs with low excitation, progresses to mixed-mode oscillations, and transitions to quasiperiodic behavior at high excitation levels. Population simulations indicate that weak synaptic coupling and extreme parameter heterogeneity, leading to partial desynchronization of cellular bursting, can give rise to mixed-mode oscillations and quasiperiodic states.
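The INaP/leak bursting mechanism described above can be sketched in a minimal single-cell model in the style of Butera, Rinzel and Smith. The parameter values and gating functions below are illustrative assumptions, not the fitted values of the talk's model; with these choices the cell depolarizes spontaneously and fires bursts paced by slow INaP inactivation.

```python
import math

# Minimal pre-Botzinger-style burster: a persistent sodium current (INaP)
# with slow inactivation interacting with a potassium-dominated leak.
# Parameters are illustrative (Butera-style), not fitted to data.
C = 21.0                                    # pF
gNa, gK, gNaP, gL = 28.0, 11.2, 2.8, 2.8    # nS
ENa, EK, EL = 50.0, -85.0, -57.5            # mV (EL assumed in bursting range)

def sig(v, th, k):                          # Boltzmann steady-state function
    return 1.0 / (1.0 + math.exp((v - th) / k))

def m_inf(v):  return sig(v, -34.0, -5.0)   # fast Na activation (instantaneous)
def n_inf(v):  return sig(v, -29.0, -4.0)   # delayed-rectifier K activation
def mp_inf(v): return sig(v, -40.0, -6.0)   # INaP activation (fast)
def hp_inf(v): return sig(v, -48.0, 6.0)    # INaP slow inactivation

def tau_n(v):  return 10.0 / math.cosh((v + 29.0) / 8.0)      # ms
def tau_hp(v): return 10000.0 / math.cosh((v + 48.0) / 12.0)  # ms (very slow)

dt = 0.05                                   # ms
v, n, hp = -60.0, 0.0, 0.9
spikes, prev_above = 0, False
vmin, vmax = v, v
for _ in range(int(6000.0 / dt)):           # 6 s of simulated time
    INa  = gNa * m_inf(v) ** 3 * (1.0 - n) * (v - ENa)
    IK   = gK * n ** 4 * (v - EK)
    INaP = gNaP * mp_inf(v) * hp * (v - ENa)
    IL   = gL * (v - EL)
    v += dt * (-(INa + IK + INaP + IL)) / C
    # exponential Euler keeps the gating variables stable and in [0, 1]
    n  = n_inf(v)  + (n  - n_inf(v))  * math.exp(-dt / tau_n(v))
    hp = hp_inf(v) + (hp - hp_inf(v)) * math.exp(-dt / tau_hp(v))
    vmin, vmax = min(vmin, v), max(vmax, v)
    above = v > -20.0
    if above and not prev_above:
        spikes += 1                         # count upward spike crossings
    prev_above = above

print(spikes, round(vmin, 1), round(vmax, 1))
```

Varying EL (the potassium-dominated leak reversal) moves this model between silent, bursting, and tonic regimes, which is the cellular-level frequency-control knob referred to in the text.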
Activity patterns in excitatory-inhibitory networks are analyzed using geometric singular perturbation methods. The networks are motivated by models for thalamic sleep rhythms and neuronal activity in the basal ganglia. The analysis is used to reduce the rather complicated neuronal models to simpler systems. Propagating patterns in two-dimensional networks are considered.
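The fast-slow structure that geometric singular perturbation theory exploits can be illustrated with the FitzHugh-Nagumo relaxation oscillator, used here only as a generic stand-in (it is not one of the thalamic or basal ganglia models of the talk). For small epsilon the trajectory collapses onto the slow manifold near the cubic nullcline, with fast jumps between its branches; these are standard textbook parameter values.

```python
# FitzHugh-Nagumo with explicit fast/slow timescale separation.
# Standard illustrative parameters; the small parameter eps makes w slow.
eps = 0.08                 # timescale separation
a, b, I = 0.7, 0.8, 0.5    # classic oscillatory regime for this drive
dt, T = 0.01, 400.0

v, w = -1.0, 1.0
crossings, prev_pos = 0, False
for _ in range(int(T / dt)):
    dv = v - v ** 3 / 3.0 - w + I   # fast subsystem (cubic nullcline)
    dw = eps * (v + a - b * w)      # slow subsystem
    v += dt * dv
    w += dt * dw
    pos = v > 0.0
    if pos and not prev_pos:
        crossings += 1              # one upward crossing per relaxation cycle
    prev_pos = pos

print(crossings)
```

Setting eps to zero and solving the fast subsystem with w frozen is exactly the kind of reduction the singular perturbation analysis formalizes for the more complicated neuronal models.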
A major outstanding problem in sensory physiology is to understand how the response of a retinal rod to a single photon shows so little variation in amplitude and kinetics, despite the fact that it is mediated by the activity of a single molecule of activated rhodopsin (R*). In the dark, there is a circulating current across the rod membrane, which flows inward through channels in the outer segment membrane gated by cyclic-GMP, and flows outward across the inner segment membrane. The modulation of the voltage across the rod membrane in response to light is a consequence of the reduction of this dark (light-sensitive) current by the following mechanism. When rhodopsin absorbs a photon it is converted into an activated enzyme (R*) that initiates a cascade of biochemical reactions: G-protein is activated by R*; G-protein activates phosphodiesterase; phosphodiesterase hydrolyzes cyclic-GMP; the cyclic-GMP concentration drops, resulting in closure of some of the light-sensitive channels; and the reduction of inward current hyperpolarizes the rod membrane. The sequence of events following the activation of rhodopsin continues until R* is inactivated. The regularity of the single-photon response implies that the lifetime of R* is controlled with high precision. Numerous theories have been proposed. All are based in part on known biochemical elements of the transduction cascade, and some also include additional hypothetical mechanisms. Liebman and Gibson recently proposed a theory based on their biochemical experiments. The idea is that R* is partially deactivated by multiple steps of phosphorylation catalyzed by rhodopsin kinase, followed by an irreversible "capping" reaction in which R* is completely inactivated by arrestin. In this theory, three molecules, rhodopsin kinase, G-protein and arrestin, compete in a mutually exclusive manner for R*. Every time R* is phosphorylated, its affinity for G-protein (i.e. 
its catalytic activity) is reduced, its affinity for rhodopsin kinase is reduced, and its affinity for arrestin is increased. I will explain how some statistics of the single-photon responses can be derived analytically in this theory. I will also demonstrate by Monte Carlo simulation the extent to which the proposed mechanism reduces variability in the single-photon response. Deficiencies of the theory will be demonstrated, and alternative mechanisms will be discussed and evaluated.
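The variance-reduction logic behind multistep shutoff can be illustrated with a toy Monte Carlo calculation. This is a deliberate simplification of the Liebman-Gibson scheme: each of n phosphorylation steps is treated as an independent exponential stage, and the graded decline of catalytic activity is ignored. The R* lifetime is then gamma-distributed, and its coefficient of variation falls as 1/sqrt(n).

```python
import random, statistics

def rstar_lifetime(n_steps, total_rate=1.0):
    """Lifetime of R* when shutoff requires n sequential exponential steps.
    Each step's rate is scaled so the mean lifetime is the same for every n."""
    return sum(random.expovariate(n_steps * total_rate) for _ in range(n_steps))

def cv(samples):
    """Coefficient of variation: std / mean."""
    return statistics.stdev(samples) / statistics.mean(samples)

random.seed(1)
trials = 20000
cv1 = cv([rstar_lifetime(1) for _ in range(trials)])    # single-step shutoff
cv10 = cv([rstar_lifetime(10) for _ in range(trials)])  # ten-step shutoff

print(round(cv1, 3), round(cv10, 3))  # expect ~1.0 and ~1/sqrt(10) ~= 0.32
```

A single exponential shutoff step (n = 1) gives a coefficient of variation of 1, far too variable to account for the observed reproducibility; the simulation shows directly how adding sequential steps tightens the lifetime distribution.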
In 1998, it was hypothesized that gap junctions existed between the axons of hippocampal pyramidal cells. This hypothesis was suggested by two experimental observations: the occurrence of 200 Hz population oscillations in neuronal networks in which synaptic transmission was blocked, but where the oscillations required gap junctions; and the shape of putative coupling potentials in principal neurons, which were too fast to be generated by gap junctions located on somata or dendrites. There is now electrophysiological and dye-coupling evidence that such gap junctions exist, and are located roughly 100 microns from the soma. Modeling shows that gap junctions in this location can give rise to very fast oscillations in networks of principal neurons, as well as to 200 Hz "ripples" (as seen in vivo, and consisting of IPSPs), when interneurons are also in the circuit. In addition, axonal gap junctions can underlie the generation of 40 Hz oscillations, in the presence of cholinergic agonists or of kainate. Modeling predicts, and experiments confirm, that in such conditions, the oscillation spectrum contains both 40 Hz and also very fast (>80 Hz) components.
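The electrotonic argument about coupling potentials can be illustrated with a toy two-compartment calculation (passive cells and purely illustrative parameters; this is not the network model of the text): a brief current pulse in one cell produces a small, attenuated "spikelet"-like potential in its gap-junction-coupled partner, with amplitude set by the ratio of junctional to leak conductance.

```python
# Two passive cells coupled by a gap junction; a brief pulse in cell A
# produces an attenuated coupling potential in cell B.
# All parameter values are illustrative assumptions.
C = 100.0      # pF membrane capacitance per cell
gL = 10.0      # nS leak conductance (voltages measured relative to rest)
gj = 1.0       # nS gap junction conductance
dt = 0.01      # ms

va, vb = 0.0, 0.0
peak_a, peak_b = 0.0, 0.0
for step in range(int(50.0 / dt)):          # 50 ms of simulated time
    t = step * dt
    Ia = 500.0 if t < 2.0 else 0.0          # 2 ms, 500 pA pulse into cell A
    Igap = gj * (va - vb)                   # junctional current, A into B
    va += dt * (Ia - gL * va - Igap) / C
    vb += dt * (Igap - gL * vb) / C
    peak_a, peak_b = max(peak_a, va), max(peak_b, vb)

print(round(peak_a, 2), round(peak_b, 3))
```

The coupled response is both attenuated and low-pass filtered by the second cell's membrane; placing the junction on an electrotonically compact axonal site, rather than soma or dendrite, is what lets the coupled potentials remain fast, as the dye-coupling and modeling evidence in the text requires.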
Visual stimuli evoke waves of activity that propagate throughout the visual cortex of freshwater turtles. These waves have been visualized using both multielectrode recording and voltage sensitive dye methods. This talk will discuss the use of a large-scale model of turtle visual cortex to study the cellular mechanisms underlying the propagation of the wave and to suggest that information about visual stimuli is encoded in the temporal dynamics of the waves.
The model consists of approximately 1,000 geniculate and cortical neurons. It is based upon the anatomical distribution of neurons in turtle visual cortex and the biophysics of individual types of cortical neurons. The model suggests that waves originate near the rostrolateral pole of the cortex due to a high density of geniculocortical synapses at that point. It reproduces features of the dynamics of the wave, such as its velocity and tendency to reflect at the caudal border of visual cortex. Analysis of real and simulated waves using a principal components method (Karhunen-Loeve decomposition) indicates that information about the position of stimuli in visual space is encoded in the dynamics of the wave in the sense that stimulus position can be reliably estimated from the dynamics of the wave using Bayesian estimation methods.
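The decoding step can be sketched in a toy form with synthetic data (this is not the cortical model itself; the site names, latency vectors, and noise level below are invented for illustration): each stimulus position is assumed to produce a characteristic pattern of wave arrival times at a few recording sites, and position is recovered from noisy observations by comparing Gaussian log-posteriors.

```python
import math, random

# Toy Bayesian position decoder. Each stimulus position is assumed to have a
# characteristic mean wave-latency vector at four recording sites (values
# below are illustrative, not measured).
LATENCY = {"rostral": [10.0, 20.0, 30.0, 40.0],
           "caudal":  [40.0, 30.0, 20.0, 10.0]}   # ms
SIGMA = 5.0                                       # latency noise, ms
PRIOR = {"rostral": 0.5, "caudal": 0.5}

def log_posterior(obs, pos):
    lp = math.log(PRIOR[pos])
    for o, mu in zip(obs, LATENCY[pos]):
        lp += -0.5 * ((o - mu) / SIGMA) ** 2      # Gaussian log-likelihood
    return lp

def decode(obs):
    """MAP estimate of stimulus position from observed latencies."""
    return max(LATENCY, key=lambda pos: log_posterior(obs, pos))

random.seed(0)
trials, correct = 1000, 0
for _ in range(trials):
    true_pos = random.choice(["rostral", "caudal"])
    obs = [mu + random.gauss(0.0, SIGMA) for mu in LATENCY[true_pos]]
    correct += decode(obs) == true_pos
accuracy = correct / trials
print(accuracy)
```

In the actual analysis, the features being decoded are the leading Karhunen-Loeve coefficients of the wave rather than raw latencies, but the Bayesian comparison of stimulus-conditioned likelihoods has the same structure.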
Over the last ten years we have gained significant insight into the role of synaptic interactions in the synchronization of neuronal networks. A crucial first step in these investigations was the study of extremely simplified networks: all-to-all coupled networks of identical neurons. The mathematical tools developed to analyse both the asynchronous and the fully synchronized state in such networks were subsequently extended to study networks with more realistic architectures. However, the long-term behavior of spatially extended networks of synaptically coupled neurons, in which the coupling strength decreases with distance, has not yet received much attention. In this talk I will consider a network of identical integrate-and-fire neurons positioned on a one-dimensional ring. I will show that strongly coupled networks of oscillators behave qualitatively differently from weakly coupled ones, and also differ qualitatively from rate-based models. Depending on the coupling parameters, such networks can evolve to either an asynchronous state or a traveling wave state. I will show how the existence and stability of these states can be analyzed in this simple model. For fast excitatory synapses, a third state co-exists with the traveling wave state. In this state the activity is highly complex and the symmetry is broken. So far, no analytical treatment of this state has been found.
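A minimal version of such a ring network can be sketched as follows. The parameters, the Gaussian distance-dependence of the coupling, and the instantaneous synapses are illustrative assumptions; the sketch sets up the geometry but does not reproduce the talk's stability analysis.

```python
import math

# Ring of identical integrate-and-fire neurons with excitatory coupling
# whose strength decays with distance on the ring. Parameters illustrative.
N = 100
tau = 20.0        # ms membrane time constant
I = 1.2           # suprathreshold constant drive (threshold = 1, reset = 0)
g = 0.02          # peak synaptic kick per presynaptic spike
sigma_d = 5.0     # coupling length scale (in neuron indices)
dt = 0.1          # ms

def ring_dist(i, j):
    d = abs(i - j)
    return min(d, N - d)          # shortest distance around the ring

W = [[g * math.exp(-ring_dist(i, j) ** 2 / (2 * sigma_d ** 2))
      for j in range(N)] for i in range(N)]

v = [0.9 * i / N for i in range(N)]   # spread of initial conditions
spike_count = [0] * N
for _ in range(int(500.0 / dt)):      # 500 ms of simulated time
    spiking = [j for j in range(N) if v[j] >= 1.0]
    for j in spiking:
        v[j] = 0.0                    # threshold-and-reset
        spike_count[j] += 1
    for i in range(N):
        v[i] += dt * (I - v[i]) / tau       # leaky integration
        for j in spiking:                   # instantaneous excitatory kicks
            if i != j:
                v[i] += W[i][j]

total = sum(spike_count)
print(total, min(spike_count))
```

Whether the network settles into an asynchronous state, a traveling wave, or the complex symmetry-broken state described in the text depends on the coupling strength g, its spatial footprint sigma_d, and the synaptic time course, which is taken as instantaneous here for simplicity.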
Visual oscillations can occur in response to certain ambiguous stimuli, and both oscillations and traveling waves occur in binocular rivalry and migraine auras. After presenting the relevant data, I will develop neural models at both the individual action potential level and the spike-rate level to interpret and explain these phenomena. These models include a two-level model for binocular rivalry in which the first level can be dynamically defeated by appropriate stimulus manipulation.
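A standard reduced form of a rivalry circuit, mutual inhibition between two populations with slow adaptation, can be sketched as follows. This is a generic rate model with assumed parameters, not the two-level model of the talk; it shows the basic mechanism by which dominance alternates between the two eyes' representations.

```python
import math

# Two rate units with mutual inhibition and slow adaptation: the classic
# reduced description of perceptual alternations. Parameters are assumptions.
def F(x, k=0.05):                      # steep sigmoid firing-rate function
    return 1.0 / (1.0 + math.exp(-x / k))

tau, tau_a = 10.0, 200.0               # ms: fast rates, slow adaptation
I, beta, phi = 0.5, 1.0, 0.7           # drive, cross-inhibition, adaptation
dt = 0.5                               # ms

u1, u2, a1, a2 = 1.0, 0.0, 0.0, 0.0    # start with population 1 dominant
switches = 0
prev_dom = u1 > u2
for _ in range(int(4000.0 / dt)):      # 4 s of simulated time
    du1 = (-u1 + F(I - beta * u2 - phi * a1)) / tau
    du2 = (-u2 + F(I - beta * u1 - phi * a2)) / tau
    a1 += dt * (u1 - a1) / tau_a       # adaptation tracks each unit's rate
    a2 += dt * (u2 - a2) / tau_a
    u1 += dt * du1
    u2 += dt * du2
    dom = u1 > u2
    if dom != prev_dom:
        switches += 1                  # one perceptual alternation
    prev_dom = dom

print(switches)
```

The dominant unit suppresses its rival until its own slowly growing adaptation releases it; the same mutual-inhibition-plus-adaptation motif, embedded in a spatially extended network, supports the traveling waves of dominance seen in rivalry.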