A minimal model for synaptic integration in simple neurons
Abstract
Synaptic integration is a prominent aspect of neuronal information processing. The detailed mechanisms that modulate synaptic inputs determine the computational properties of any given neuron. We study a simple model for the summation of excitatory inputs from synapses and illustrate its use by characterizing some functional properties of postsynaptic neurons. In this regard, we study the response of postsynaptic neurons as defined by the model to two well known noise driven processes: stochastic and coherence resonance. The model requires a small number of parameters and is especially useful to isolate the role of integration mechanisms that rely on summation of inputs with little dendritic processing.
Keywords: Neurons, synaptic integration, stochastic resonance, coherence resonance

1 Introduction
Information processing in neurons is a complex phenomenon that is affected by, among other things, the kind of stimulus encoded, the type of nerve cells involved, ion channels and dendritic morphology. Individual neurons are typically specialized enough to be treated as computational units, capable of performing a variety of tasks [1]. The most prominent of these is the integration of information from presynaptic inputs. Each synapse, when stimulated by the appropriate neurotransmitter, gives rise to excitatory postsynaptic potentials (EPSPs) which propagate down the dendrites and are “summed” in the cell body of the postsynaptic neuron [2, 3, 4]. Subsequently, an action potential is generated when the membrane potential crosses a threshold. For many important classes of neurons, additional processes modulate synaptic integration. These include nonlinearities in dendritic responses to postsynaptic currents [5], variability in dendritic structure, inputs from inhibitory synapses and the active properties of dendrites [6]. The rules of summation usually involve nonlinearities themselves, in that the net activation in the cell body of the neuron may be slightly more or less than the arithmetic sum of the processed dendritic inputs [7, 8]. Understanding how these factors affect synaptic integration requires detailed biophysical simulations incorporating several spatial and temporal scales. Nevertheless, synaptic integration in its bare minimum role as a summation of dendritic impulses can still demonstrate the functional advantages of postsynaptic neurons.

In this article, we pursue such an approach and describe a minimal model of synaptic integration. The model is minimal in the sense that synaptic currents apply simple perturbations to the phase space of the postsynaptic neuron, which simplifies the analysis of the resulting dynamics. Moreover, a small number of biophysically relevant parameters are used. We make use of the well known FitzHugh-Nagumo (FN) equations to simulate single neurons. The FN model is prototypical of many excitable systems and is able to capture different modes of neuronal spiking behaviour. In his seminal work, FitzHugh [9] showed that the phase space of the FN model could be partitioned into distinct regions, each of which corresponded to a certain physiological “state” of the neuron such as active, refractory, etc. He also showed that the FN equations could be reduced from the famous Hodgkin-Huxley model [10], which is a realistic description of neuron electrophysiology. Thus, the FN model trades detail for simplicity, yet retains the essential features of neuronal excitability.
As a first approximation to modelling excitatory postsynaptic currents, we make use of rectangular pulses. These are obtained from presynaptic membrane potentials by simple application of a Heaviside function. The Heaviside function is suitably parameterized to control the height and width of the pulse current. Each synapse thus contributes a series of rectangular pulses to the total current stimulating the postsynaptic neuron. In the next section, we briefly introduce the dynamics of the FN equations and investigate the excitation threshold for different numbers of synapses. Next, we formulate the model and explore its validity and limitations. Then, we illustrate a possible application by exploring two noise driven processes: stochastic and coherence resonance. Finally, we highlight the scope for extending the model to incorporate other aspects of synaptic integration without compromising the simplicity of implementation. Biologically relevant observations are highlighted throughout. In starting with a minimum set of parameters and a simple model for synaptic inputs, we take a bottom-up approach that allows us to isolate the functional properties of synaptic integration made possible by summation of excitatory inputs alone (as opposed to other mechanisms like dendritic processing for instance).
2 Excitation threshold in the FN model
The FN equations consist of two state variables whose dynamics operate on significantly different timescales. The fast “voltage” variable, $v$, mimics a neuron’s membrane potential and is complemented by a slow “recovery” variable, $w$. The dynamical equations are given by:
$$\frac{dv}{dt} = v - \frac{v^{3}}{3} - w + I(t) \qquad (1)$$

$$\frac{dw}{dt} = \epsilon \left( v + a - b w \right) \qquad (2)$$
where the parameters $a$ and $b$ have values $0.7$ and $0.8$ respectively, and $\epsilon$ represents the ratio of the two timescales and is typically set to $0.08$. A time dependent current injection into the neuron is modelled by the function $I(t)$. In contrast to the all-or-none nature of neuronal firing patterns, the FN model does not exhibit a well defined threshold for excitability [11, 12]. That is to say, action potentials with a range of amplitudes are possible depending on the choice of the current term $I(t)$. However, there is a sense in which one can approximate a firing threshold in the FN model, and this requires the analysis of a family of trajectories known as canards [13, 14, 15, 16, 17]. We assume that the initial condition lies on the intersection of the nullclines and that the current term is null at $t = 0$ (Fig. 1(a)). Physiologically, this corresponds to a neuron at its resting membrane potential. If, at some arbitrary time, a current is applied in the form of a rectangular step stimulus (of infinite duration), the cubic nullcline shifts upwards and the formerly stable initial condition is now below the new fixed point. Depending on how much the nullcline shifts upward, the resulting trajectory can be either a small-amplitude excitation (Fig. 1(b)), an intermediate amplitude excitation (Fig. 1(c)), or a large amplitude excitation that resembles an action potential (Fig. 1(d)).
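As an illustration, the dynamics described above can be reproduced with a few lines of code. The following minimal sketch integrates Eqs. (1)-(2) under a step current with a fourth order Runge-Kutta scheme; the canonical values $a = 0.7$, $b = 0.8$, $\epsilon = 0.08$ and all function names are assumptions made for this illustration, not necessarily the exact setup behind our figures.

```python
# Minimal sketch: response of the FN model (Eqs. 1-2) to a step current.
import numpy as np

a, b, eps = 0.7, 0.8, 0.08  # assumed canonical FitzHugh parameter values

def fn_rhs(v, w, I):
    """Right-hand side of the FitzHugh-Nagumo equations (1)-(2)."""
    return v - v**3 / 3.0 - w + I, eps * (v + a - b * w)

def resting_state():
    """Intersection of the nullclines for I = 0 (the stable fixed point).
    Equating w = v - v^3/3 and w = (v + a)/b gives v^3 + 3(1-b)/b v + 3a/b = 0."""
    roots = np.roots([1.0, 0.0, 3.0 * (1.0 - b) / b, 3.0 * a / b])
    v0 = float(roots[np.isreal(roots)].real[0])
    return v0, (v0 + a) / b

def step_response(I_step, t_max=200.0, dt=0.01):
    """RK4 integration from rest under a step current of infinite duration."""
    v, w = resting_state()
    vs = np.empty(int(t_max / dt))
    for i in range(vs.size):
        k1 = fn_rhs(v, w, I_step)
        k2 = fn_rhs(v + 0.5 * dt * k1[0], w + 0.5 * dt * k1[1], I_step)
        k3 = fn_rhs(v + 0.5 * dt * k2[0], w + 0.5 * dt * k2[1], I_step)
        k4 = fn_rhs(v + dt * k3[0], w + dt * k3[1], I_step)
        v += dt / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        w += dt / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        vs[i] = v
    return vs
```

Sweeping the value of I_step in such a sketch traces out the small, intermediate and large amplitude responses of Fig. 1.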
The height of the instantaneous step current therefore determines the amplitude of the resulting excitation. Moreover, the transition from small amplitude excitations to large amplitude excitations occurs in an exponentially small (in $\epsilon$) region of the parameter range. Within this range, trajectories known as canards track the middle branch of the cubic nullcline for a relatively long time (Fig. 1(c)). The middle branch of the cubic nullcline is called the repelling slow manifold, since nearby trajectories diverge sharply from it. The right and left branches, in contrast, are the attracting slow manifolds, since nearby trajectories converge onto them. Informally, canards are trajectories that follow both the attracting and repelling branches of the slow manifold for a considerable amount of time [15]. Fig. 1(c) shows a maximal canard, i.e. a trajectory that tracks the unstable branch of the cubic nullcline for the longest time. This trajectory passes through the local maximum of the cubic nullcline. The maximal canard is also called the quasithreshold separatrix, since neighbouring trajectories diverge sharply to its left or right [9]. The quasithreshold separatrix can loosely be thought of as the neuron’s threshold if we treat all trajectories to its left as subthreshold responses and trajectories to its right as action potentials.
If the current $I$ is made sufficiently large so that the fixed point lies on the middle branch of the cubic nullcline, a stable limit cycle results (via a supercritical Hopf bifurcation). Here too, for an exponentially small range of values of $I$ beyond the Hopf bifurcation point, small amplitude limit cycles transition into large amplitude limit cycles in much the same way that small amplitude trajectories transitioned into large amplitude ones in Fig. 1. For the rest of this paper, we assume that the fixed point lies on the left branch of the cubic nullcline. In other words, we will only be concerned with the FN model in its excitable regime (as opposed to the limit cycle regime beyond the Hopf bifurcation point).
Suppose that an instantaneous step current of infinite duration (as applied in Fig. 1) is applied to the system. We can obtain a stimulus-response curve (Fig. 2) by plotting the maximum voltage difference from the resting value ($v_{max} - v_{rest}$) that a given step current is able to generate. We notice a sharp transition from small to large amplitude excitations just beyond a critical step height. This is sometimes referred to as a canard transition/explosion. The sharpness of the transition allows us to approximate neuronal firings in the FN model as all or none, since intermediate amplitude excitations are sandwiched into a small threshold manifold that is rarely ever encountered by trajectories.
Depending on the neurotransmitter and the number density of its receptor, an action potential in the presynaptic neuron will lead to a small current injected into the postsynaptic dendritic membrane. This in turn generates a small amplitude excitatory postsynaptic potential (EPSP). The amplitude of a single EPSP is in most cases insufficient to elicit postsynaptic firing, largely due to attenuation in passive dendrites; typical EPSPs from individual synapses amount to only a small fraction of the firing threshold [18]. Figure 2 shows that there is a monotonic relationship between current amplitude and EPSP height for subthreshold responses (before the canard transition). Thus, if a sequence of rectangular pulses of identical heights overlap in step, at some point the current will exceed the threshold manifold. We scale the height of each current pulse such that a minimum number, $N$, of current pulses can generate an action potential. A threshold value for firing, denoted $I_{th}$, is chosen (to the right of the canard transition in Fig. 2). The height of each current pulse is therefore $I_{th}/N$. We denote the width of the pulse by $\delta$. A higher value of $\delta$ indicates a longer effective time available for a sequence of pulses to overlap and cross the firing threshold. As in real neurons, the current pulses must overlap quickly enough to prevent the subthreshold membrane potential from falling. We can visualize this by letting $N$ pulses, each of height $I_{th}/N$, overlap with an identical time gap denoted by $\tau$. Such a “staircase current” is an idealization, but the width of each step ($\tau$) can be thought of as the average time between successive overlaps in the real case. Figure 3 shows the corresponding threshold curves for different settings of $N$. We notice that there is effectively a maximum time gap, beyond which the pulse current sequence fails to evoke a postsynaptic action potential. This maximum time gap is seen to decrease as the number of synaptic inputs ($N$) increases. This is because a large number of small height pulses apply tiny upward shifts or nudges to the cubic nullcline. Consequently, the trajectory responds by drifting towards the new fixed point (since the net current is still subthreshold). For higher $N$, the nullcline undergoes many small magnitude upward nudges. The smaller the nudge, the better the trajectory is able to track the shifted nullcline. This means that the component of the trajectory’s drift in the $w$-direction is higher for high $N$. As a result, by the time the final pulse arrives, the voltage of a neuron with larger $N$ would have climbed less than one with a lower $N$ (due to increased drift in the $w$-direction). This leads to the canard transition for higher $N$ occurring earlier than that for lower $N$.
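The staircase construction lends itself to a direct implementation. The sketch below evaluates the total current at time $t$ from $N$ pulses of height $I_{th}/N$ and width $\delta$ whose onsets are separated by the gap $\tau$; the function name is ours, and no specific parameter values are implied.

```python
# Sketch of the idealized "staircase" current: N rectangular pulses of
# height I_th/N and width delta, with onsets separated by a fixed gap tau.
def staircase_current(t, N, I_th, delta, tau):
    """Summed current at time t; pulse k switches on at k*tau for duration delta."""
    height = I_th / N
    return height * sum(1 for k in range(N) if k * tau <= t < k * tau + delta)
```

Feeding such a current into the integrator sketched above and sweeping $\tau$ locates the maximum admissible time gap for each $N$, which is how threshold curves like those in Fig. 3 can be traced.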
Low time gap requirements are especially relevant to neurons that operate as coincidence detectors [1, 2]. The temporal resolution of coincidence detection is largely determined by parameters intrinsic to the neuron, such as its membrane time constant. Figure 3 shows that even with these intrinsic parameters held constant (no change in $a$, $b$ and $\epsilon$), modulating synaptic currents towards lower pulse heights (higher $N$) can improve this temporal resolution.
Neurons are also stimulated by inhibitory, hyperpolarizing currents called anodal currents. In some cases, it is possible for an anodal current to cause an action potential; the mechanism by which this happens is referred to as anodal break excitation [19]. In the FN phase space, anodal break excitation is possible because the cubic nullcline shifts downwards when a hyperpolarizing current is applied (negative values of $I$). The trajectory responds by settling onto the new fixed point. When the anodal pulse terminates, the nullcline shifts upwards to its former position, which can leave the state point sufficiently below the original fixed point to evoke firing. Although our focus here is on the analysis of excitatory stimuli, threshold curves analogous to Fig. 3 can be obtained for anodal currents. This is shown in Fig. 4.
The observations are qualitatively similar to the excitatory case. There are certain features of the threshold curves in Fig. 3 that we have not explained yet. For instance, we notice that in the suprathreshold regime (low values of $\tau$), the threshold curve goes through an inverted S-shape/inflection point (just before the canard transition). This can be explained as follows. The maximum voltage ($v_{max}$) attained by any trajectory corresponds to the point where the trajectory crosses the cubic nullcline from the left (e.g. Fig. 1(d)), since to the right of this point the flow along the $v$-axis reverses direction. For low enough values of $\tau$, the trajectory hits the nullcline corresponding to the sum of all pulses. However, for values of $\tau$ near the canard transition, the first pulse ends before the trajectory manages to hit the final nullcline. Now, $v_{max}$ is determined by values that range from the trajectory’s intersection with the penultimate nullcline to the trajectory’s intersection with the final nullcline. This is what causes the small inflection point. In the subthreshold regime (beyond the canard transition), we notice points where the monotonic decrease of $v_{max}$ is non-smooth. This can be explained similarly: $v_{max}$ is determined by the penultimate and lower nullclines. In other words, the trajectory reverses direction after crossing the nullcline of an earlier pulse and traverses far enough that the remaining pulses fail to get it past its previously attained maximum voltage.
Synaptic integration of inputs exhibits two modes known as spatial and temporal summation [2, 3, 4]. Spatial summation involves the simultaneous overlap of two or more dendritic inputs and a subsequent firing. In temporal summation, high frequency stimuli from the same dendritic input can summate to evoke firing. This requires that the time lag between stimuli be much smaller than the characteristic decay time of the postsynaptic potential. We remark here that the model we use can only exhibit spatial summation. This stems directly from the dynamics of the FN system, which belongs to a class of neuronal models termed resonators [15, 20]. Real neurons can temporally integrate subthreshold excitatory currents if these currents arrive with sufficiently low time gaps. In our model though, if two or more subthreshold pulses follow each other with zero time lag, that simply corresponds to a single subthreshold pulse of finite duration, which cannot cause firing anyway.
3 Noisy FN model for synaptic input
We shall treat voltages obtained from FN trajectories as action potentials whenever a threshold value, $v_{th}$, is crossed from below. We choose a value of $v_{th}$ close to the action potential peak. Because we include noise in the presynaptic FN equations, it is possible that during the course of a single action potential, $v_{th}$ may be crossed several times by virtue of stochastic fluctuations. To avoid such overcounting, we specify that a low voltage ($v_{low}$) be crossed from below in between two crossings of $v_{th}$.
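This two-threshold rule can be sketched as follows; the numerical values of $v_{th}$ and $v_{low}$ below are placeholders rather than the ones used for our figures.

```python
# Sketch of the two-threshold spike-detection rule: a spike is counted when
# v crosses v_th from below, and a new spike is accepted only after v has
# first dipped below (and re-crossed) v_low.
import numpy as np

def detect_spikes(v, dt, v_th=1.0, v_low=-0.5):
    """Return spike times from a voltage trace, using hysteresis to avoid
    double-counting noise-induced crossings within one action potential."""
    spike_times = []
    armed = True  # becomes True again only once v has been below v_low
    for i in range(1, len(v)):
        if armed and v[i - 1] < v_th <= v[i]:
            spike_times.append(i * dt)
            armed = False
        elif not armed and v[i] < v_low:
            armed = True
    return np.array(spike_times)
```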
Equation (1) is modified to include an additive noise term as follows:
$$\frac{dv}{dt} = v - \frac{v^{3}}{3} - w + I(t) + \sigma \xi(t) \qquad (3)$$
where $\xi(t)$ is white noise corresponding to a normalized Wiener process with autocorrelation $\langle \xi(t)\xi(t') \rangle = \delta(t - t')$, and $\sigma$ (which we shall call the noise intensity) is the standard deviation of the Wiener increments. Equation (2) remains unchanged. How the presynaptic neurons themselves are stimulated depends on the type of phenomenon studied. For instance, we can let the presynaptic neurons be driven by a sinusoidal current stimulus as follows:
$$I(t) = A \sin\left( \frac{2\pi t}{T} \right) \qquad (4)$$
Denote the firing times of the neuron by the set $F = \{ t_k \}$. The sequence of current pulses that the synapse generates is therefore
$$P(t) = \sum_{t_k \in F} \left[ \Theta(t - t_k) - \Theta(t - t_k - \delta) \right] \qquad (5)$$
where $\delta$ is the pulse width and $\Theta$ denotes the Heaviside function. The net stimulus from all synapses is then
$$I(t) = \frac{I_{th}}{N} \sum_{i=1}^{S} P_i(t) \qquad (6)$$
where $I_{th}$ is the firing threshold described in section 2, $S$ is the total number of synapses and $N$ is the minimum number of presynaptic firings required to elicit a postsynaptic action potential. In other words, if $N$ pulses overlap in close succession, then the total postsynaptic current equals $I_{th}$, which evokes firing. Postsynaptic neurons are kept noiseless for the simulations in this paper (i.e., $\sigma = 0$ for them). We choose this because, as stated in the introduction, we are concerned with synaptic integration in its minimum role as a summation of excitatory stimuli. This does, however, require us to state assumptions on the source of noise that justify our choice of keeping postsynaptic neurons noiseless. For instance, we may choose to study synaptic integration of inputs from sensory neurons whose stimuli are affected by environmental noise. This would then justify keeping the corresponding postsynaptic neuron noiseless, as it has access only to its presynaptic neurons and not the external environment.
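Equations (5)-(6) translate directly into code. The sketch below assumes that presynaptic firing times have already been extracted (e.g. with the detector sketched above); the function names are ours.

```python
# Sketch of Eqs. (5)-(6): rectangular pulse trains built from presynaptic
# firing times, summed over synapses and scaled by I_th/N.
def pulse_train(t, firing_times, delta):
    """P(t): one rectangular pulse of width delta per presynaptic firing."""
    return sum(1.0 for tk in firing_times if tk <= t < tk + delta)

def postsynaptic_current(t, firing_times_per_synapse, I_th, N, delta):
    """Eq. (6): I(t) = (I_th/N) * sum over the S synapses of their pulse trains."""
    return (I_th / N) * sum(pulse_train(t, ft, delta)
                            for ft in firing_times_per_synapse)
```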
In a sequence of input pulses overlapping to generate an action potential, there is always a time lag between one pulse and the next. These time lags cause a net drift of the trajectory in the region to the left of the quasithreshold separatrix of the nullcline set by the final pulse. The final pulse pushes the trajectory to the right of the separatrix and causes firing. While setting $I_{th}$, we must avoid values close to the canard transition in Fig. 2. It is possible that for such values of $I_{th}$, even if $N$ pulses overlap in very close succession, the net drift can still bring the trajectory to the left of the quasithreshold separatrix of the final nullcline and prevent firing.
The stochastic differential equations corresponding to Equations (3) and (2) respectively are:
$$dv_t = \left( v_t - \frac{v_t^{3}}{3} - w_t + I(t) \right) dt + \sigma \, dW_t \qquad (7)$$

$$dw_t = \epsilon \left( v_t + a - b w_t \right) dt \qquad (8)$$
The numerical integration of Equation (7) requires a choice of interpretation for the stochastic integral used to integrate the noise term (the Itô and Stratonovich interpretations). For additive white noise, however, both interpretations evaluate to the same stochastic integral, so this does not affect the choice of algorithm. We choose the simple first order Euler-Maruyama scheme [21, 22], which gives:
$$v_{n+1} = v_n + \left( v_n - \frac{v_n^{3}}{3} - w_n + I(t_n) \right) \Delta t + \sigma \sqrt{\Delta t} \, \eta_n \qquad (9)$$

$$w_{n+1} = w_n + \epsilon \left( v_n + a - b w_n \right) \Delta t \qquad (10)$$
where $\eta_n$ is sampled from a Gaussian distribution with zero mean and unit variance, and $\Delta t$ is the integration step size. Because the Euler-Maruyama method is only of first order (and the FN system is stiff owing to its separation of timescales), interspike interval histograms of neuronal firings were compared with those obtained using a second order stochastic Runge-Kutta method [23] for typical noise intensities used in this paper. The same was repeated for a step size an order of magnitude lower. No significant differences were found. Interspike interval histograms (normalized) are a good approximation to the probability distribution functions of waiting times between firings. For obtaining postsynaptic trajectories, we use a standard deterministic fourth order Runge-Kutta method (since postsynaptic neurons are kept noiseless). Sample trajectories are shown in Fig. 5 for a postsynaptic neuron stimulated by $S$ synapses.
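For reference, the update rule (9)-(10) can be implemented as sketched below; the parameter defaults repeat the assumed canonical values, and the seed handling is an implementation choice.

```python
# Sketch of the Euler-Maruyama scheme (9)-(10) for the noisy FN neuron (3).
import numpy as np

def euler_maruyama(I, sigma, t_max, dt, v0, w0, a=0.7, b=0.8, eps=0.08, seed=0):
    """Integrate the stochastic FN equations; I is a function of time and
    sigma is the noise intensity."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    v, w = np.empty(n), np.empty(n)
    v[0], w[0] = v0, w0
    sqdt = np.sqrt(dt)
    for i in range(n - 1):
        eta = rng.standard_normal()  # zero-mean, unit-variance Gaussian
        v[i + 1] = (v[i] + (v[i] - v[i]**3 / 3.0 - w[i] + I(i * dt)) * dt
                    + sigma * sqdt * eta)
        w[i + 1] = w[i] + eps * (v[i] + a - b * w[i]) * dt
    return v, w
```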
There are two caveats with defining the postsynaptic input function as in Equation (6). Suppose we choose a relatively high value for $I_{th}$ together with a small $N$. Now if a single pulse alone stimulates the postsynaptic neuron, its height $I_{th}/N$ can itself be suprathreshold (see Fig. 2). Thus, a single pulse will cause firing when it is not supposed to. The second problem arises when stimulating the neuron with a high number of inputs, $S \gg N$. If all $S$ inputs undergo spatial summation, the total current $S I_{th}/N$ can lie well beyond the Hopf bifurcation point (leading to limit cycles and, further on, to an unstable node). Thus, large values of $S$ and small values of $N$ can lead to bifurcations in the dynamics. This is not necessarily undesirable from a modelling perspective. For example, a Hopf bifurcation caused by large $S$ and small $N$ might help model interesting features such as bursting dynamics [18]. We will however not deal with such a scenario here and assume that the neuron is always in its excitable regime (where the fixed point lies on the left branch of the cubic nullcline). It is possible to artificially correct for high values of $I(t)$ by composing it with another Heaviside function. However, we avoid this and simply point towards the validity of the model by outlining the constraints on the parameters $N$, $S$ and $I_{th}$: First, we would like to prevent any number of presynaptic inputs less than $N$ from exciting the postsynaptic neuron. The worst case corresponding to this is when $N - 1$ inputs fire (nearly) simultaneously. Then, the total current is $(N-1) I_{th}/N$, which we would like to limit to some subthreshold value $I_{sub}$ (chosen appropriately from Fig. 2). Thus, we have
$$\frac{N-1}{N} \, I_{th} \leq I_{sub} \qquad (11)$$
If bifurcations are to be strictly avoided, then we limit the maximum possible current, $S I_{th}/N$, to the value $I_{H}$ at which the Hopf bifurcation occurs:
$$\frac{S}{N} \, I_{th} \leq I_{H} \qquad (12)$$
This constraint can be relaxed if such high values for postsynaptic currents are rare, or if limit cycle spikes are desired. We have said that $N$ is a positive integer. There is no particular reason why $N$ should be an integer, except that doing so provides a natural interpretation of the current term for postsynaptic neurons. The value of $N$, along with $I_{th}$ and the summation term in Eq. (6), determines how much the cubic nullcline shifts upward, and so a non-integer value for $N$ should also work. However, we will use integer values for $N$ throughout. In the biological context, constraints (11)-(12) might correspond to simple neurons such as cerebellar granule cells, which on average receive only 4 excitatory inputs from short dendrites [24].
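The constraints (11)-(12) amount to a simple admissibility check on a choice of $(N, S, I_{th})$. In the sketch below, $I_{sub}$ and $I_{H}$ are inputs that must be read off Fig. 2 and the bifurcation point, rather than known constants.

```python
# Sketch: admissibility of (N, S, I_th) under constraints (11) and (12).
def parameters_valid(N, S, I_th, I_sub, I_H):
    """(11): N-1 coincident pulses must stay subthreshold.
       (12): all S pulses together must stay below the Hopf current I_H."""
    return ((N - 1) / N * I_th <= I_sub) and (S / N * I_th <= I_H)
```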
In the following section, we explore the effects of excitatory stimuli only. However, by the simple addition of a term to Equation (5), we can incorporate the effects of inhibitory synapses. For a neuron with $S_E$ excitatory inputs and $S_I$ inhibitory inputs, the postsynaptic current will be as follows:
$$I(t) = \frac{I_{th}^{E}}{N_E} \sum_{i=1}^{S_E} P_i^{E}(t) \; - \; \frac{I_{th}^{I}}{N_I} \sum_{j=1}^{S_I} P_j^{I}(t) \qquad (13)$$
where $P_i^{E}(t)$ and $P_j^{I}(t)$ are the sequences of Heaviside pulses from excitatory and inhibitory synapses respectively, and $N_E$ and $N_I$ are the minimum numbers of presynaptic firings required to elicit a postsynaptic action potential when excitatory or inhibitory inputs are considered alone. The parameters $I_{th}^{E}$, $I_{th}^{I}$ are the corresponding firing thresholds. How to set the inhibitory firing threshold, $I_{th}^{I}$, depends on the kind of phenomena we would like to investigate. For example, if we want to explore firings generated by inhibitory synapses via anodal break excitation, then $I_{th}^{I}$ can be chosen appropriately from Fig. 4. On the other hand, if we are concerned only with the role of inhibiting/hyperpolarizing currents, then $I_{th}^{I}/N_I$ can be replaced with a single parameter that describes the height of each inhibiting pulse.
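Equation (13) extends the excitatory sketch given earlier; in the following, pulse_train is the helper defined there, and the superscripted thresholds become separate arguments.

```python
# Sketch of Eq. (13): net current from S_E excitatory and S_I inhibitory
# synapses (pulse_train as defined in the earlier sketch).
def net_current(t, exc_firings, inh_firings, I_E, N_E, I_I, N_I, delta):
    """Excitatory pulses add current; inhibitory pulses subtract it."""
    exc = sum(pulse_train(t, ft, delta) for ft in exc_firings)
    inh = sum(pulse_train(t, ft, delta) for ft in inh_firings)
    return (I_E / N_E) * exc - (I_I / N_I) * inh
```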
To illustrate some use cases of the model, we briefly investigate two well known noise driven phenomena: stochastic and coherence resonance. We compare presynaptic and postsynaptic responses in these effects and identify postsynaptic features that may be biologically significant.
4 Postsynaptic responses to noisy presynaptic neurons
4.1 Stochastic resonance
Noise is a ubiquitous presence in neurons and can play a significant role in certain sensory modalities. The functional role of a variety of stochastic fluctuations in interspike variability has been documented [25, 26, 27]. The FN model has been shown to account for several noise driven phenomena such as stochastic resonance [28, 29, 30], coherence resonance [31, 32], resonant activation and noise enhanced stability [33]. Multiple stochastic FN neurons can be coupled to produce noise-induced synchronization [34] and phase locking [35].
Experimental recordings of sensory neurons are often visualized in terms of interspike interval histograms (ISIH). Typical ISIH recordings have a multimodal shape with peaks at integer multiples of the external stimulus’ time period. This indicates that firing occurs during a preferred phase of the external stimulus and that sometimes, a random number of cycles are skipped between successive firings [28]. This pattern of firing is especially interesting due to its relevance to the well known phenomenon of stochastic resonance. A classic result in the theory of stochastic processes is that noise can increase the synchronization between the response of some nonlinear systems and an external periodic signal. When this happens, the system is said to undergo stochastic resonance (SR) [36]. A simple example of the SR effect is an overdamped Brownian particle in a double well potential [37]. Modulating the potential with a weak periodic signal alternately raises and lowers the two wells, although the signal by itself is insufficient to cause inter-well transitions. Adding white noise to the external forcing however enables the particle to switch wells. Intuitively, this is understood as the result of a time-scale matching condition [36]: the average waiting/residence time of the particle between transitions is comparable with half the period of the driving signal. Consequently, for a given value of the driving signal’s frequency, there exists a corresponding noise intensity that achieves the highest tuning between noise induced transitions and the driving signal.
SR has been documented extensively in experimental systems, notably in sensory neurons [38, 39]. Moreover, residence-time distributions of periodically driven bistable systems with noise resemble the ISIH obtained from experimental recordings. This is consistent with the fact that neuron models like FN can be approximated as bistable systems. For the FN model in particular, the voltage equation (Eq. 3) can be cast into a form equivalent to Langevin dynamics in a one-dimensional double well potential [29]. This approximation works essentially due to the large separation of timescales set by $\epsilon$, so that the slow recovery variable $w$ is effectively constant on the fast (nearly straight line) branches of the excursion along $v$. A noise induced “escape” from the fixed point is analogous to a switching between wells. The trajectory returns to the fixed point from the other well via a different degree of freedom (i.e., $w$) [29].
The SR effect can make detection of a weak external stimulus possible, but for the stimulus to be reliably encoded and transmitted further requires some sort of amplification mechanism.
Clearly, postsynaptic neurons play a significant role in this. In Fig. 6, we compare the ISIH obtained from a presynaptic neuron and a postsynaptic neuron with $S$ synapses. The signal chosen is subthreshold, i.e. unable to cause firing in the absence of noise. In both cases, the dominant peak occurs at the stimulus time period, $T$. The picture is qualitatively the same if we choose to plot the probability density by normalizing the ISIH (the peaks are simply rescaled). The contribution of the dominant peak has implications for the reliability of information transmission. For instance, the area under the peak may correspond to how much information about the stimulus’ period is conveyed [28]. If this interpretation is followed, then Fig. 6 shows that it is possible for subthreshold signals to be conveyed with increased reliability. Notwithstanding the encoding mechanism, we remark here that an important application of the model would be to quantify postsynaptic firing statistics using information-theoretic measures [40]. This would make quantitative the reliability with which postsynaptic spikes encode weak stimuli rendered detectable by noise.
There are different measures used to quantify stochastic resonance. Typically, the signal to noise ratio (SNR) is computed from the power spectrum of the system’s response and plotted as a function of noise intensity [36]. A peak in the SNR plot represents the point of maximum resonance with the signal. For neuronal firings, the power spectrum of the point process corresponding to the firings is typically used. Other measures include the power spectral density of the response at the signal frequency, or even the magnitude of the corresponding Fourier coefficient. We plot instead the peak of the ISIH corresponding to the stimulus period, $T$, as in [28]. ISIH plots are an intuitive way to visualize how the firing statistics are distributed across intervals relevant to both the external stimulus and the dynamics intrinsic to the neuron. Although ISIH peaks may be less useful for quantitative arguments than measures like the SNR, they are a good starting point and easily interpretable.
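The ISIH peak measure can be computed as sketched below; the bin width is an arbitrary assumption and the function name is ours.

```python
# Sketch: interspike-interval histogram and its peak at the stimulus period T.
import numpy as np

def isih_peak_at_T(spike_times, T, bin_width=0.05):
    """Return the ISIH count in the bin containing the stimulus period T."""
    isi = np.diff(np.sort(np.asarray(spike_times)))
    if isi.size == 0:
        return 0
    edges = np.arange(0.0, isi.max() + bin_width, bin_width)
    counts, edges = np.histogram(isi, bins=edges)
    idx = min(np.searchsorted(edges, T, side="right") - 1, counts.size - 1)
    return counts[idx]
```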
As indicated in section 3, we are required to indicate what kind of system corresponds to our results in order to justify keeping the postsynaptic neuron noiseless. A possible application would be sensory neurons which are stimulated by an external signal and then relay their firings onto a postsynaptic neuron. Noise is then assumed to originate externally in the environment. Figure 7 shows a comparison of the stochastic resonance effect for such a system. The curve for the postsynaptic case rests above its presynaptic counterpart. The significantly higher postsynaptic ISIH peaks also grow faster at low noise intensities. This suggests a dual functionality for postsynaptic neurons conveying sensory information (e.g. interneurons from sensory afferents): they amplify the information in the intervals corresponding to $T$, and they could be more sensitive to an increase in noise within the low noise range. The former ensures robustness against a presynaptic neuron failing to fire. The latter is an issue that requires further investigation.
Because the ISIH is multimodal (consisting of several peaks), it is possible to demonstrate stochastic resonance using the second or third peak as well (corresponding to $2T$ and $3T$ respectively). This becomes necessary if the first peak occurs at a noise intensity that is too high: in such a case, the condition that the neuron fires at a preferred phase is no longer valid and noise induced firings take place throughout the rising phase of the stimulus cycle [28]. For Fig. 7 however, the stochastic resonance peaks for both pre and postsynaptic neurons were found to occur at noise intensities at which the dominant contribution to the interspike intervals is still from the stimulus period, $T$. There is also a small secondary peak, at the period of the limit cycle, that corresponds to limit cycle spikes when the neurons are driven above their Hopf bifurcation point.
The postsynaptic parameters $N$, $S$ and $\delta$ affect the ISIH peak at the stimulus time period, $T$. Increasing $S$ or $\delta$ (all else held constant) increases the peak around $T$, because of increased stimulation or an increased effective time available for input pulses to overlap. Decreasing $N$ also increases the peak, because fewer stimuli are then required to cause firing.
4.2 Coherence resonance
If we let the FN system be driven purely by noise (so that $I(t) = 0$ in Eq. 3), the system can still tune to an optimal noise intensity at which the firings are most coherent. That is, at the optimal noise intensity, the characteristic correlation time of the sequence of firings is maximal (the rate of decay of correlations is slowest). This effect is called coherence resonance (CR) [31, 32]. CR has been analysed and predicted in the FN model using the standard Langevin approximation for the state variable to which noise is added [32]. Intuitively, the CR effect can be traced to the existence of two different characteristic times in the FN system: the activation time $t_a$ (the time required to elicit firing) and the excursion time $t_e$ (the duration of the action potential). Depending on the noise intensity, one of these two times dominates the average pulse duration (defined as the mean interspike interval). For small values of the noise intensity the activation time $t_a$ dominates, and for large values the excursion time $t_e$ dominates. Coherence resonance occurs when the noise intensity is large enough that excitations are frequent, while at the same time fluctuations in the excursion time are low (high coherence). To quantify CR, the coefficient of variation (CV) of the interspike intervals is typically used. That is, we first compute the sequence of firing times $\{ t_k \}$ of a long spike train and, from it, the interspike intervals $T_k = t_{k+1} - t_k$. The coefficient of variation is defined as:
$$\mathrm{CV} = \frac{\sqrt{\langle T_k^{2} \rangle - \langle T_k \rangle^{2}}}{\langle T_k \rangle} \qquad (14)$$
The CV is a measure of the normalized fluctuations of the interspike intervals. Coherence resonance manifests as a minimum in the graph of CV versus noise intensity. Figure 8 plots CV versus noise intensity for a presynaptic neuron and a postsynaptic neuron with $S$ synapses.
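Equation (14) in code, assuming firing times obtained with the detector sketched in section 3:

```python
# Sketch of Eq. (14): coefficient of variation of the interspike intervals,
# the quantity minimized at coherence resonance.
import numpy as np

def coefficient_of_variation(spike_times):
    """CV = sqrt(<T_k^2> - <T_k>^2) / <T_k> for interspike intervals T_k."""
    T_k = np.diff(np.sort(np.asarray(spike_times)))
    return np.sqrt(np.mean(T_k**2) - np.mean(T_k)**2) / np.mean(T_k)
```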
The presynaptic plot shows the expected shape [32], whereas the postsynaptic plot deviates significantly from it as the noise intensity increases. The explanation is as follows. For presynaptic neurons at high noise intensities, the CV is determined by fluctuations of the characteristic excursion time $t_e$, since $t_a$ is very low [32]. Since $t_e$ depends only weakly on the noise intensity in this regime, the CV increases roughly linearly with noise intensity for presynaptic neurons, as observed in Fig. 8. For postsynaptic neurons however, the excursion time is effectively constant, as they are kept noiseless (i.e., the action potential of a postsynaptic neuron is unperturbed by noise). The CV is therefore determined purely by fluctuations in the activation time of the postsynaptic neuron. These fluctuations are very low, since the presynaptic neurons themselves exhibit low fluctuations of $t_e$ and they stimulate the postsynaptic neuron independently. Care must be taken while using high noise intensities: in such cases, the dynamics is almost entirely dominated by noise and action potentials become indistinguishable. As a result, the spike detection procedure outlined at the beginning of section 3 can miss rapid spikes that fail to dip below $v_{low}$ in between crossings of $v_{th}$. For Fig. 8, this starts to affect the CV only beyond a certain noise intensity.
Theoretically, the use of high noise is unproblematic. However, we are required to state assumptions about where the noise might come from to ensure biological interpretability. For the coherence resonance scenario, we assume the noise to be external (making the whole setup analogous to the sensory system assumed in section 4.1). Other interpretations may be possible, but they require justification not only for keeping the postsynaptic neuron noiseless but also for the use of white noise for the presynaptic neurons (so far we have assumed the noise to be external, and therefore environmental white noise is a reasonable approximation).
The more coherent postsynaptic response in Fig. 8 at high noise could be relevant to neurophysiological mechanisms for amplifying order in spontaneously (randomly) firing networks.
5 Discussion and Conclusions
We have investigated a simple model for synaptic integration using rectangular pulses. The model is minimal in that the phase space of the FN system is perturbed by a finite sequence of instantaneous changes (via rectangular pulses) as opposed to continuous changes in parameter values. We have illustrated the use of the model in probing noise assisted/noise driven phenomena where the noise source is assumed to be external to the system and larger than intrinsic sources of noise such as channel noise and synaptic noise [26]. A natural extension would be to let the postsynaptic responses be affected by noise as well. This is particularly relevant to internal neurological systems such as cortical networks. Channel noise for the postsynaptic case can be simulated with a nonzero $\sigma$ in Eq. 3. Synaptic noise can be implemented by using an additional noise term to perturb the pulse width, $\delta$. This lets the effective time gap for spatial summation fluctuate, mimicking the stochastic opening and closing of ion channels.
The neurocomputational properties of a model depend on the types of bifurcations it undergoes [15, 20]. The FN model is capable of exhibiting interesting features such as tonic spiking, phasic spiking, class-1 excitability, rebound spikes, accommodation, etc. [41]. A further point of inquiry would be to see how postsynaptic responses vary for each of these properties as a function of the parameters $N$, $S$ and $\delta$. For neurons stimulated by suprathreshold signals, different phase locking patterns are possible (where $p$ spikes occur every $q$ stimulus cycles) [42]. It would be interesting to see how postsynaptic neurons respond to these deterministic phase locking patterns of presynaptic neurons.
For the stochastic resonance simulations, we have used signals that are in phase with each other, under the assumption that the presynaptic (sensory) neurons are spatially close enough to be stimulated by effectively identical signals. We also included phase differences, so that each presynaptic neuron is driven by $A \sin(2\pi t/T + \phi_i)$, where $\phi_i$ is randomly sampled and independently set for each presynaptic neuron. If the spread of the $\phi_i$ is not small, the ISIH peak of the postsynaptic response was found to decrease. Thus, anatomical features become relevant to the ability to exploit background noise if the neurons are sufficiently separated.
We have used uncorrelated white noise for our simulations. It has been shown, both experimentally and analytically, that coloured noise such as $1/f$-type noise can exhibit stochastic resonance and can moreover enhance it [43, 30]. Implementing this for the postsynaptic case is an important extension of the results shown here.
It is known that single spikes produce unreliable synaptic transmission, with low probabilities of transmission [18]. This suggests the functional importance of bursting patterns, which can enhance transmission. A neural code that involves presynaptic neurons firing coincident bursts is thus likely to be robust. Bursts can be simulated in the model by adding a constant term to $I(t)$, which places the fixed point of the FN system closer to the Hopf bifurcation point. For stimulus frequencies smaller than the frequency of the limit cycle, we can have several spikes per stimulus cycle, mimicking bursting dynamics. Note that $I_{th}$ will then have to be obtained from a threshold plot recomputed with the constant term included in $I(t)$. Probabilistic synaptic transmission is incorporated by introducing a random variable $u$: spikes are transmitted when the neurons cross the firing threshold and when $u < p$ (with $u$ sampled uniformly from $[0, 1]$), where $p$ is the probability of transmission.
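Probabilistic transmission reduces to a one-line filter on the presynaptic spike train, as in the following sketch (the names and seed handling are assumptions):

```python
# Sketch: unreliable synapses -- each presynaptic spike is transmitted only
# with probability p (u < p for u drawn uniformly from [0, 1]).
import numpy as np

def transmitted_spikes(firing_times, p, seed=0):
    """Keep each presynaptic firing independently with probability p."""
    firing_times = np.asarray(firing_times)
    rng = np.random.default_rng(seed)
    return firing_times[rng.uniform(size=firing_times.size) < p]
```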
References
- [1] C. Koch, I. Segev, The role of single neurons in information processing, Nat Neurosci 3 (2000) 1171–1177.
- [2] E. R. Kandel, J. H. Schwartz, T. M. Jessell, S. A. Siegelbaum, A. J. Hudspeth, Principles of Neural Science, McGraw-Hill, 2013.
- [3] B. Alberts, A. Johnson, J. Lewis, D. Morgan, M. Raff, K. Roberts, P. Walter, Molecular Biology of the Cell, Garland Science, 2014.
- [4] D. Purves, G. J. Augustine, D. Fitzpatrick, W. C. Hall, A. LaMantia, J. O. McNamara, S. M. Williams, Neuroscience, Sinauer Associates, 2004.
- [5] G. Stuart, N. Spruston, Dendritic integration: 60 years of progress, Nat Neurosci 18 (2015) 1713–1721.
- [6] N. Spruston, Pyramidal neurons: dendritic structure and synaptic integration, Nat Rev Neurosci 9 (3) (2008) 206–221.
- [7] P. Poirazi, T. Brannon, B. W. Mel, Arithmetic of subthreshold synaptic summation in a model CA1 pyramidal cell, Neuron 37 (6) (2003) 977–987.
- [8] J. Hao, X.-d. Wang, Y. Dan, M.-m. Poo, X.-h. Zhang, An arithmetic rule for spatial summation of excitatory and inhibitory inputs in pyramidal neurons, Proceedings of the National Academy of Sciences 106 (51) (2009) 21906–21911.
- [9] R. FitzHugh, Impulses and physiological states in theoretical models of nerve membrane, Biophysical Journal 1 (6) (1961) 445–466.
- [10] A. L. Hodgkin, A. F. Huxley, A quantitative description of membrane current and its application to conduction and excitation in nerve, The Journal of Physiology 117 (4) (1952) 500–544.
- [11] M. Desroches, M. Krupa, S. Rodrigues, Inflection, canards and excitability threshold in neuronal models, J. Math. Biol. 67 (2013) 989–1017.
- [12] J. Mitry, M. McCarthy, N. Kopell, M. Wechselberger, Excitable neurons, firing threshold manifolds and canards, J. Math. Neurosci. 3 (2013) 12.
- [13] E. Benoît, J. L. Callot, F. Diener, M. Diener, Chasse au canard, Collect. Math. 32 (1981) 37–119.
- [14] M. Diener, The canard unchained or how fast/slow dynamical systems bifurcate, The Mathematical Intelligencer 6 (1984) 38–49.
- [15] E. M. Izhikevich, Dynamical systems in neuroscience: the geometry of excitability and bursting, The MIT Press, 2007.
- [16] J. Durham, J. Moehlis, Feedback control of canards, Chaos: An Interdisciplinary Journal of Nonlinear Science 18 (1) (2008) 015110.
- [17] N. Popović, Mixed-mode dynamics and the canard phenomenon: Towards a classification, J. Phys.: Conf. Ser. 138 (2008) 012020.
- [18] J. E. Lisman, Bursts as a unit of neural information: making unreliable synapses reliable, Trends in Neurosciences 20 (1) (1997) 38–43.
- [19] R. FitzHugh, Anodal excitation in the Hodgkin-Huxley nerve model, Biophysical Journal 16 (3) (1976) 209–226.
- [20] E. M. Izhikevich, Neural excitability, spiking and bursting, International Journal of Bifurcation and Chaos 10 (6) (2000) 1171–1266.
- [21] P. E. Kloeden, E. Platen, Numerical Solution of Stochastic Differential Equations, Springer, New York, 1992.
- [22] H. C. Tuckwell, R. Rodriguez, Analytical and simulation results for stochastic FitzHugh-Nagumo neurons and neural networks, J Comput Neurosci 5 (1998) 91–113.
- [23] P. E. Kloeden, R. A. Pearson, The numerical solution of stochastic differential equations, The Journal of the Australian Mathematical Society. Series B. Applied Mathematics 20 (1) (1977) 8–12.
- [24] N. Spruston, Dendritic signal integration, in: L. R. Squire (Ed.), Encyclopedia of Neuroscience, Academic Press, Oxford, 2009, p. 445.
- [25] R. Stein, E. Gossen, K. Jones, Neuronal variability: noise or part of the signal?, Nat Rev Neurosci. 6 (2005) 389–397.
- [26] A. Faisal, L. Selen, D. Wolpert, Noise in the nervous system, Nat Rev Neurosci. 9 (2008) 292–303.
- [27] M. McDonnell, L. Ward, The benefits of noise in neural systems: bridging theory and experiment, Nat Rev Neurosci. 12 (2011) 415–425.
- [28] A. Longtin, Stochastic resonance in neuron models, J Stat Phys 70 (1993) 309–327.
- [29] J. J. Collins, C. C. Chow, T. T. Imhoff, Aperiodic stochastic resonance in excitable systems, Phys. Rev. E 52 (1995) R3321–R3324.
- [30] D. Nozaki, Y. Yamamoto, Enhancement of stochastic resonance in a FitzHugh-Nagumo neuronal model driven by colored noise, Physics Letters A 243 (5) (1998) 281–287.
- [31] B. Lindner, L. Schimansky-Geier, Analytical approach to the stochastic FitzHugh-Nagumo system and coherence resonance, Phys. Rev. E 60 (1999) 7270–7276.
- [32] A. S. Pikovsky, J. Kurths, Coherence resonance in a noise-driven excitable system, Phys. Rev. Lett. 78 (1997) 775–778.
- [33] D. Valenti, G. Augello, B. Spagnolo, Dynamics of a FitzHugh-Nagumo system subjected to autocorrelated noise, Eur. Phys. J. B. 65 (2008) 443–451.
- [34] J. A. Acebrón, A. R. Bulsara, W.-J. Rappel, Noisy FitzHugh-Nagumo model: From single elements to globally coupled networks, Phys. Rev. E 69 (2004) 026202.
- [35] E. N. Davison, Z. Aminzare, B. Dey, N. Ehrich Leonard, Mixed mode oscillations and phase locking in coupled fitzhugh-nagumo model neurons, Chaos: An Interdisciplinary Journal of Nonlinear Science 29 (3) (2019) 033105.
- [36] L. Gammaitoni, P. Hänggi, P. Jung, F. Marchesoni, Stochastic resonance, Rev. Mod. Phys. 70 (1998) 223–287.
- [37] B. McNamara, K. Wiesenfeld, Theory of stochastic resonance, Phys. Rev. A 39 (1989) 4854–4869.
- [38] J. Douglass, L. Wilkens, E. Pantazelou, F. Moss, Noise enhancement of information transfer in crayfish mechanoreceptors by stochastic resonance, Nature 365 (1993) 337–340.
- [39] D. Russell, L. Wilkens, F. Moss, Use of behavioural stochastic resonance by paddle fish for feeding, Nature 402 (1999) 292–294.
- [40] A. Borst, F. Theunissen, Information theory and neural coding, Nat Neurosci 2 (1999) 947–957.
- [41] E. Izhikevich, Which model to use for cortical spiking neurons?, IEEE Transactions on Neural Networks 15 (5) (2004) 1063–1070.
- [42] M. Guevara, L. Glass, Phase locking, period doubling bifurcations and chaos in a mathematical model of a periodically driven oscillator: A theory for the entrainment of biological oscillators and the generation of cardiac dysrhythmias, J. Math. Biology 14 (1982) 1–23.
- [43] D. Nozaki, D. J. Mar, P. Grigg, J. J. Collins, Effects of colored noise on stochastic resonance in sensory neurons, Phys. Rev. Lett. 82 (1999) 2402–2405.