Criticality and Chaos in Auditory and Vestibular Sensing
Abstract
The auditory and vestibular systems exhibit remarkable sensitivity of detection, responding to deflections on the order of Angstroms, even in the presence of biological noise. Further, these complex systems exhibit high temporal acuity and frequency selectivity, allowing us to make sense of the acoustic world around us. As this acoustic environment of interest spans several orders of magnitude in both amplitude and frequency, these systems rely heavily on nonlinearities and power-law scaling. The behavior of these sensory systems has been extensively studied in the context of dynamical systems theory, with many empirical phenomena described by critical dynamics. Other phenomena have been explained by systems in the chaotic regime, where weak perturbations drastically impact the future state of the system. We first review the conceptual framework behind these two types of detectors, as well as the detection features that they can capture. We then explore the intersection of the two types of systems and propose ideal parameter regimes for auditory and vestibular systems.
I Introduction and Overview
Auditory and vestibular systems perform tasks that are crucial for the survival of an animal, enabling it to navigate in space, detect signals from predators or prey, identify and attract potential mates, and communicate with members of the same species. To achieve these tasks, the sensory system must detect extremely weak signals, extract them from noisy environments, distinguish tones of closely spaced frequencies, and precisely parse temporal information. Specifically, near our threshold of hearing, we are able to detect displacements of the eardrum smaller than the diameter of a hydrogen atom [1]. This detection occurs in the presence of internal thermal fluctuations of equal or higher magnitude. Further, human temporal resolution reaches approximately 10 microseconds [2, 3], enabling sound localization through interaural time differences [4]. Finally, we are able to resolve tones that differ in frequency by only a fraction of a percent [5]. In parallel, the auditory system achieves an immense dynamic range in both the amplitude and frequency of acoustic detection. We are able to detect sounds that span over 12 orders of magnitude in intensity and 3 orders of magnitude in frequency. These broad ranges are reflected in our logarithmic decibel scale for sound intensity and the logarithmic spacing of musical intervals.
The remarkable features of the auditory system rely heavily on two factors: active amplification and nonlinear response. First, the system has been shown to violate the fluctuation dissipation theorem, indicating that it cannot be governed by equilibrium statistical mechanics [6]. Vast amounts of empirical evidence obtained in vivo demonstrate that the inner ear expends energy to amplify signals [7]. Second, compressive nonlinearities in the response to external stimuli enable the extensive dynamic range, while maintaining sensitivity to weak signals [8, 6]. These nonlinearities have been detected at all scales measured, from individual sensory cells to in vivo phenomena known as phantom tones [9, 10, 11]. How the auditory system utilizes energy expenditure and nonlinearities to achieve its remarkable detection characteristics, however, remains unknown after more than 7 decades of research [12, 13, 14].
The theoretical framework for auditory detection that was developed over the past 20 years is based on the notion of a dynamical system poised near criticality [15], on the verge of autonomous oscillation. The models apply the normal form equation for the supercritical Hopf bifurcation to describe the auditory system, and elegantly capture the mechanical sensitivity, frequency selectivity, and amplitude-compressive response, thus reproducing a broad range of experimental results [16, 17]. In the vicinity of the bifurcation, the system’s sensitivity to external signals increases, while the frequency selectivity sharpens, thus pointing to criticality as the optimal regime for signal detection [18].
However, while proximity to a bifurcation yields many advantages, the description contains an innate constraint: at criticality, the system becomes infinitely slow, with transient times diverging as a result of critical slowing down. This sluggish behavior poses an undesirable tradeoff between the sensitivity of a detector and its speed, and is inconsistent with the high temporal acuity exhibited by our auditory system. Furthermore, at the critical point, the system is very susceptible to stochastic fluctuations, which remove some of the advantages observed in deterministic models. As noise is a ubiquitous component of any biological system, and its effects specifically near criticality are not negligible, this further limits the benefits of tuning a system near a bifurcation point.
An alternate theoretical framework, developed to reconcile the requirement for high sensitivity of detection with the need for a rapid temporal response, is based on the notion of chaotic dynamics [19]. When a system is poised in a chaotic regime, even infinitesimally weak external perturbations trigger large changes in the subsequent trajectory, thus yielding high sensitivity. As a result of this exponential divergence, the system also responds and resets rapidly. Hence, a chaotic system avoids the inherent tradeoff observed with criticality. In a prior study, we demonstrated experimentally the presence of a chaotic attractor in the innate and driven oscillations exhibited by sensory cells in vitro [20]. Using mathematical methods from dynamical systems literature, we confirmed that the oscillator contains a deterministic component and is not completely dominated by biological noise and other stochastic processes. Further, we showed chaos to be beneficial in this system, as the instabilities from which chaos arises enhance the sensitivity and temporal resolution of the response [19].
We note that chaos is typically considered a harmful element in applied mathematics literature, as it limits control and predictability. However, it has been proposed to play a potentially helpful role in certain biological systems, as it enhances their dynamical complexity [21, 22]. As chaotic regimes can arise in the presence of three or more degrees of freedom, we predict that many more examples of living systems utilizing chaotic dynamics will be uncovered in the future.
In the present work, we aim to compare the relative advantages of chaos versus criticality in the theoretical description of auditory and vestibular detection. In particular, we explore the interface of these two theoretical models: a system poised near the supercritical Hopf bifurcation and one poised in the chaotic regime, as well as the continuum describing transitions between the regimes. To assess separately each of the key features exhibited by these sensory systems, in each section, we characterize the sensitivity, frequency selectivity, temporal acuity, and power-law amplitude response of a Hopf oscillator, near and far from criticality, in the presence and absence of chaos. As stochastic fluctuations play a non-negligible role in dynamical systems, we find the parameter regimes that optimize these detection characteristics in the presence of noise. We then combine these independent metrics to propose a simple conceptual framework for auditory and vestibular detection.


II Hopf bifurcation
The inner ear of vertebrates contains a number of end organs that specialize in either auditory or vestibular detection, the latter including both translational and rotational movement. While the pathways by which external signals reach the inner ear vary, they ultimately result in mechanical vibrations of internal structures; hence, the two sensory systems exhibit many features in common. Conversion of mechanical energy of a sound, vibration, or acceleration into electrical energy in the form of ionic currents is performed by specialized, sensory hair cells of both the auditory and vestibular systems. The hair cell gets its name from the rod-like, inter-connected stereocilia that protrude from the apical surface, which are collectively named the hair bundle. An incoming stimulus pivots the stereocilia and modulates the open probability of the transduction channels embedded in the tips of the stereocilia [23, 24]. In several species, these hair bundles have been shown to exhibit active limit-cycle oscillations in the absence of applied stimulus [25, 26, 27, 6]. Though the role of these spontaneous hair-bundle oscillations in vivo has yet to be established, they serve as an experimental probe for studying this active system, as they lead to sub-nanometer thresholds in vitro [28, 29].
The Hopf oscillator has been extensively used for modeling and understanding the phenomenon of active, spontaneous hair-bundle oscillations, as well as more global features of the auditory and vestibular systems [18]. When the system is poised on the verge of instability (near the Hopf bifurcation), it becomes extremely sensitive to small perturbations. In the absence of noise, the amplitude gain of the response diverges as the system approaches criticality. The model was shown to capture the active amplification and power-law amplitude response observed both in vivo and in vitro in a number of different species [5].
A Hopf oscillator is described by a time-dependent complex variable, $z(t)$, which is governed by a normal form equation, which in its simplest version takes the form

$$\frac{dz}{dt} = z(\mu + i\omega_0) - z|z|^2 + F(t) + \eta(t), \tag{1}$$

where $\mu$ and $\omega_0$ are the control parameter and characteristic frequency of the detector, respectively. $F(t)$ represents the external forcing on the system, while $\eta(t)$ is a stochastic variable, representing thermal noise. This variable is complex, with independent real and imaginary parts, both of which have the statistics of Gaussian white noise: $\langle \eta(t) \rangle = 0$, $\langle \eta(t)\eta(t') \rangle = 0$, and $\langle \eta(t)\eta^*(t') \rangle = 4D\,\delta(t - t')$, where $\eta^*$ is the complex conjugate of $\eta$ and $D$ defines the noise strength.
In the absence of forcing and noise ($F(t) = \eta(t) = 0$), the system can be more easily understood in polar coordinates, by letting $z = re^{i\theta}$, thereby separating the complex variable into two real variables. This results in the pair of equations

$$\frac{dr}{dt} = \mu r - r^3, \qquad \frac{d\theta}{dt} = \omega_0, \tag{2}$$

which describe the amplitude and phase dynamics of the system, respectively. Notice that the instantaneous frequency, $\dot{\theta} = \omega_0$, is constant, having no dependence on the oscillator's amplitude, $r$. This defines an isochronous oscillator.
The amplitude dynamics are determined by the control parameter, $\mu$. For $\mu < 0$, the system displays a stable fixed point, with increasing stability for more negative values of the control parameter. As the control parameter approaches the critical point at $\mu = 0$, the system loses stability, and spontaneous limit-cycle oscillations emerge. For $\mu > 0$, as this parameter increases, the spontaneous oscillations grow larger and more stable. Precisely at the critical point, the system becomes infinitely sensitive in the absence of noise, as an infinitesimal perturbation can cause large amplitude displacements. For both positive and negative values of the control parameter, the amplitude returns to its steady state exponentially, with a characteristic time scale proportional to $1/|\mu|$ (see the Appendix). In the vicinity of the bifurcation, the linear term becomes vanishingly small, and the system returns to steady state very slowly, with perturbations diminishing according to a power law (Fig. 1A). As the control parameter approaches the Hopf bifurcation, this time scale diverges without bound at $\mu = 0$, a dynamical systems phenomenon known as critical slowing down.
In the presence of noise, a system near the critical point becomes very susceptible to stochastic fluctuations. In Fig. 1B, we show how the amplitude of the limit-cycle oscillations depends on the control parameter, at different levels of additive noise. As can be seen, the infinite sensitivity present in the deterministic limit is removed by noise, and the bifurcation point is smeared. Noise also obscures or even removes the effects of critical slowing down. We see that the long transient associated with power-law decay at the critical point is shortened, as the steady-state amplitude is no longer zero (Fig. 1C). The level of noise required to obscure this effect, however, significantly smears out the bifurcation diagram, removing the critical point. Different levels of noise therefore result in different tradeoffs between the sensitivity and speed of the response.
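The deterministic limit-cycle amplitude and its noisy smearing are straightforward to explore numerically. The sketch below (illustrative only, not the authors' code; function name and parameter values are our own) integrates Eq. (1) with the Euler-Maruyama method, with the noise scaled so that $\langle \eta\eta^* \rangle = 4D\,\delta(t-t')$.

```python
import numpy as np

def simulate_hopf(mu, omega0=1.0, D=0.0, z0=0.1 + 0j, dt=1e-3, n_steps=100_000, seed=0):
    """Euler-Maruyama integration of the isochronous Hopf normal form,
    dz/dt = (mu + i*omega0) z - |z|^2 z + eta(t),
    with complex white noise of strength D."""
    rng = np.random.default_rng(seed)
    z = np.empty(n_steps, dtype=complex)
    z[0] = z0
    for n in range(1, n_steps):
        drift = (mu + 1j * omega0) * z[n - 1] - abs(z[n - 1]) ** 2 * z[n - 1]
        # each real component has variance 2*D*dt per step
        noise = np.sqrt(2 * D * dt) * (rng.standard_normal() + 1j * rng.standard_normal())
        z[n] = z[n - 1] + drift * dt + noise
    return z

# Deterministic check: for mu > 0 the amplitude settles onto the limit cycle r = sqrt(mu)
z = simulate_hopf(mu=0.25)
print(abs(z[-1]))  # ~0.5
```

Rerunning with `D > 0` at `mu = 0` illustrates the smeared bifurcation: the steady-state amplitude no longer vanishes at the critical point.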


III Nonisochronicity and Chaos
In several studies of auditory and vestibular systems, a more general version of the Hopf oscillator was considered [31, 19]. The equation takes the form

$$\frac{dz}{dt} = z(\mu + i\omega_0) - z|z|^2(\alpha + i\beta) + F(t) + \eta(t), \tag{3}$$

where $\alpha$ and $\beta$ are introduced to characterize the nonlinearity of the system, while all the other parameters carry the same meaning as in the isochronous case previously described. In the absence of stimulus and noise ($F(t) = \eta(t) = 0$), this equation can be written in polar coordinates as

$$\frac{dr}{dt} = \mu r - \alpha r^3, \qquad \frac{d\theta}{dt} = \omega_0 - \beta r^2. \tag{4}$$
For $\mu > 0$, a stable limit cycle exists at radius $r_0 = \sqrt{\mu/\alpha}$. The frequency on this limit cycle is $\Omega = \omega_0 - \beta\mu/\alpha$, where $\omega_0$ is the frequency at the Hopf bifurcation. This more general description reduces to the traditional, isochronous form when $\alpha = 1$ and $\beta = 0$. We restrict our analysis to systems with $\alpha = 1$ and vary $\beta$ to control the level of nonisochronicity. This new parameter causes the frequency of oscillation to depend on the amplitude of oscillation (Fig. 2C).
By coupling the radial and phase degrees of freedom, the presence of nonisochronicity ($\beta \neq 0$) introduces significant complexity to the dynamics of this system. It distorts the Lorentzian shape of the frequency response curve [32] and can even produce branching of this curve, discontinuities, and bistability for a range of frequencies, as well as hysteretic behavior in response to frequency sweeps [33]. The presence of nonisochronicity also makes the system susceptible to chaotic dynamics. Chaos in dynamical systems is characterized by extreme sensitivity to initial conditions and exponential divergence of neighboring trajectories. In the nonisochronous Hopf oscillator, chaos can arise from sinusoidal or impulsive external forcing [19, 34], as well as from purely stochastic white-noise drive. To distinguish exponentially diverging, chaotic trajectories from simple diffusion induced by noise, it is useful to apply identical realizations of noise (common noise) to systems with slightly different initial conditions (Fig. 2A). If the distance between neighboring trajectories diminishes with time or remains the same, the system is non-chaotic. However, if the common-noise stimulus causes neighboring trajectories to diverge exponentially, then the system exhibits noise-induced chaos [35, 36]. We note that the exponential divergence is observed only over short time scales; as a chaotic system is bounded in phase space, the distance between trajectories plateaus with time (Fig. 2D).
The rate of divergence of neighboring trajectories is expected to follow $d(t) \approx d_0 e^{\lambda t}$, where $d$ is the Euclidean distance between two nearby phase-space trajectories and $\lambda$ is the Lyapunov exponent. Chaotic systems are often defined by $\lambda > 0$. For quiescent, non-chaotic systems, the Lyapunov exponent is negative and characterizes the system's stability, or the rate at which the system returns to equilibrium following a perturbation. For a Hopf oscillator with $\mu < 0$ and no forcing or noise, the Lyapunov exponent can be found by expanding around the stable fixed point (see the Appendix). One finds that in this simple case, $\lambda = \mu$, which indicates that the system becomes more stable for more negative values of the control parameter.
In the absence of noise, as the control parameter crosses from the quiescent to the oscillatory regime, the Lyapunov exponent increases to zero and remains there even for $\mu > 0$ ($\lambda = 0$). Since perturbations in the phase neither diverge nor converge, trajectories are neutrally stable along the angular direction of the limit cycle. In the presence of noise, analytic approximations often become intractable, but numerical simulations yield values of the Lyapunov exponent that can be positive, negative, or zero. For a noisy nonisochronous Hopf oscillator, an analytic approximation can be made in the regime of sufficiently stable limit cycles. In this regime, the Lyapunov exponent shows a simple dependence on both the control and nonisochronicity parameters [19],
$$\lambda \approx \frac{D\left(\beta^2 - 1\right)}{2\mu}. \tag{5}$$
This approximation is particularly useful in weakly chaotic regimes, where numerical simulations are computationally expensive. We show the robustness of this approximation for sufficiently stable limit cycles, as well as that of the simpler result, $\lambda = \mu$, for quiescent systems. The analytic approximation, however, breaks down as the system approaches criticality from the oscillatory side, because the assumption of a sufficiently stable limit cycle is no longer valid (Fig. 3A). We provide a map of the Lyapunov exponent calculated numerically throughout the parameter space (Fig. 3C). These calculations serve to show the degree of chaos as the system crosses the Hopf bifurcation, a regime not explored analytically.
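A standard way to obtain such numerical values is the two-trajectory (Benettin) method; the sketch below is our own assumption about implementation details, validated only against the quiescent, noise-free limit $\lambda = \mu$.

```python
import numpy as np

def lyapunov_exponent(mu, beta, D, omega0=1.0, dt=1e-3, n_steps=200_000,
                      renorm_every=100, d0=1e-8, seed=2):
    """Largest Lyapunov exponent via the two-trajectory (Benettin) method:
    track a pair of trajectories driven by common noise, periodically rescale
    their separation back to d0, and average the logarithmic growth rates."""
    rng = np.random.default_rng(seed)
    drift = lambda z: ((mu + 1j * omega0) - (1 + 1j * beta) * abs(z) ** 2) * z
    z1 = 0.1 + 0j
    z2 = z1 + d0
    log_sum = 0.0
    for n in range(1, n_steps + 1):
        eta = np.sqrt(2 * D * dt) * (rng.standard_normal() + 1j * rng.standard_normal())
        z1 = z1 + drift(z1) * dt + eta
        z2 = z2 + drift(z2) * dt + eta
        if n % renorm_every == 0:
            d = abs(z2 - z1)
            log_sum += np.log(d / d0)
            z2 = z1 + (z2 - z1) * (d0 / d)   # rescale the separation back to d0
    return log_sum / (n_steps * dt)

# Noise-free quiescent benchmark: perturbations decay at rate mu, so lambda ~ mu
lam = lyapunov_exponent(mu=-0.5, beta=0.0, D=0.0)
print(round(lam, 1))  # ~ -0.5
```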

IV Mechanical Sensitivity and Information Transfer
Near the Hopf bifurcation, a noiseless system displays immense sensitivity, with the amplitude gain of the response diverging precisely at the bifurcation [18]. However, this compliance also makes the system susceptible to stochastic fluctuations. In the oscillatory regime, the detector becomes more resistant to noise, but also harder to entrain by external signals. Chaos can assist detectors in synchronizing to external signals; however, this effect can likewise increase susceptibility to external noise and off-resonance stimulus frequencies. We hence determine the sensitivity of the Hopf oscillator throughout the $(\mu, \beta)$-plane and identify the optimal regime for several types of stimulus.
We first consider a weak, single-tone, on-resonance stimulus in the presence of noise. We employ the linear response function, $\chi$, to characterize the sensitivity of the system (see Numerical Methods). For systems that exhibit autonomous oscillations, this measure exhibits a spurious non-zero value if the response does not synchronize to the signal. To avoid this issue, we ensure that only the phase-locked component is included in the calculation. We do this by averaging the responses over many systems, each prepared with a different initial phase, thereby averaging out any oscillatory component that does not synchronize to the stimulus. It has previously been shown that the isochronous Hopf oscillator detects this signal best when poised as far into the oscillatory regime as possible [37]. Our results are consistent with this finding, with even further improvement when the system is weakly chaotic (Fig. 4). Further, we show that detectors in this regime outperform those in the critical regime for all levels of noise considered.
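As a concrete illustration of this phase-averaged linear response, the sketch below (an assumption about implementation, not the study's code) drives the oscillator on resonance and projects the mean response onto the stimulus frequency. In the quiescent, noise-free limit with weak forcing, the result can be checked against the linear prediction $\chi = 1/|\mu|$.

```python
import numpy as np

def linear_response(mu, beta, F0, D=0.0, omega0=1.0, dt=1e-3, t_total=200.0,
                    n_ensemble=1, seed=3):
    """Phase-averaged linear response chi = |<z>_omega0| / F0: average the
    response over oscillators prepared with random initial phases, then
    project the steady-state mean onto the stimulus frequency."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_total / dt)
    t = np.arange(n_steps) * dt
    forcing = F0 * np.exp(1j * omega0 * t)
    mean_z = np.zeros(n_steps, dtype=complex)
    for _ in range(n_ensemble):
        z = 0.01 * np.exp(2j * np.pi * rng.random())  # random initial phase
        for n in range(n_steps):
            mean_z[n] += z / n_ensemble
            drift = ((mu + 1j * omega0) - (1 + 1j * beta) * abs(z) ** 2) * z
            eta = np.sqrt(2 * D * dt) * (rng.standard_normal() + 1j * rng.standard_normal())
            z = z + (drift + forcing[n]) * dt + eta
    tail = slice(n_steps // 2, None)   # discard the transient half
    locked = np.mean(mean_z[tail] * np.exp(-1j * omega0 * t[tail]))
    return abs(locked) / F0

# Quiescent, noise-free benchmark: chi approaches 1/|mu| for weak forcing
chi = linear_response(mu=-0.5, beta=0.0, F0=1e-3)
print(round(chi, 1))  # ~2.0
```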
Next, we consider stochastic modulations in the amplitude (AM) and frequency (FM) of this signal (see Numerical Methods). Unlike the pure-tone stimulus, these signals carry information in their modulations. We therefore think of the detector not only as a mechanical resonator, but also as an information-theoretic receiver. We employ transfer entropy as the measure of information captured by the receiver [38]. This measure is particularly useful, as it carries no assumptions about which features of the external signal are important. Instead, it directly measures the amount of information transmitted from one process to another, and can even be used to establish causality between two processes. For both types of modulation, we find the oscillatory, chaotic regime to yield optimal detection of information carried by the signal. We also demonstrate the robustness of this measure to stochastic fluctuations and show that this regime is preferred over the critical, non-chaotic regime (Fig. 4).
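Transfer entropy itself can be estimated with a simple histogram (binned) estimator; the sketch below is a minimal version with single-sample histories and equal-occupancy bins, not the estimator used in the study.

```python
import numpy as np

def transfer_entropy(x, y, n_bins=4):
    """Binned estimate of the transfer entropy TE_{X->Y} (in bits): how much
    the past of x reduces uncertainty about the next value of y, beyond what
    y's own past provides."""
    def discretize(s):
        # equal-occupancy bins via rank transformation
        ranks = np.argsort(np.argsort(s))
        return (ranks * n_bins // len(s)).astype(int)
    xd, yd = discretize(x), discretize(y)
    y_next, y_past, x_past = yd[1:], yd[:-1], xd[:-1]
    def entropy(*cols):
        joint = np.ravel_multi_index(cols, (n_bins,) * len(cols))
        counts = np.bincount(joint, minlength=n_bins ** len(cols))
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))
    # TE = H(y+, y) + H(y, x) - H(y+, y, x) - H(y)
    return (entropy(y_next, y_past) + entropy(y_past, x_past)
            - entropy(y_next, y_past, x_past) - entropy(y_past))

rng = np.random.default_rng(0)
x = rng.standard_normal(20_000)
y = np.roll(x, 1) + 0.1 * rng.standard_normal(20_000)  # y copies x with a one-step lag
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(te_xy > te_yx)  # True: information flows from x to y, not the reverse
```

The asymmetry between the two directions is what allows transfer entropy to suggest causality between processes, as noted in the text.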

V Temporal Acuity
High temporal acuity is essential for a sensory system to be responsive to brief signals. Further, localization of sound by vertebrates relies on interaural time differences as small as a few microseconds, which correspond to temporal differences of just a fraction of a single stimulus cycle. When such small differences are biologically meaningful, the system’s response to a stimulus and its subsequent return to steady state must occur rapidly. To characterize this temporal acuity, we apply a step stimulus and measure the time the system requires to reach its steady state response, as well as the time it takes to return to its unstimulated steady state after cessation of the signal.
We previously found that nonisochronicity, and therefore the degree of chaos, greatly increases the speed of response to a step stimulus. These results are consistent with experimental measurements [19]. In the present work, we extend our analysis to variations in the control parameter and determine the temporal acuity of the Hopf oscillator in the $(\mu, \beta)$-plane. We apply a step stimulus to the detector (Fig. 5A), averaging over many simulations with different initial conditions and realizations of noise. This method of calculating the mean response averages out the stochastic fluctuations, as well as any autonomous oscillations that would otherwise obscure slow modulations of the mean-field response.
We define the response time ($\tau_{response}$) to be the time the system takes to settle to and remain within 4 standard deviations of its steady-state mean value following the onset of the step stimulus. Likewise, we define the return time ($\tau_{return}$) to be the time it takes the mean response of the system to become indistinguishable (within a standard deviation) from its value prior to the stimulus. We consider the limiting factor, $\tau = \max(\tau_{response}, \tau_{return})$, to be the larger of the two time constants. For all parameters tested, the return time was the limiting factor (Fig. 5D-F). For simplicity, we therefore drop the subscripts and let $\tau$ denote this limiting time scale.
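A deterministic caricature of this measurement is sketched below, with a fixed tolerance band standing in for the standard-deviation criteria (which require noisy ensembles); all parameter values are illustrative.

```python
import numpy as np

def step_response_times(mu, beta, F0, omega0=1.0, dt=1e-3, t_step=50.0,
                        t_total=100.0, tol=0.02):
    """Apply an on-resonance step stimulus at t = 0 and remove it at t = t_step.
    Return (t_response, t_return): the times for the envelope |z| to settle and
    remain within a tolerance band of its stimulated / unstimulated steady state."""
    n_steps = int(t_total / dt)
    t = np.arange(n_steps) * dt
    z = 0j
    r = np.empty(n_steps)
    for n in range(n_steps):
        F = F0 * np.exp(1j * omega0 * t[n]) if t[n] < t_step else 0.0
        z = z + (((mu + 1j * omega0) - (1 + 1j * beta) * abs(z) ** 2) * z + F) * dt
        r[n] = abs(z)
    n_on = int(t_step / dt)
    r_on = r[n_on - 1]                             # stimulated steady-state amplitude
    outside = np.where(np.abs(r[:n_on] - r_on) > tol * r_on)[0]
    t_response = t[outside[-1] + 1] if len(outside) else 0.0
    outside = np.where(r[n_on:] > tol * r_on)[0]   # decay back to the quiescent state
    t_return = dt * (outside[-1] + 1) if len(outside) else 0.0
    return t_response, t_return

# Quiescent detector (mu < 0): both time scales are set by 1/|mu| (about 4/|mu|
# for a 2% band), illustrating the slowing down as mu approaches zero from below.
t_resp, t_ret = step_response_times(mu=-1.0, beta=0.0, F0=0.01)
```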
Stochastic fluctuations not only obscure the mean-field response, but can also distort it. As the noise level increases, so does the average amplitude of the system. This increase in baseline amplitude reduces the distance traveled by the returning mean response, effectively increasing the speed ($1/\tau$) according to our definition. We measure the speed over five orders of magnitude in noise strength and find that the chaotic, oscillatory system returns faster than the critical system (Fig. 5B). For extremely large levels of noise, the effects of critical slowing down are removed. However, the levels of noise sufficient to equalize the speeds of the chaotic and non-chaotic responses are so large that they vastly reduce the sensitivity of the system in both regimes (Fig. 4A, C, and E).

VI Frequency Selectivity vs Broadband Detection
Although chaos can make dynamical systems sensitive to small perturbations and external signals, these systems tend to synchronize to a broad range of frequencies. This may be a beneficial effect for broadband detectors, such as vestibular systems, as it would increase the energy transmitted to the system and likely result in lower thresholds of detection. However, for frequency selective detectors, like those of auditory systems, this effect may be harmful.
We stimulate the Hopf oscillator with additive Gaussian white noise and measure the power spectra of the response throughout the parameter space (Fig. 6A). As these curves indicate how sharply the system filters the white noise, we use them to characterize the frequency selectivity [36]. We observe that the frequency selectivity increases with increasing $\mu$ and decreasing $\beta$. Near the Hopf bifurcation, we see that the resonance frequency depends on the noise strength, $D$. This is a consequence of the noise altering the steady-state amplitude, and thereby the frequency, in this compliant regime. In the deterministic system, by contrast, the limit-cycle frequency $\Omega = \omega_0 - \beta\mu$ (for $\alpha = 1$) is independent of $D$ by definition. Throughout the parameter space, $\omega_0$ is varied along with the other parameters, so as to keep $\Omega$ constant.
To characterize the frequency selectivity of the Hopf oscillator, we employ the quality factor of the response, estimated from the full width at half maximum ($\Delta f$) of the response curves. The unitless quality factor is defined as $Q = f_0/\Delta f$, where $f_0$ is the peak frequency. As expected, the most frequency-selective parameter regime occurs at high values of $\mu$ and low values of $\beta$ (Fig. 6B). The quality factor is useful for characterizing the frequency selectivity of a single-frequency or narrowband detector.
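Extracting $Q$ from a sampled response curve reduces to locating the peak and the half-maximum crossings on either side. A minimal sketch, validated here on an analytic Lorentzian rather than a simulated spectrum:

```python
import numpy as np

def quality_factor(freqs, psd):
    """Q = f0 / (full width at half maximum), estimated from a sampled
    response curve by locating the peak and the half-maximum crossings."""
    i0 = int(np.argmax(psd))
    half = psd[i0] / 2.0
    below_left = np.where(psd[:i0] < half)[0]
    below_right = np.where(psd[i0:] < half)[0]
    f_lo = freqs[below_left[-1]] if len(below_left) else freqs[0]
    f_hi = freqs[i0 + below_right[0]] if len(below_right) else freqs[-1]
    return freqs[i0] / (f_hi - f_lo)

# sanity check on an analytic Lorentzian: f0 = 10, FWHM = 0.5, so Q = 20
f = np.linspace(5.0, 15.0, 100_001)
S = 1.0 / (1.0 + ((f - 10.0) / 0.25) ** 2)
print(quality_factor(f, S))  # ~20
```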
As some auditory and most vestibular systems are responsible for detecting a broader range of frequencies, we utilize the threshold bandwidth for characterizing multi-frequency or broadband detection, a metric suggested in [37]. We estimate this measure by taking the range of frequencies whose Fourier components have magnitudes exceeding a given threshold, chosen to sit a fixed factor above the noise floor of the critical system. The threshold bandwidth increases with increasing $\mu$, as the energy from the spontaneous oscillations amplifies the signal (Fig. 6C). Further, the threshold bandwidth initially increases with increasing $\beta$, due to the broadening of the frequency response curves. However, the bandwidth then diminishes for very large values of $\beta$, as the energy becomes so spread out in frequency space that few components exceed the threshold.
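The threshold bandwidth can be read off a sampled spectrum in the same way; the threshold value below is arbitrary, chosen only for the synthetic example.

```python
import numpy as np

def threshold_bandwidth(freqs, psd, threshold):
    """Range of frequencies whose spectral components exceed a fixed
    threshold: a simple figure of merit for broadband detection."""
    above = freqs[psd > threshold]
    return above.max() - above.min() if len(above) else 0.0

# For a Lorentzian peak, a threshold at half maximum recovers roughly the FWHM
f = np.linspace(0.0, 20.0, 2001)
S = 1.0 / (1.0 + ((f - 10.0) / 0.25) ** 2)   # peak at f0 = 10, FWHM = 0.5
print(threshold_bandwidth(f, S, 0.5))
```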
VII Power-law Scaling of Response
The human auditory system can detect a range of stimulus amplitudes spanning over 6 orders of magnitude in pressure. This process relies on nonlinearities in order to both amplify weak sounds and attenuate loud sounds, thereby protecting the system from damage. These nonlinearities have been measured in vivo through otoacoustic emissions and through laser measurements of basilar membrane motion. Further, the attenuation of large amplitudes has been observed in vitro on active hair bundles [16]. At weak forcing, hair cells display a linear response; as the stimulus amplitude increases, the amplitude response scales as the one-third power of the forcing, $r \propto F^{1/3}$. The linear and one-third-power responses are reproduced well by the Hopf oscillator, as either the linear or the cubic term dominates in the respective regime. Near criticality, the range over which the power law is observed increases.
We have demonstrated that the nonisochronicity parameter, $\beta$, improves the sensitivity to weak signals. We here show that this parameter also causes stronger attenuation of the response at large stimulus amplitudes. This increased attenuation can be understood by calculating the response in the strong-forcing limit, where the cubic term dominates (see the Appendix). In this limit, the response amplitude is scaled by a factor of $(1 + \beta^2)^{-1/6}$. The combination of these two effects yields an increase in the range of stimulus levels over which amplitude compression is observed and, hence, increases the dynamic range of the system. In Fig. 7, we show that a strongly nonisochronous, chaotic oscillator compresses the response amplitude by approximately an order of magnitude more than the isochronous oscillator.
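Both the one-third-power scaling and the nonisochronous attenuation can be checked directly by integrating the forced system in the rotating frame, where at criticality the locked response obeys $r = F^{1/3}(1+\beta^2)^{-1/6}$ exactly. The parameter values below are illustrative choices.

```python
import numpy as np

def driven_amplitude(mu, beta, F0, dt=1e-3, t_total=50.0):
    """Steady-state response amplitude under on-resonance forcing, integrated
    in the frame rotating at the stimulus frequency (z = w exp(i*omega0*t)):
    dw/dt = mu*w - (1 + i*beta) |w|^2 w + F0."""
    w = 0j
    for _ in range(int(t_total / dt)):
        w = w + (mu * w - (1 + 1j * beta) * abs(w) ** 2 * w + F0) * dt
    return abs(w)

# At criticality (mu = 0) the response follows r = F^(1/3) * (1 + beta^2)^(-1/6)
forces = np.array([0.1, 1.0, 10.0])
amps = np.array([driven_amplitude(0.0, 3.0, F) for F in forces])
slope = np.polyfit(np.log10(forces), np.log10(amps), 1)[0]
print(round(slope, 2))  # ~0.33, the compressive one-third power law
```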

VIII Optimal Signal Detector
The optimal parameter regime depends heavily on the application of the signal detector and the desired specifications. For single-frequency or narrowband detectors, it may be favorable to have the quality factor of the response as high as possible. However, for a broadband detector, a large quality factor would be unfavorable, as it would attenuate frequencies that should be captured. We note that there exists a spectrum of desired capabilities of signal detectors and no single metric can fully characterize the performance. We therefore incorporate all six measures into a single score, which can be weighted in accordance with the application of the system. We first define the vector
$$\mathbf{V} = \left(\chi,\; TE_{FM},\; TE_{AM},\; Q,\; BW,\; 1/\tau\right), \tag{6}$$

where the elements of this vector are the on-resonance linear response, the transfer entropy from an FM stimulus, the transfer entropy from an AM stimulus, the quality factor, the threshold bandwidth, and the speed of response, respectively. Each element is linearly scaled such that it runs from 0 to 1 between its minimum and maximum values in the parameter space. To better illustrate the optimal regime, we scale the measures such that they saturate at the 90th percentile of their distributions. We then define the detection index to be the sum of the elements of $\mathbf{V}$, scaled by relative weights of importance,
$$d = \mathbf{W} \cdot \mathbf{V}, \tag{7}$$

where $\mathbf{W}$ is a vector whose entries represent the relative weights of importance of the measures.
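The scaling and weighting described above amount to a few lines of array code; a minimal sketch with toy numbers (the percentile saturation value is the only detail taken from the text):

```python
import numpy as np

def detection_index(measures, weights, p_sat=90):
    """Combine per-regime detection measures into a single weighted score.
    measures: (n_metrics, n_points) array over the parameter space. Each
    metric is min-max scaled to [0, 1], saturating at its p_sat-th
    percentile, then the metrics are combined with the given weights."""
    measures = np.asarray(measures, dtype=float)
    scaled = np.empty_like(measures)
    for i, m in enumerate(measures):
        lo, hi = m.min(), np.percentile(m, p_sat)
        scaled[i] = np.clip((m - lo) / (hi - lo), 0.0, 1.0)
    return np.asarray(weights, dtype=float) @ scaled

# toy example: two metrics evaluated at five points of the parameter space
m = [[0.0, 1.0, 2.0, 3.0, 4.0],    # e.g. linear response
     [4.0, 3.0, 2.0, 1.0, 0.0]]    # e.g. quality factor
idx = detection_index(m, weights=[1.0, 0.0])
print(idx.argmax())  # the weighting selects the point maximizing the first metric
```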
The detection index could optimize at several different places, depending on the desired specifications. From the biological perspective, not only do auditory and vestibular sensors have different demands, but moreover, the same end organs from different species are likely to be optimized for different environments and various conspecific calls. Furthermore, even the same system may self-tune into different regimes when placed in different acoustic surroundings. These variations can readily be captured by the appropriate selection of the weight vector, $\mathbf{W}$. We show a possible selection of weights that illustrates how a narrowband auditory organ may be optimized.
This narrowband detector should prioritize the linear response function at resonance, the quality factor of the response, the speed of response, and the transfer entropy from a narrowband FM signal. Since a large threshold bandwidth would be harmful to this detector's purpose, we set its weight to zero. Using these weights, we find that the optimal regime resides far from the Hopf bifurcation, plateauing at large values of $\mu$, and with a small amount of chaos (Fig. 8C). For a broadband, vestibular organ, we predict that the optimal regime is in the oscillatory regime, far from the Hopf bifurcation and with a higher degree of chaos (see the Appendix).
IX Discussion
We have determined the performance of the Hopf oscillator as a signal detector throughout its parameter space by varying the proximity to criticality and the degree of chaos. To the best of our knowledge, the intersection of these two properties has not previously been studied in the context of optimal signal detection. We first calculated the Lyapunov exponent of the system in the $(\mu, \beta)$-plane, near the Hopf bifurcation, characterizing the level of chaos induced by noise. We showed the breakdown of a previous analytic approximation as the system approaches the Hopf bifurcation. Instead of diverging, the Lyapunov exponent decreases to zero continuously and becomes negative for $\mu < 0$, as is expected for dynamics near a fixed point. This allows us to correlate the degree of chaos with the nonisochronicity parameter, $\beta$.
Providing the oscillator with several types of stimulus, we demonstrated that the sensitivity and temporal acuity of the chaotic system can exceed those of the critical system, regardless of the level of external noise. Further, we showed that the chaotic system compresses the response to large-amplitude signals more strongly, simply due to the increase in the magnitude of the cubic term with $\beta$. This increases the dynamic range of stimulus amplitudes the system can detect before damage occurs. All three of these measures (sensitivity, temporal acuity, and amplitude compression) have been shown to be important aspects of signal detection by the auditory and vestibular systems.
The optimal parameter regime for frequency selectivity, when assessed in isolation, depends heavily on the application of the detector. The tuning curves broaden as the system approaches criticality from the oscillatory side, and when the level of nonisochronicity is increased. The most sharply-tuned detectors can be found when the system is isochronous and as far into the oscillatory regime as possible. Hence, the only one of our metrics that is degraded by chaos is the quality factor. Note, however, that it is very weakly reduced by small amounts of nonisochronicity. The other five metrics are improved by chaos (Fig. 8A-B and Fig. 9A-B). Temporal acuity monotonically increases with the degree of chaos, while the other metrics (linear response function, transfer entropy, and threshold bandwidth) are maximized at a finite level of chaos.
When considering all measures together, the optimal parameter regime depends on the application of the detector and the desired specifications. We consider two scenarios when proposing optimal regimes. The first is a single-frequency or narrowband detector. When incorporating all of the relevant measures for this detector, we find that the optimal regime is in the oscillatory state, far from the Hopf bifurcation, with weakly chaotic dynamics. We propose that this regime would be well-suited for frequency-selective auditory systems. For the second scenario (see the Appendix), we consider a system designed to detect as broad a range of frequencies as possible, and capture information from transient stimuli, which tend to contain many frequencies. In this case, we show that the optimal regime resides in the oscillatory regime, with a large degree of chaos. We propose that this regime is well-suited for vestibular systems.
Overall, we show that all the relevant detection properties are enhanced by the presence of chaos. Specifically, the issue of critical slowing down is removed, while sensitivity is enhanced. Hence, this tradeoff inherent in critical systems is resolved, and evaluating different regimes of detection - broadband versus frequency selective - changes only the optimal level of chaos. Finally, optimal performance is achieved, not at a specific point, but rather in a broad range of parameter space, which would endow the biological system with flexibility and robustness and obviate the need for extremely precise fine-tuning of parameters.
The current study considers only individual Hopf oscillators, representing single sensory elements. In certain species, the sensory hair bundles are free-standing [39]; however, the hair bundles of most vestibular and auditory organs studied display some degree of mechanical coupling to each other. The strength and extent of this coupling vary greatly for different specializations of the sensory organ [40]. Considering the diverse configurations of mechanical coupling, we have focused this study on the ability of individual Hopf oscillators to detect signals. We speculate that the performance of the coupled system will correlate with that of its individual elements. However, we point out two issues of individual detectors that have been shown to be resolved in the coupled system. The first is the smearing out of the critical point in the presence of noise (Fig. 1B). It has been shown that, although noise removes criticality in the individual Hopf oscillator, it can be restored in the coupled system [41]. Second, although chaos deteriorates the frequency selectivity of individual detectors, it can be restored in arrays of coupled detectors, provided that the elements synchronize to each other [42].
Nonisochronicity is often excluded from numerical models of these sensory systems. It greatly increases the complexity of the system, leading to multi-stability and a chaotic response to various types of signals [34, 19], including white noise. Further, the nonisochronous term in the Hopf oscillator was shown to lead to violation of a generalized version of the fluctuation dissipation theorem, breaking any simple relation between the system’s sensitivity to stimulus and its susceptibility to stochastic fluctuations [43]. In the isochronous picture, poising a system near the Hopf bifurcation can be beneficial. The transfer entropy is maximized (Fig. 4), and the system can achieve large, entrained responses to weak signals, provided that the noise is sufficiently weak. However, to make the system robust to noise, it must reside in the oscillatory regime, thereby utilizing the energy of the autonomous motion to amplify the signal. This results in a tradeoff along the control-parameter axis, with an optimal value at which the system is close enough to the bifurcation to entrain to a signal, but oscillatory enough to be robust to noise [37].
When nonisochronicity and chaos are introduced, the optimal regime moves deep into the oscillatory regime, and the entrainability can then be controlled by the level of chaos. The nonisochronicity parameter can be tuned to optimize the detection capabilities of the system, with different optimal values depending on the application of the system. We also note that this parameter controls the degree of synchronization in an array of coupled Hopf oscillators [44]. Along with the proximity to criticality, we speculate that the degree of chaos is an important characteristic of other biological systems responsible for detecting signals, as well as of those that exhibit synchronization of their active components.
Numerical Methods
Stochastic differential equations were solved using Heun’s method with time steps ranging from to .
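A minimal sketch of this integration scheme, applied to an assumed standard form of the noisy Hopf oscillator (the normal form and all parameter values here are illustrative, not the study's exact settings):

```python
import numpy as np

def heun_hopf(mu, omega0, beta, D, dt, n_steps, z0=0.1 + 0.0j, seed=0):
    """Integrate dz/dt = (mu + i*omega0) z - (1 + i*beta)|z|^2 z + eta(t)
    with the stochastic Heun (predictor-corrector) scheme. The additive
    noise increment is shared by the predictor and corrector stages."""
    drift = lambda z: (mu + 1j * omega0) * z - (1 + 1j * beta) * np.abs(z) ** 2 * z
    rng = np.random.default_rng(seed)
    zs = np.empty(n_steps + 1, dtype=complex)
    zs[0] = z0
    for n in range(n_steps):
        xi = np.sqrt(2 * D * dt) * (rng.standard_normal()
                                    + 1j * rng.standard_normal())
        z = zs[n]
        z_pred = z + drift(z) * dt + xi                             # Euler predictor
        zs[n + 1] = z + 0.5 * (drift(z) + drift(z_pred)) * dt + xi  # trapezoidal corrector
    return zs
```

With $D = 0$ and $\mu > 0$, the trajectory relaxes to the limit cycle of radius $\sqrt{\mu}$, which provides a quick sanity check of the integrator.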
Response Amplitude & Linear Response Function
To determine phase-locked amplitudes, we first compute the average response to a sinusoidal stimulus over 64 systems, each prepared with a different initial phase, uniformly spaced across the deterministic limit cycle. This method ensures that any non-synchronized oscillations at the stimulus frequency will average to zero, and only signals that lock to the stimulus will be counted toward the response. We then fit a sinusoid to the mean response, with the frequency fixed to the stimulus frequency. We define the phase-locked amplitude as the amplitude of this fit. We then compute the linear response function by dividing this response amplitude by the forcing amplitude.
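The averaging-and-fitting procedure can be sketched as follows; for brevity this version is deterministic and uses a simple Euler step, and all parameter values are illustrative:

```python
import numpy as np

def phase_locked_amplitude(mu=1.0, omega0=1.0, beta=0.0, F=0.05,
                           omega_s=1.0, dt=0.01, n_steps=60_000, n_osc=64):
    """Mean forced response of 64 oscillators started at phases uniformly
    spaced on the deterministic limit cycle, followed by a least-squares
    sinusoid fit with the frequency fixed at the stimulus frequency."""
    t = np.arange(n_steps) * dt
    z = np.sqrt(mu) * np.exp(2j * np.pi * np.arange(n_osc) / n_osc)
    resp = np.zeros(n_steps)
    for n in range(n_steps):
        resp[n] = z.real.mean()          # population-averaged response
        dz = ((mu + 1j * omega0) * z - (1 + 1j * beta) * np.abs(z) ** 2 * z
              + F * np.exp(1j * omega_s * t[n]))
        z = z + dz * dt                  # Euler step (deterministic sketch)
    keep = t > t[-1] / 2                 # discard the locking transient
    c = 2 * np.mean(resp[keep] * np.cos(omega_s * t[keep]))
    s = 2 * np.mean(resp[keep] * np.sin(omega_s * t[keep]))
    return np.hypot(c, s)                # amplitude of the fitted sinusoid
```

With the forcing turned off, the 64 uniformly spaced phases cancel, and the fitted amplitude is essentially zero; only phase-locked motion survives the average.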
Frequency-Modulated (FM) and Amplitude-Modulated (AM) Stimuli
The frequency-modulated forcing takes the form
\[
F(t) = F\, e^{i\phi_{\mathrm{stim}}(t)},
\tag{8}
\]
where $\phi_{\mathrm{stim}}(t)$ is the instantaneous phase of the stimulus, and the forcing amplitude $F$ is held fixed. The instantaneous stimulus frequency, $\omega_{\mathrm{stim}}(t)$, is centered at the characteristic frequency $\omega_0$, with additive stochastic fluctuations,
\[
\omega_{\mathrm{stim}}(t) = \omega_0 + \eta_\omega(t),
\tag{9}
\]
where $\eta_\omega(t)$ is low-pass filtered Gaussian white noise (pink noise) with a brick-wall frequency cutoff. The standard deviation of this variable sets the depth of the frequency modulation. We can then calculate the instantaneous phase of the stimulus,
\[
\phi_{\mathrm{stim}}(t) = \int_0^t \omega_{\mathrm{stim}}(t')\, dt'.
\tag{10}
\]
Information production is determined solely by the frequency modulator, $\eta_\omega(t)$. This method of generating the signal does not influence the amplitude and allows us to examine the effects of information transmission through frequency modulation alone.
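One way to realize this construction numerically is to brick-wall filter white noise by FFT masking, rescale it to the target standard deviation, and integrate to obtain the phase (all values below are illustrative):

```python
import numpy as np

def fm_stimulus(n_steps=100_000, dt=0.01, omega0=1.0, sigma=0.1,
                f_cut=0.05, F=0.1, seed=0):
    """Frequency-modulated forcing F*exp(i*phi(t)): the instantaneous
    frequency is omega0 plus Gaussian white noise passed through a
    brick-wall low-pass filter and rescaled to standard deviation sigma."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n_steps)
    spec = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n_steps, d=dt)   # cycles per unit time
    spec[freqs > f_cut] = 0.0                # brick-wall cutoff
    slow = np.fft.irfft(spec, n=n_steps)
    slow *= sigma / slow.std()               # set the modulation depth
    omega_inst = omega0 + slow               # instantaneous frequency
    phi = np.cumsum(omega_inst) * dt         # phase = integral of frequency
    return F * np.exp(1j * phi), omega_inst
```

By construction the forcing magnitude stays exactly at $F$ while the instantaneous frequency fluctuates about $\omega_0$ with standard deviation $\sigma$.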
Similarly, amplitude-modulated signals take the form
\[
F(t) = A(t)\, e^{i\omega_0 t},
\tag{11}
\]
where $A(t)$ is low-pass filtered Gaussian white noise (pink noise) with a brick-wall frequency cutoff. The mean and standard deviation of this stochastic amplitude modulator set the carrier amplitude and the modulation depth, respectively.
Transfer Entropy
The transfer entropy [38] from process to process is defined as
\[
T_{Y \to X} = \sum p\!\left(x_{n+1},\, x_n^{(k)},\, y_n^{(l)}\right) \log \frac{p\!\left(x_{n+1} \mid x_n^{(k)},\, y_n^{(l)}\right)}{p\!\left(x_{n+1} \mid x_n^{(k)}\right)},
\tag{12}
\]
where $x_n^{(k)} = (x_n, \ldots, x_{n-k+1})$ denotes the $k$ most recent states of process $X$, and similarly $y_n^{(l)}$ for process $Y$. Therefore, $p(x_{n+1} \mid x_n^{(k)}, y_n^{(l)})$ is the conditional probability of finding process $X$ in state $x_{n+1}$ at time $n+1$, given that the previous $k$ states of process $X$ were $x_n^{(k)}$ and that the previous $l$ states of process $Y$ were $y_n^{(l)}$. The summation runs over all points in the time series and over all accessible states of both processes. The transfer entropy measures how much one’s ability to predict the future of process $X$ is improved upon learning the history of process $Y$. The measure is asymmetric upon switching $X$ and $Y$, as information transfer between two processes is not necessarily symmetric. For stimulus-response data, this asymmetry allows for control tests by measuring the transfer entropy from the response to the stimulus, which should be zero. The choice of $k$ and $l$ has little effect on the results; we sample the 5 history points such that they span one mean period of the system. We discretize the signal into 4 amplitude bins; similar results were obtained using 2 bins.
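A minimal plug-in estimator in this spirit, simplified to one-sample histories ($k = l = 1$) rather than the period-spanning histories used here:

```python
import numpy as np

def transfer_entropy(x, y, n_bins=4):
    """Plug-in transfer-entropy estimate T_{Y->X} in bits, using
    one-sample histories (k = l = 1) for brevity. Amplitudes are
    discretized into n_bins equal-width bins."""
    def digitize(s):
        s = np.asarray(s, dtype=float)
        edges = np.linspace(s.min(), s.max(), n_bins + 1)[1:-1]
        return np.digitize(s, edges)
    xd, yd = digitize(x), digitize(y)
    xf, xp, yp = xd[1:], xd[:-1], yd[:-1]   # future of X, pasts of X and Y
    def H(*streams):
        """Joint Shannon entropy (bits) of the given symbol streams."""
        code = np.zeros(len(streams[0]), dtype=np.int64)
        for s in streams:
            code = code * n_bins + s        # pack symbols into one integer
        p = np.bincount(code).astype(float)
        p = p[p > 0] / p.sum()
        return -(p * np.log2(p)).sum()
    # T_{Y->X} = H(X'|X) - H(X'|X,Y), expanded into joint entropies
    return H(xf, xp) - H(xp) - H(xf, xp, yp) + H(xp, yp)
```

Reversing the arguments provides the control test described above: the estimate from a driven series back to its driver should be near zero.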
Return Time
We determine the return time by calculating the average response, over 64 simulations, to a large-amplitude step stimulus,
\[
F(t) = F_0\left[\Theta(t - t_1) - \Theta(t - t_2)\right],
\tag{13}
\]
where $\Theta(t)$ is the Heaviside step function, and $F_0$, $t_1$, and $t_2$ set the amplitude, onset, and offset of the step. We define the return time as the time it takes the mean response of the system to return to a value within one standard deviation of the mean steady-state amplitude, as measured from the data prior to the step onset.
Quality Factor & Threshold Bandwidth
To estimate the quality factor of the system response, we stimulate with additive white Gaussian noise and calculate the average power spectrum over 60 simulations, each with different initial conditions and realizations of the noise. This produces a smooth curve, from which we estimate the full width at half maximum and thereby the quality factor of the response. We then use this curve to calculate the threshold bandwidth by determining the range of frequencies for which the Fourier amplitudes exceed a fixed threshold, chosen to be approximately an order of magnitude above the noise floor.
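The two spectral measures can be extracted from a trial-averaged amplitude spectrum as sketched below; the array shapes and threshold handling are our own conventions:

```python
import numpy as np

def spectrum_metrics(resp, dt, threshold):
    """From trial-averaged Fourier amplitudes, estimate the quality
    factor (peak frequency / full width at half maximum) and the
    threshold bandwidth (span of frequencies whose amplitude exceeds
    `threshold`). `resp` has shape (n_trials, n_samples)."""
    amps = np.abs(np.fft.rfft(resp, axis=-1)).mean(axis=0)
    freqs = np.fft.rfftfreq(resp.shape[-1], d=dt)
    pk = amps[1:].argmax() + 1               # ignore the DC bin
    half = amps[pk] / 2
    lo = hi = pk                             # walk out to the half-maximum
    while lo > 0 and amps[lo - 1] >= half:
        lo -= 1
    while hi < len(amps) - 1 and amps[hi + 1] >= half:
        hi += 1
    fwhm = freqs[hi] - freqs[lo]
    q_factor = freqs[pk] / fwhm if fwhm > 0 else np.inf
    over = freqs[amps >= threshold]
    bandwidth = over.max() - over.min() if over.size else 0.0
    return q_factor, bandwidth
```

A narrowband response (a tone in weak noise) yields a high quality factor and a small threshold bandwidth, whereas a broadband response yields the opposite.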
Acknowledgments
The authors gratefully acknowledge the support of NSF Biomechanics and Mechanobiology under Grant No. 1916136.
References
- Dalhoff et al. [2007] E. Dalhoff, D. Turcanu, H.-P. Zenner, and A. W. Gummer, Distortion product otoacoustic emissions measured as vibration on the eardrum of human subjects, Proceedings of the National Academy of Sciences 104, 1546 (2007).
- Leshowitz [2005] B. Leshowitz, Measurement of the Two-Click Threshold, The Journal of the Acoustical Society of America 49, 462 (2005).
- Brughera et al. [2013] A. Brughera, L. Dunai, and W. M. Hartmann, Human interaural time difference thresholds for sine tones: The high-frequency limit, The Journal of the Acoustical Society of America 133, 2839 (2013).
- Grothe et al. [2010] B. Grothe, M. Pecka, and D. McAlpine, Mechanisms of Sound Localization in Mammals, Physiological Reviews 90, 983 (2010).
- Hudspeth [2014] A. J. Hudspeth, Integrating the active process of hair cells with cochlear function, Nat Rev Neurosci 15, 600 (2014).
- Martin et al. [2001] P. Martin, A. J. Hudspeth, and F. Jülicher, Comparison of a hair bundle’s spontaneous oscillations with its response to mechanical stimulation reveals the underlying active process, Proceedings of the National Academy of Sciences 98, 14380 (2001).
- Hudspeth [2008] A. J. Hudspeth, Making an Effort to Listen: Mechanical Amplification in the Ear, Neuron 59, 530 (2008).
- Eguíluz et al. [2000] V. M. Eguíluz, M. Ospeck, Y. Choe, A. J. Hudspeth, and M. O. Magnasco, Essential Nonlinearities in Hearing, Phys. Rev. Lett. 84, 5232 (2000).
- Walker [1978] D. P. Walker, Studies in Musical Science in the Late Renaissance (Univ. London Press, 1978) pp. 136–145.
- Jaramillo et al. [1993] F. Jaramillo, V. S. Markin, and A. J. Hudspeth, Auditory illusions and the single hair cell, Nature 364, 527 (1993).
- Barral and Martin [2012] J. Barral and P. Martin, Phantom tones and suppressive masking by active nonlinear oscillation of the hair-cell bundle, Proceedings of the National Academy of Sciences 109, E1344 (2012).
- Gold and Pumphrey [1948] T. Gold and R. J. Pumphrey, Hearing. I. the cochlea as a frequency analyzer, Proceedings of the Royal Society B 135, 462 (1948).
- Gold [1948] T. Gold, Hearing. II. the physical basis of the action of the cochlea, Proceedings of the Royal Society B 135, 492 (1948).
- Reichenbach and Hudspeth [2014] T. Reichenbach and A. J. Hudspeth, The physics of hearing: Fluid mechanics and the active process of the inner ear., Reports on progress in physics 77, 076601 (2014).
- Duke and Jülicher [2008] T. A. J. Duke and F. Jülicher, Critical Oscillators as Active Elements in Hearing, Active processes and otoacoustic emissions in hearing , 63 (2008).
- Martin and Hudspeth [2001] P. Martin and A. J. Hudspeth, Compressive nonlinearity in the hair bundle’s active response to mechanical stimulation., Proceedings of the National Academy of Sciences 98, 14386 (2001), 3057286 .
- Robles and Ruggero [2001] L. Robles and M. A. Ruggero, Mechanics of the Mammalian Cochlea, Physiological Reviews 81, 1305 (2001).
- Choe et al. [1998] Y. Choe, M. O. Magnasco, and A. J. Hudspeth, A model for amplification of hair-bundle motion by cyclical binding of Ca2+ to mechanoelectrical-transduction channels, Proceedings of the National Academy of Sciences 95, 15321 (1998).
- Faber and Bozovic [2019a] J. Faber and D. Bozovic, Chaotic Dynamics Enhance the Sensitivity of Inner Ear Hair Cells, Sci Rep 9, 18394 (2019a).
- Faber and Bozovic [2018] J. Faber and D. Bozovic, Chaotic Dynamics of Inner Ear Hair Cells, Sci Rep 8, 3366 (2018).
- Kaplan et al. [1996] D. T. Kaplan, J. R. Clay, T. Manning, L. Glass, M. R. Guevara, and A. Shrier, Subthreshold Dynamics in Periodically Stimulated Squid Giant Axons, Phys. Rev. Lett. 76, 4074 (1996).
- O’Byrne and Jerbi [2022] J. O’Byrne and K. Jerbi, How critical is brain criticality?, Trends in Neurosciences 45, 820 (2022).
- LeMasurier and Gillespie [2005] M. LeMasurier and P. G. Gillespie, Hair-Cell Mechanotransduction and Cochlear Amplification, Neuron 48, 403 (2005).
- Vollrath et al. [2007] M. A. Vollrath, K. Y. Kwan, and D. P. Corey, The Micromachinery of Mechanotransduction in Hair Cells, Annual Review of Neuroscience 30, 339 (2007).
- Martin et al. [2003] P. Martin, D. Bozovic, Y. Choe, and A. J. Hudspeth, Spontaneous oscillation by hair bundles of the bullfrog’s sacculus., J. Neurosci. 23, 4533 (2003).
- Benser et al. [1996] M. E. Benser, R. E. Marquis, and A. J. Hudspeth, Rapid, active hair bundle movements in hair cells from the bullfrog’s sacculus, J Neurosci 16, 5629 (1996).
- Crawford and Fettiplace [1985] A. C. Crawford and R. Fettiplace, The mechanical properties of ciliary bundles of turtle cochlear hair cells., The Journal of Physiology 364, 359 (1985).
- Narins and Lewis [1984] P. M. Narins and E. R. Lewis, The vertebrate ear as an exquisite seismic sensor, The Journal of the Acoustical Society of America 76, 1384 (1984).
- Martin and Hudspeth [1999] P. Martin and A. J. Hudspeth, Active hair-bundle movements can amplify a hair cell’s response to oscillatory mechanical stimuli, Proceedings of the National Academy of Sciences 96, 14306 (1999).
- Fredrickson-Hemsing et al. [2012] L. Fredrickson-Hemsing, C. E. Strimbu, Y. Roongthumskul, and D. Bozovic, Dynamics of Freely Oscillating and Coupled Hair Cell Bundles under Mechanical Deflection, Biophysical Journal 102, 1785 (2012).
- Roongthumskul et al. [2021] Y. Roongthumskul, J. Faber, and D. Bozovic, Dynamics of Mechanically Coupled Hair-Cell Bundles of the Inner Ear, Biophys J 120, 205 (2021).
- Gleeson and O’Doherty [2006] J. P. Gleeson and F. O’Doherty, Non-Lorentzian Spectral Lineshapes Near a Hopf Bifurcation, SIAM J. Appl. Math. 66, 1669 (2006).
- Zhang and Golubitsky [2011] Y. Zhang and M. Golubitsky, Periodically Forced Hopf Bifurcation, SIAM J. Appl. Dyn. Syst. 10, 1272 (2011).
- Faber and Bozovic [2019b] J. Faber and D. Bozovic, Noise-induced chaos and signal detection by the nonisochronous Hopf oscillator, Chaos 29, 043132 (2019b).
- Goldobin and Pikovsky [2005] D. S. Goldobin and A. Pikovsky, Synchronization and desynchronization of self-sustained oscillators by common noise, Physical Review E 71, 15 (2005).
- Neiman et al. [2011] A. B. Neiman, K. Dierkes, B. Lindner, L. Han, and A. L. Shilnikov, Spontaneous voltage oscillations and response dynamics of a Hodgkin-Huxley type model of sensory hair cells, The Journal of Mathematical Neuroscience 1, 11 (2011).
- Ó Maoiléidigh and Hudspeth [2018] D. Ó Maoiléidigh and A. J. Hudspeth, Sinusoidal-signal detection by active, noisy oscillators on the brink of self-oscillation, Physica D: Nonlinear Phenomena 378–379, 33 (2018).
- Schreiber [2000] T. Schreiber, Measuring information transfer, Physical review letters 85, 461 (2000).
- Manley and Gallo [1997] G. A. Manley and L. Gallo, Otoacoustic emissions, hair cells, and myosin motors, The Journal of the Acoustical Society of America 102, 1049 (1997).
- Ó Maoiléidigh and Ricci [2019] D. Ó Maoiléidigh and A. J. Ricci, A bundle of mechanisms: Inner-ear hair-cell mechanotransduction, Trends in Neurosciences 42, 221 (2019).
- Risler et al. [2004] T. Risler, J. Prost, and F. Jülicher, Universal critical behavior of noisy coupled oscillators, Physical Review Letters 93, 175702 (2004).
- Faber et al. [2021] J. Faber, H. Li, and D. Bozovic, Synchronization and chaos in systems of coupled inner-ear hair cells, Phys. Rev. Res. 3, 013266 (2021).
- Sheth et al. [2021] J. Sheth, D. Bozovic, and A. J. Levine, Violation of generalized fluctuation-dissipation theorem in biological limit cycle oscillators with state-dependent internal drives: Applications to hair cell oscillations, Phys. Rev. Res. 3, 023150 (2021).
- Faber and Bozovic [2021] J. Faber and D. Bozovic, Chimera states and frequency clustering in systems of coupled inner-ear hair cells, Chaos 31, 073142 (2021).
Appendix

IX.1 Critical Slowing Down
We calculate below the time scale associated with transient solutions of the normal form equation, following a weak perturbation or stimulus. At steady state, the stable solution to Eq. 4 is $r_0 = \sqrt{\mu}$ for $\mu > 0$. We find solutions in the vicinity of this limit cycle by letting
\[
r(t) = \sqrt{\mu} + \delta r(t),
\tag{14}
\]
where $\delta r(t)$ is the perturbation induced by the stimulus. Plugging this in and utilizing $\delta r \ll \sqrt{\mu}$, we find, to first order,
\[
\frac{d(\delta r)}{dt} = -2\mu\, \delta r,
\tag{15}
\]
which has solutions that decay with characteristic time
\[
\tau = \frac{1}{2\mu}.
\tag{16}
\]
This calculation can also be done in the quiescent regime ($\mu < 0$), which yields $\tau = 1/|\mu|$.
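As a quick numerical check of this scaling, assuming the radial dynamics $\dot{r} = \mu r - r^3$, the relaxation time after a radial displacement is proportional to $1/\mu$ and therefore diverges at the bifurcation:

```python
import numpy as np

def radial_return_time(mu, dt=1e-4, tol=0.02):
    """Time for dr/dt = mu*r - r^3 to relax from r = 2*sqrt(mu) back to
    within a small tolerance of the limit-cycle radius sqrt(mu)."""
    r0 = np.sqrt(mu)
    r, t = 2.0 * r0, 0.0
    while r - r0 > tol * r0:
        r += (mu * r - r ** 3) * dt   # forward-Euler step of the radial equation
        t += dt
    return t
```

Rescaling $r \to \sqrt{\mu}\, r$ and $t \to t/\mu$ maps the equation onto itself, so quartering $\mu$ quadruples the return time, consistent with $\tau = 1/(2\mu)$.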
IX.2 Phase-Locking Range
We consider sinusoidal forcing in the absence of noise and express the solution in polar coordinates, $z(t) = r(t)\, e^{i\left(\omega t + \psi(t)\right)}$, where $\psi$ is the phase difference between stimulus and response. Eq. 3 becomes
\[
\dot{r} = \mu r - r^3 + F\cos\psi, \qquad
\dot{\psi} = \omega_0 - \omega - \beta r^2 - \frac{F}{r}\sin\psi.
\tag{17}
\]
We consider weak forcing and small detuning. We can then assume that the stimulus only weakly perturbs the amplitude of oscillation, $r = \sqrt{\mu} + \delta r$ with $\delta r \ll \sqrt{\mu}$. Plugging this in and assuming constant $\psi$, we find the steady-state amplitude,
\[
r \approx \sqrt{\mu} + \frac{F\cos\psi}{2\mu}.
\tag{18}
\]
Inserting this into the phase equation leads to
\[
\dot{\psi} = \omega_z - \omega - \frac{F}{\sqrt{\mu}}\left(\sin\psi + \beta\cos\psi\right),
\tag{19}
\]
where $\omega_z = \omega_0 - \beta\mu$ is the limit-cycle frequency in the absence of stimulus. This equation has fixed-point solutions, $\dot{\psi} = 0$, only when
\[
\left|\omega - \omega_z\right| \leq \frac{F}{\sqrt{\mu}}\sqrt{1 + \beta^2}.
\tag{20}
\]
We have thus found the condition for synchronization to the stimulus. Notice that any nonzero value of $\beta$ improves this detector's ability to mode-lock to the external signal. Using this equation, we can also calculate the bandwidth over which the detector will synchronize to the signal:
\[
\Delta\omega = \frac{2F}{\sqrt{\mu}}\sqrt{1 + \beta^2},
\tag{21}
\]
which increases with increasing $|\beta|$.
IX.3 Nonlinear Response Amplitude
Beginning with the deterministic, sinusoidally-driven Hopf oscillator,
\[
\frac{dz}{dt} = (\mu + i\omega_0)z - (1 + i\beta)|z|^2 z + F e^{i\omega t},
\tag{22}
\]
we calculate the steady-state response at the stimulus frequency by assuming $z(t) = r\, e^{i(\omega t + \psi)}$, where $\psi$ is the phase difference between the response and the stimulus. Plugging this into the differential equation yields,
\[
F e^{-i\psi} = \left[\left(r^2 - \mu\right) + i\left(\omega - \omega_0 + \beta r^2\right)\right] r.
\tag{23}
\]
We then multiply by the complex conjugate to find,
\[
F^2 = r^2\left[\left(r^2 - \mu\right)^2 + \left(\omega - \omega_0 + \beta r^2\right)^2\right].
\tag{24}
\]
In the limit of large forcing, we keep only the leading term in $r$ and find
\[
r \approx \left(\frac{F}{\sqrt{1 + \beta^2}}\right)^{1/3}.
\tag{25}
\]
IX.4 Optimal Regime (Broadband Detector)
We now consider a broadband detector, for which the threshold bandwidth, rather than the quality factor, is the relevant measure. The detector should be sensitive to signal modulations and able to detect transient signals composed of many frequencies. In this case, the information captured in signal modulations is valuable, as is the speed of response. We therefore choose a weight vector that emphasizes these measures. This detector is optimized in a manner similar to the narrowband detector, in that the ideal regime lies deep in the oscillatory regime. However, a higher degree of nonisochronicity, and therefore chaos, is preferable (Fig. 9).