

PRINTED FROM the OXFORD RESEARCH ENCYCLOPEDIA, NEUROSCIENCE (© Oxford University Press USA, 2016. All Rights Reserved). Personal use only; commercial use is strictly prohibited. For details, see the applicable Privacy Policy and Legal Notice.

date: 18 February 2018

Multisensory Integration and the Perception of Self-Motion

Summary and Keywords

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was a result of active self-motion through the world or was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Our current level of understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities is reviewed.

Keywords: multisensory, vestibular, proprioception, navigation, spatial orientation, computation, corollary discharge, efference copy, internal model, head direction cell


As we go about our everyday activities, it is vital that our brain computes robust estimates of both our orientation relative to gravity and our motion relative to the world. This computation depends on the integration of sensory cues as well as motor information in order to let us know whether we are moving through the world or if, instead, our visual surround is moving. The vestibular system encodes self-motion and spatial orientation by means of the five sensory organs within the vestibular apparatus of the inner ear: the utricle, saccule, and three semicircular canals. Specifically, the utricle and saccule detect gravity (providing a vertical reference) and linear movement, while the semicircular canals detect rotational movement. However, it is well appreciated that in everyday life, our perception of self-motion and sense of spatial orientation are based on the integration of vestibular and extra-vestibular cues, including proprioceptive and visual information. For example, if motion is passively generated (consider a passenger in a car accelerating after a traffic light turns green), the visual system provides retinal-image motion cues (optic flow) in addition to the linear acceleration signals provided by the otolith organs of the vestibular system. Moreover, during self-generated motion, the brain has access to other sources of information because extra-vestibular information is provided not only by the visual system, but also by the proprioceptive sensors of the muscles, tendons, and joints, which sense the relative position of neighboring parts of the body. Further, when motion is self-generated, information related to the motor command itself can be integrated with other existing information to contribute to the brain’s estimate of self-motion.

In darkness, humans can use vestibular otolith information to estimate passive linear displacement (Guedry & Harris, 1963; Israël & Berthoz, 1989; Mittelstaedt & Glasauer, 1991; Israël, André-Deshays, Charade, & Berthoz, 1993; Israël, Grasso, Georges-Francois, Tsuzuku, & Berthoz, 1997; Berthoz, Israël, Georges-Francois, Grasso, & Tsuzuku, 1995; Ivanenko & Grasso, 1997; Grasso, Ivanenko, & Lacquaniti, 1999) and vestibular canal information to estimate passive rotational motion (Valko, Lewis, Priesol, & Merfeld, 2012). Nonetheless, in the absence of actual self-motion (and thus the absence of vestibular input), visual image motion can provide compelling sensations of linear displacement (Gibson, 1950) and rotation (Benson, Kass, & Vogel, 1986; Benson, Hutt, & Brown, 1989; Grabherr, Nicoucar, Mast, & Merfeld, 2008; Soyka, Giordano, Beykirch, & Bülthoff, 2011; Valko et al., 2012). There are also many reasons to believe that the mechanism underlying the brain’s computation of self-motion and spatial orientation during voluntary movements additionally makes use of proprioceptive or motor-related signals. First, we are better at estimating both the distance we have traveled and the velocity of our motion during locomotion (i.e., active translation) than during comparable motion that is externally applied (Becker, Nasios, Raab, & Jürgens, 2002; Jürgens & Becker, 2006; Frissen, Campos, Souman, & Ernst, 2011). Second, while patients with vestibular sensory loss have poorer self-motion perception than normal subjects in a passive motion task, their performance during active motion improves significantly (Worchel, 1952; Glasauer, Amorim, Vitte, & Berthoz, 1994; Glasauer, Amorim, Viaud-Delmon, & Berthoz, 2002).

Our current understanding of how the brain integrates vestibular, proprioceptive, and motor-related signals to encode self-motion during everyday activities is reviewed. Recent findings have challenged the traditional view of the neural code used to represent sensory information in vestibular pathways. In addition, they have provided strong support for the view that vestibular pathways are inherently multimodal at the earliest stages of processing. Finally, recent studies have firmly established that the encoding of self-motion information differs when motion is self-generated versus externally applied. This progress is highlighted, and the implications of this self-motion coding strategy for ensuring accurate motor control and stability during self-motion are considered.

The Statistics of Natural Self-Motion and Coding by Early Vestibular Pathways

The Statistics of the Self-Motion Experienced in Everyday Life


Figure 1. Statistics of natural self-motion and early vestibular processing. (A) Representation of a subject wearing the MEMS module (gold box), and example linear acceleration signal during different everyday activities. (B) Population-averaged power spectra (black) of these signals with corresponding SDs (dark gray bands) and power law fits over the low-frequency (green line) versus high-frequency (blue line) ranges. The power law exponents (i.e., slopes) and the transition frequencies (i.e., the frequency at which the power law fits intersect) are also shown. (C, D) Gain and mutual information density for populations of irregular (blue) versus regular (green) vestibular afferents.

The prevailing view is that sensory systems are adapted to optimally process stimuli that are experienced in everyday life. The power spectra of naturally experienced visual and auditory stimuli decay as a power law (i.e., as 1/f^α) (for review, see Simoncelli & Olshausen, 2001). However, this is not the case for natural vestibular stimuli. Instead, the power of vestibular input experienced during everyday life decreases more slowly at lower frequencies (< 5 Hz in humans and < 10 Hz in monkeys) and more rapidly at higher frequencies for all motion dimensions (Carriot, Jamali, Chacron, & Cullen, 2014; Figure 1A, B). This unique stimulus structure is the result of active motion, as well as passive biomechanical filtering occurring before any neural processing. If the view that sensory systems have developed coding strategies that are constrained by the statistics of natural stimuli is correct, this raises the question: how do the neural coding strategies used by the vestibular system relate to the statistics of the self-motion experienced in everyday life?
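The two-regime spectral structure described above can be quantified by fitting separate power laws (P(f) ∝ 1/f^α) to the low- and high-frequency ranges, as in Figure 1B. A minimal sketch of such a fit is shown below on a synthetic spectrum; the 5 Hz transition frequency follows the human data cited above, but the exponents and frequency ranges are illustrative assumptions rather than the measured values.

```python
import numpy as np

def fit_power_law(freqs, power):
    """Fit log(power) = -alpha * log(f) + c by least squares; return alpha."""
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
    return -slope

# Synthetic spectrum: shallow decay below the ~5 Hz transition
# frequency, steep decay above it (exponents are illustrative).
f_low = np.linspace(0.5, 5.0, 50)
f_high = np.linspace(5.0, 20.0, 50)
p_low = f_low ** -0.5                    # slow decay at low frequencies
p_high = (5.0 ** 1.5) * f_high ** -2.0   # fast decay, continuous at 5 Hz

alpha_low = fit_power_law(f_low, p_low)
alpha_high = fit_power_law(f_high, p_high)
```

The fitted exponent is much larger above the transition frequency than below it, which is the signature that distinguishes natural vestibular spectra from the single power-law decay of natural visual and auditory stimuli.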

Regular and Irregular Canal Afferents Effectively Comprise Two Parallel Information Channels

Self-motion information is first detected by the receptor cells (i.e., type I and II hair cells) of the semicircular canals and otoliths and then carried by the vestibular-nerve afferents to the brain. Over the range of frequencies typically experienced during everyday behaviors (i.e., up to 20 Hz) (Armand & Minor, 2001; Huterer & Cullen, 2002), canal afferents encode head velocity, whereas otolith afferents encode linear acceleration (Angelaki & Cullen, 2008; Goldberg, 2000).

Vestibular afferents from both the semicircular canals and otoliths can each be divided into two distinct classes on the basis of the regularity of their resting discharge (Goldberg, Smith, & Fernandez, 1984). Notably, units displaying low variability in their interspike intervals are accordingly termed regular afferents, while those with higher variability are termed irregular afferents. These two classes of afferents further differ in their morphological properties. Regular afferents, projecting from both the otoliths and semicircular canals, have relatively small axon diameters and provide bouton endings to type II hair cells located at the periphery of the vestibular neuroepithelium. In contrast, irregular units preferentially innervate type I hair cells via calyx endings. The results of recent studies have revealed that this intrinsic variability across vestibular-nerve afferents plays an important role in the strategy used to encode self-motion, with important implications for the functional roles of regular versus irregular afferents.

The Encoding of Rotational Motion by Vestibular Afferents

To date, most analyses of the information coded by vestibular afferents have focused on those afferents originating in the semicircular canals—most notably the horizontal semicircular canals. Quantification of individual afferent responses to sinusoidal rotations revealed important differences in the dynamics of regular versus irregular afferent activity. Notably, irregular afferents are more sensitive to rotation than regular afferents over the physiological frequency range of natural head movements (Goldberg, 2000; Haque et al., 2004; Hullar et al., 2005; Ramachandran & Lisberger, 2006; Sadeghi et al., 2006; Sadeghi, Chacron, Taylor, & Cullen, 2007; Sadeghi, Minor, & Cullen, 2007; Figure 1C). For example, irregular afferents are twofold more sensitive to head motion at 15 Hz than regular afferents (Hullar et al., 2005; Ramachandran & Lisberger, 2006; Sadeghi, Minor, & Cullen, 2006; Sadeghi, Chacron, et al., 2007; Sadeghi, Minor, & Cullen, 2007). This raises the question: why do we have regular vestibular afferents? The results of recent experiments using information theoretic measures (Sadeghi et al., 2006; Massot, Chacron, & Cullen, 2011) to study canal afferents have provided an answer. On average, regular afferents transmit twofold more information about head motion than irregular afferents over the physiological frequency range (Figure 1D). Consistent with this finding, regular afferents are also twice as sensitive in detecting head motion as irregular afferents (detection thresholds of approximately 4 versus 8 deg/s). Thus, regular and irregular canal afferents effectively comprise two parallel information channels (Figure 2A)—one that encodes high-frequency/intensity transient stimuli with higher gains (i.e., irregular afferents), the other that transmits information about the detailed time course of the stimulus (i.e., regular afferents).
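The counterintuitive pairing of higher response gain with poorer detection thresholds in irregular afferents can be illustrated with a simple signal-detection sketch: a stimulus is detectable once the gain-scaled response exceeds the trial-to-trial variability by a fixed criterion. The gain and noise values below are illustrative choices that reproduce the approximate 4 versus 8 deg/s thresholds quoted above; they are not measured afferent parameters.

```python
def detection_threshold(gain, noise_sd, criterion=1.0):
    """Smallest stimulus s for which gain * s / noise_sd >= criterion."""
    return criterion * noise_sd / gain

# Regular afferents: lower gain, but much lower variability.
regular_threshold = detection_threshold(gain=1.0, noise_sd=4.0)
# Irregular afferents: twofold higher gain, but far higher variability.
irregular_threshold = detection_threshold(gain=2.0, noise_sd=16.0)
```

Under these assumptions the regular channel detects motion at 4 deg/s and the irregular channel only at 8 deg/s, despite the latter's higher gain, because the threshold depends on the ratio of gain to variability rather than on gain alone.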

The results of a recent study of the natural vestibular signals experienced by monkeys provide further insight into the functional roles of these two parallel information channels in the vestibular periphery (Schneider, Cullen, & Chacron, 2015). During high-intensity activities like running, jumping, and climbing, head motion reaches large intensities in all six motion dimensions, similar to what has been reported in humans (Carriot et al., 2014). Well-established linear models of early vestibular processing (e.g., see review by Goldberg, 2000) cannot predict semicircular canal or otolith afferent responses to these stimuli; instead, linear–nonlinear cascade models are required (Massot, Schneider, Chacron, & Cullen, 2012). Furthermore, irregular otolith and semicircular canal afferents, due to their higher sensitivities, are better optimized to process such natural stimuli (Schneider et al., 2015). In contrast, regular afferents are likely better suited to estimate the detailed time course of lower-intensity stimuli (Goldberg, 2000; Sadeghi, Minor, et al., 2007; Massot, Chacron, & Cullen, 2011). Interestingly, such parallel processing of sensory information via two distinct channels is a common strategy in early sensory pathways, including the auditory (Takahashi, Moiseff, & Konishi, 1984; Oertel, 1999; Gelfand, 2004), visual (Marr, 1982; Livingstone & Hubel, 1987; Merigan & Maunsell, 1993), and electrosensory (Bell & Maler, 2005; Kawasaki, 2005) systems.
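The linear–nonlinear cascade models referred to above pass the stimulus through a linear filter and then a static nonlinearity. The sketch below uses a simple rectification at zero firing rate to show one effect such models capture that purely linear models cannot: cut-off during intense inhibitory stimulation. The filter kernel, resting rate, and stimulus are illustrative assumptions, not the fitted model of Massot et al. (2012).

```python
import numpy as np

def ln_cascade(stimulus, kernel, resting_rate=40.0):
    """Linear filtering stage followed by a static rectifying nonlinearity."""
    drive = np.convolve(stimulus, kernel, mode="same")  # linear stage
    return np.maximum(resting_rate + drive, 0.0)        # firing rates cannot be negative

# An intense inhibitory stimulus drives the linear prediction below zero,
# whereas the cascade correctly predicts cut-off at zero firing rate.
stim = -100.0 * np.ones(20)
kernel = np.array([1.0])
rate = ln_cascade(stim, kernel)
linear_prediction = resting = 40.0 + np.convolve(stim, kernel, mode="same")
```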

The Representation of Rotational Self-Motion at the First Central Stage of Vestibular Processing


Figure 2. Early vestibular processing and the sensory coding of self-motion: afferents versus central neurons. (A) Vestibular afferents project to the vestibular nuclei, which in turn comprise three main classes of neurons: (i) PVPs and FTNs, which control and modulate the vestibulo-ocular reflex to ensure gaze stability during everyday life, and (ii) VO neurons that control posture and balance, and also project to higher order structures involved in the estimation of self-motion. (B) Comparison of average neural detection threshold values of VO neurons (gray), with those of irregular (blue) and regular (green) afferents. Superimposed are human behavioral thresholds (red; Grabherr et al., 2008), and an estimate of the information transmitted by a population of 12 VO neurons (black). Side bands show +/− SEM. Data replotted from Sadeghi, Chacron, et al. (2007). (C) Population-averaged performance for irregular (blue) and regular (green) afferents versus central vestibular neurons (gray) as a function of frequency (top). The shaded bands show the standard error. (Insets) Average mutual information transmitted for each population by precise spike timing (top) versus firing rate (bottom). Data replotted from Jamali, Chacron, and Cullen (2016).

How do the two channels of afferent input combine at the next stage of vestibular processing, namely the vestibular nuclei? A distinct population of neurons in the vestibular nuclei—termed Vestibular-Only (VO) and alternatively called non–eye movement neurons—constitutes a critical relay for neural computations that underlie the perception of self-motion. VO neurons receive direct afferent input, are reciprocally interconnected with the nodulus/uvula of the cerebellum (Reisine & Raphan, 1992), and also likely project to the vestibular-sensitive neurons in thalamus and cortex (Grüsser, Pause, & Schreiter, 1990a, 1990b; Lang, Büttner-Ennever, & Büttner, 1979). Accordingly, these neurons are a key stage of vestibular processing in the pathways that mediate higher-order vestibular functions such as the perception of self-motion and spatial orientation (Figure 2A).

Recent studies have provided insight into how self-motion information is encoded by the VO neurons of the vestibular nuclei over the physiologically relevant frequency range (Massot et al., 2011). Although their response gains are generally greater than those of individual afferents, VO neurons actually transmit less information than even irregular afferents. Consistent with this latter finding, the neural detection thresholds of VO neurons are significantly worse than those of irregular as well as regular afferents (Figure 2B). Indeed, in order to approach the detection thresholds measured in behavioral experiments (~2.5 vs. 0.5–1°/s; Massot et al., 2011), the responses of a population of > 20 VO neurons must be pooled. The apparent discrepancy between the precision of coding at sequential stages of vestibular processing and the brain’s ability to estimate self-motion suggests that vestibular processing has adapted to extract particular features from self-motion.
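The pooling argument above follows from the statistics of averaging: combining the responses of N independent, equally informative neurons reduces the effective noise, and hence the detection threshold, by a factor of √N. A toy calculation under that independence assumption, starting from the ~2.5 deg/s single-neuron threshold cited above:

```python
import math

def pooled_threshold(single_neuron_threshold, n_neurons):
    """Threshold after averaging n independent, equally noisy neurons."""
    return single_neuron_threshold / math.sqrt(n_neurons)

single_vo = 2.5                           # approximate single VO neuron threshold (deg/s)
pooled = pooled_threshold(single_vo, 20)  # pooling ~20 neurons
```

Pooling roughly 20 VO neurons brings the predicted threshold to about 0.56 deg/s, within the 0.5–1 deg/s behavioral range; because real neurons share correlated noise, this idealized calculation gives a lower bound on the population size actually required.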

Consistent with this idea, VO neurons demonstrate two important nonlinear behaviors. First, VO neurons do not simply sum their afferent input, due to a nonlinearity in their input-output relationship (Massot et al., 2011). This coding feature, termed a static boosting nonlinearity, is potentially beneficial in that it extends the coding range of central vestibular neurons such that they are less likely to demonstrate cut-off or saturation. As a result of this same nonlinearity, VO neurons preferentially extract the high-frequency features of self-motion when these are embedded within low-frequency motion during natural movements. Second, a recent report demonstrated that VO neurons can also reliably discriminate between different stimulus waveforms through differential patterns of precise spike timing (i.e., a temporal code; Jamali, Chacron, & Cullen, 2016; Figure 2C). Temporal coding on the same timescale was reported for irregular but not regular afferents. While it remains to be seen whether this nonlinear signal is decoded by higher areas, it seems likely that vestibular pathways use spike timing precision as well as firing rate to represent self-motion, given that VO firing rate responses appear less sensitive than the organism as a whole. Thus, taken together, these recent findings challenge the traditional notion that the vestibular system uses a linear rate code to transmit information, and have important consequences for understanding how the representation of sensory information changes across sensory pathways.

The Encoding of Translational Motion

At the level of the vestibular periphery, translation appears to be encoded using a different strategy than rotation. In particular, irregular otolith afferents are exceedingly sensitive to self-motion when compared to regular otolith afferents. Indeed, the sensitivity of irregular otolith afferents is so high that it effectively compensates for their higher trial-to-trial variability, resulting in detection thresholds that are comparable to those of regular afferents (Jamali, Carriot, Chacron, & Cullen, 2013). It remains to be established why the canal and otolith systems have evolved two distinct strategies for encoding self-motion. Nevertheless, a key difference is that the otoliths, unlike the semicircular canals, also sense the constant force of gravity. Accordingly, it is likely that the neural coding strategy used by regular afferents to represent translational motion is particularly well-adapted to encode small changes in the head’s orientation relative to this critical reference. Further studies focusing on the central processing of otolith information experienced in everyday life, including dynamic stimuli that produce nonlinear responses, as well as persistent stimulation due to gravity, are needed to further our understanding of the functional implications of this difference.

Natural Self-Motion Is Multidimensional

The natural self-motion experienced in everyday life is multidimensional and simultaneously stimulates both the semicircular canals and otoliths. Surprisingly, however, relatively few studies have addressed the question of how the brain integrates semicircular canal and otolith information during self-motion that stimulates both types of sensory organs. It has been shown that the responses of VO neurons during passively applied combined motion cannot be predicted by the linear addition of their responses to the rotational and translational components (Dickman & Angelaki, 2002; Carriot, Jamali, Brooks, & Cullen, 2015). Instead, VO neurons sub-additively integrate their semicircular canal and otolith afferent inputs (Carriot et al., 2015). Specifically, neuronal responses reflect the weighted sum of their responses to rotation and translation. Moreover, this weighting is frequency dependent, with canal inputs more heavily weighted at low frequencies, and otolith inputs more heavily weighted at higher frequencies. Interestingly, prior psychophysical studies of self-motion perception in humans have shown that subjects more accurately perceive angular than linear displacement at lower frequencies (Ivanenko, Grasso, Israël, & Berthoz, 1997; MacNeilage, Turner, & Angelaki, 2010). Thus, the responses of VO neurons provide a neural correlate for this perceptual bias, and further suggest that subjects perceive linear motion more accurately than rotational motion at higher frequencies of self-motion.
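The sub-additive, frequency-dependent integration described above can be sketched as a weighted sum whose weights shift from canal-dominated at low frequencies to otolith-dominated at high frequencies. The weighting function and crossover frequency below are illustrative assumptions, not the weights fitted to neuronal data by Carriot et al. (2015).

```python
def canal_weight(freq_hz, f0=2.0):
    """Illustrative weight on the canal (rotation) input; f0 is an assumed crossover frequency."""
    return f0 / (freq_hz + f0)

def combined_response(rotation_resp, translation_resp, freq_hz):
    """Weighted (sub-additive) sum of the rotation and translation responses."""
    w = canal_weight(freq_hz)
    return w * rotation_resp + (1.0 - w) * translation_resp

low = combined_response(10.0, 10.0, freq_hz=0.5)    # canal-dominated regime
high = combined_response(10.0, 10.0, freq_hz=16.0)  # otolith-dominated regime
```

Because the weights sum to one rather than adding, the combined response always falls below the linear sum of the two component responses, reproducing the sub-additivity observed in VO neurons.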

Multimodal Integration Within Subcortical Vestibular Pathways


Figure 3. The vestibular nucleus is a site of rich multisensory and motor convergence. In addition to receiving direct input from vestibular afferents of the VIII nerve, the vestibular nuclei receive inputs from numerous regions including (i) proprioceptive pathways, (ii) the vestibular cerebellum, (iii) oculomotor areas of the brainstem, and (iv) several areas of the cortex (e.g., parietoinsular vestibular cortex [PIVC], premotor areas 6, 6pa, somatosensory area 3a, and superior temporal cortex).

As we go about our daily activities, self-motion information is provided not only by the vestibular sensors, but also via inputs from other sensory modalities including the somatosensory, proprioceptive, and visual systems. In addition, during active behaviors, efference copies of the motor signals that produce movement must also be integrated so that the brain can discriminate self-generated versus passively-applied motion—a distinction that is vital for ensuring accurate postural control, as well as perceptual stability. Indeed, recent studies have emphasized that a characteristic property of the vestibular system is that it combines multimodal sensory and motor information early in processing. The schematic in Figure 3 shows the rich convergence of extra-vestibular sensory inputs as well as premotor signals related to the generation of eye and head movements that are relayed to early vestibular pathways.

Integration of Vestibular and Proprioceptive/Somatosensory Inputs

During everyday life, much of our self-motion is generated as the result of our own voluntary behavior. For example, when we walk across a room or turn to talk to someone, our head moves relative to space, activating the vestibular sensors. However, these behaviors also activate proprioceptors within the muscles that actually generate them, as well as somatosensory receptors in the skin surrounding the moving joints (Lackner & DiZio, 2005; Carriot, Cian, Paillard, Denise, & Lackner, 2011). These extra-vestibular inputs reach the vestibular nuclei and deep cerebellar nuclei by means of direct and indirect projections from the dorsal-root axons of the spinal cord (reviewed in Cullen & Roy, 2004). Areas of the cerebellum and cortex that are sensitive to proprioceptive and somatosensory inputs also send direct projections to the vestibular nuclei (reviewed in Manzoni, 2007; Guldin & Grüsser, 1998; Wilson et al., 1999). Thus, at the first central stage of processing, the connectivity of the vestibular system strongly supports the multimodal integration of vestibular and extra-vestibular cues.

Historically, most studies that have focused on the integration of vestibular with proprioceptive or somatosensory inputs have employed decerebrate or anesthetized preparations. These investigations were classically performed in cats, and quantified the influence of passive stimulation of neck muscle proprioceptors on the activity of neurons in the vestibular and deep cerebellar nuclei (see Goldberg & Cullen, 2011; Wilson & Schor, 1999, for reviews). Interestingly, under such conditions, neuronal sensitivities to proprioceptive versus vestibular stimulation can be either antagonistic or agonistic. As a result, when both systems are activated concurrently by passive motion of the head-on-body, neurons show reduced and augmented modulation, respectively, as compared to their responses to vestibular stimulation alone (Anastasopoulos & Mergner, 1982; Boyle & Pompeiano, 1981). Moreover, passive stimulation of limb proprioceptors also influences the activity of a substantive proportion of neurons in the vestibular nuclei and deep cerebellar nuclei (Arshian et al., 2014; McCall, Miller, Catanzaro, Cotter, & Yates, 2015), providing a substrate for the integration of vestibular and limb proprioceptive signals that is required for the accurate control of posture and balance.

Recently, there has been an increased focus on investigating the integration of vestibular and proprioceptive signals in alert animals (Brooks & Cullen, 2009; Kasper, Schor, & Wilson, 1988; Luan, Gdowski, Newlands, & Gdowski, 2013; Roy & Cullen, 2004, 2001), rather than in decerebrate or anesthetized preparations. To date, one main finding is that the strategy underlying the integration of vestibular and proprioceptive signals is species dependent. The passive stimulation of neck proprioceptors influences the responses of vestibular nuclei neurons in mice (Medrea & Cullen, 2013), as well as in the squirrel monkey (Gdowski & McCrea, 2000) and cynomolgus monkey (Sadeghi, Mitchell, & Cullen, 2009). Additionally, vestibular nuclei neurons—at least in the alert cat—respond to passive stimulation of limb as well as neck proprioceptors (McCall, Miller, DeMayo, Bourdages, & Yates, 2016). Interestingly, in the alert squirrel monkey and cynomolgus monkey, this multisensory integration is usually antagonistic. Thus, when the head is moved relative to the body (i.e., during a passive turn of the head on the body), neurons generally fire less robustly than for comparable head motion produced by whole-body motion (i.e., a condition in which only the vestibular system is stimulated).

In contrast, passive stimulation of proprioceptors in the normal alert rhesus monkey does not directly alter the responses of VO neurons in the vestibular nuclei (Roy & Cullen, 2001; Carriot, Brooks, & Cullen, 2013). Responses to proprioceptive inputs are only seen following vestibular loss, suggesting a form of homeostatic plasticity that compensates for the reduced reliability of the vestibular input (Sadeghi, Minor, & Cullen, 2010, 2011, 2012). Indeed, rhesus monkeys (and presumably humans) integrate vestibular and proprioceptive signals at the level of the vestibular nuclei only during active head movements. The sophisticated mechanism underlying this integration allows the distinction between self-generated and externally applied self-motion, and is a strategy also utilized by other species.

On the other hand, stimulation of proprioceptors in the normal alert rhesus monkey alters the responses of neurons at the next levels of vestibular processing (i.e., the rostral fastigial nucleus of the cerebellum and the vestibular thalamus). The rostral fastigial nucleus is the most medial of the deep cerebellar nuclei, and is reciprocally connected with the vestibular nuclei (Furuya, Kawano, & Shimazu, 1975; Shimazu & Smith, 1971). Indeed, approximately half of the neurons within the rostral fastigial nucleus encode proprioceptive inputs in the normal alert rhesus monkey. These neurons, termed bimodal neurons, display comparable (but antagonistic) modulation in response to passive vestibular and proprioceptive stimulation (Brooks & Cullen, 2009). Because the responses of bimodal neurons sum linearly during combined stimulation, they robustly encode passive “body-in-space motion.” Interestingly, the other main population of neurons within the rhesus rostral fastigial nucleus (termed unimodal neurons) is sensitive to vestibular but insensitive to proprioceptive stimulation. As a result, these neurons encode passive “head-in-space motion,” much like the VO neurons of the vestibular nuclei.
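The linear summation of comparable but antagonistic inputs described above can be sketched as follows; the unit sensitivities are illustrative. Because body-in-space motion equals head-in-space motion minus head-on-body motion, a neuron with equal and opposite vestibular and neck proprioceptive sensitivities automatically encodes body motion:

```python
def bimodal_response(head_in_space_vel, head_on_body_vel,
                     s_vestibular=0.5, s_proprioceptive=-0.5):
    """Linear sum of antagonistic vestibular and neck proprioceptive inputs."""
    return s_vestibular * head_in_space_vel + s_proprioceptive * head_on_body_vel

# Head turn on a stationary body: the two inputs cancel (the body is still).
cancelled = bimodal_response(head_in_space_vel=10.0, head_on_body_vel=10.0)
# Whole-body rotation: only the vestibular input modulates (the body moves).
whole_body = bimodal_response(head_in_space_vel=10.0, head_on_body_vel=0.0)
```

A unimodal neuron corresponds to setting the proprioceptive sensitivity to zero, in which case the output simply tracks head-in-space motion.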

Figure 4A compares the responses of an example unimodal and bimodal neuron in the rostral fastigial nucleus of the alert rhesus monkey. This integration of vestibular and proprioceptive information at the level of the cerebellum is essential for the accurate control of posture and balance. For example, the corrective movements produced by vestibulospinal reflexes must account for changes in the position of the head relative to the body (Kennedy & Inglis, 2002; Tokita, Ito, & Takagi, 1989; Tokita, Miyata, Takagi, & Ito, 1991). Accordingly, patients with lesions of the vestibular cerebellum show deficits in the production of the compensatory postural responses that are normally evoked by galvanic stimulation (Kammermeier, Kleine, & Büttner, 2009). In addition, such early-stage sensory convergence in the vestibular nuclei and cerebellum is also likely vital for generating the perceptual constancy required for higher-order functions such as self-motion perception (Miller et al., 2008), including the ability to perceive body motion independently of head motion (Mergner, Siebold, Schweigart, & Becker, 1991).


Figure 4. In response to passive stimulation, proprioceptive and visual information is encoded at the level of the vestibular cerebellum but not within the vestibular nuclei in the rhesus monkey. (A) Approximately half of the neurons within the rostral fastigial nucleus respond to neck proprioceptive (center) as well as vestibular (left) stimulation (i.e., bimodal neurons). For these neurons, when the head moves relative to the body (as it would during a voluntary orienting head turn), vestibular and neck proprioceptive inputs sum to produce complete response cancellation—consistent with these neurons encoding body motion. The other ~half of the neurons within the rostral fastigial nucleus respond exclusively to vestibular (left) stimulation, consistent with these neurons encoding head motion. (B) VO neurons in the vestibular nuclei are unresponsive to full-field visual motion that induces a robust optokinetic eye movement response. Abbreviations: Ep = induced change in eye position, Vs = change in visual surround position, and FR = constant firing rate.

Ascending projections from both the vestibular nuclei and the rostral fastigial nucleus target the vestibular thalamus (reviewed in Lopez & Blanke, 2011). In this context, there is evidence for the integration of vestibular and proprioceptive information in the thalamic ventral posterior lateral nucleus in both squirrel and rhesus monkeys. Similar to what was described for the rostral fastigial nucleus, a substantial percentage of vestibular-sensitive neurons in this region also respond robustly to passive stimulation of neck proprioceptors (Marlinski & McCrea, 2008a, 2008b; Dale & Cullen, 2015). In rhesus monkeys, this multisensory integration is antagonistic (again similar to what is seen in the cerebellum), and as a result, neurons fire less robustly for passive motion of the head relative to the body than for comparable head motion produced by whole-body motion.

Regions of the vestibular thalamus, including the ventral posterior lateral thalamus, comprise an important relay by which vestibular and proprioceptive information reach numerous regions of cortex (for review, see Shinder & Taube, 2010; Hitier, Besnard, & Smith, 2014; Brandt & Dieterich, 2015). Accordingly, the self-motion information encoded by these thalamic neurons is likely vital for the construction of our perceptual estimate of head and body motion relative to space. It is thus noteworthy that combined stimulation of the vestibular and proprioceptive systems, similar to that which would occur during natural, active self-motion, actually continues to produce a reduction in overall neuronal modulation.

Integration of Vestibular and Visual Inputs

Motion of the visual world across the retina, commonly termed optic flow, is also an important sensory cue during self-motion. Indeed, optic flow stimuli are capable of generating powerful sensations of motion even when a subject is motionless. Optic flow stimuli also induce reflexive eye movements that function to help keep the visual world stable on the retina. The resultant eye movements are called the optokinetic response (OKR), and during actual self-motion these eye movements work together with the vestibulo-ocular reflex to ensure stable gaze at lower frequencies.

For many decades, the prevailing view was that neurons in the vestibular nuclei are driven by large-field visual input as well as vestibular stimulation (Reisine & Raphan, 1992; Waespe & Henn, 1977a). Indeed, eye movement sensitive neurons in the vestibular nuclei that drive the vestibulo-ocular reflex to stabilize gaze during head motion are also an important component of the pathways that drive the OKR (goldfish, Dichgans, Bizzi, Morasso, & Tagliasco, 1973; cat, Keller & Precht, 1979; rat, Lannou, Cazin, Precht, & Toupet, 1982; monkey, Waespe & Henn, 1977a). In contrast, the neurons in the vestibular nuclei most relevant to understanding the neural computations that underlie the perception of self-motion are the VO neurons, which are not sensitive to eye movements. While it was initially thought that VO neurons of the vestibular nuclei are also driven by optokinetic as well as vestibular stimulation (Boyle, Büttner, & Markert, 1985; Buettner & Büttner, 1979; Reisine & Raphan, 1992; Waespe & Henn, 1977a, 1977b), the extent of visual-vestibular convergence is not as prevalent as initially assumed (Figure 4B; Beraneck & Cullen, 2007; Bryan & Angelaki, 2009).

Consequently, while the idea that the vestibular nuclei neurons that contribute to self-motion pathways are driven by large-field visual input is theoretically appealing, recent studies show that the convergence of visual and vestibular inputs at this first central level of vestibular processing does not provide a robust neural substrate by which the brain integrates these two sensory signals to estimate self-motion. Instead, the brain integrates full-field visual and vestibular inputs for higher-level functions, such as the computation and perception of self-motion, at the level of the cerebellum and cortex.

Cerebellar Mechanisms: Internal Models of Self-Motion and Online Calibration

Early Vestibular Processing Can Discriminate Active From Passive Self-Motion

During everyday activities, our vestibular systems are most often activated as a result of our own actively generated self-motion. However, neurophysiological experiments to date have generally focused on the coding of self-motion that is passively applied. The ability to discriminate self-generated versus passively applied motion is vital for ensuring accurate postural control, as well as perceptual stability.


Figure 5. Neural mechanism for the attenuation of vestibular reafference. In order to make an active head movement, the brain must send a motor command to the neck muscles. The resultant head movement in turn produces stimulation of the vestibular system—termed vestibular reafference. While vestibular afferents encode this information, vestibular reafference is canceled at the first central stage of processing in the vestibular nuclei (blue traces), whereas head motion produced by passive stimulation is robustly encoded (green traces). Specifically, evidence to date is consistent with a mechanism in which a cancellation signal is sent to VO neurons in the vestibular nuclei when neck proprioceptive feedback matches the expected sensory consequence of the neck motor command (red shaded box).

Recent studies in rhesus monkeys have revealed that a particularly sophisticated mechanism underlies the integration of vestibular and proprioceptive signals at the level of the vestibular nuclei during self-generated (i.e., active) self-motion. Notably, the responses of VO neurons are markedly attenuated during active head motion as compared to passive head motion (rotations: Roy & Cullen, 2001, 2004; and translations: Carriot, Brooks, & Cullen, 2013; Carriot et al., 2015). In contrast, the afferent input from the vestibular nerve to VO neurons remains robust (and equivalent) regardless of whether self-motion is self-generated or externally applied (Figure 5; Cullen & Minor, 2002; Sadeghi, Chacron, et al., 2007; Jamali, Sadeghi, & Cullen, 2009). The mechanism that cancels afferent input due to active self-motion allows the distinction between self-generated and externally applied self-motion early in vestibular processing. Notably, this mechanism is common across species including mice (Medrea & Cullen, 2013), squirrel monkeys (Gdowski & McCrea, 2000), and macaque monkeys (rhesus: Roy & Cullen, 2001; fascicularis: Sadeghi, Mitchell, & Cullen, 2009).

The cancellation of vestibular input resulting from active self-motion is mediated by a mechanism that compares the expected consequences of self-generated movement (computed by an internal model located in the cerebellum) and the actual sensory feedback (reviewed in Cullen, 2011, 2012). Key insights into this mechanism have been obtained through studies in which the relationship between intended and actual self-motion was experimentally controlled (Roy & Cullen, 2001, 2004; Brooks & Cullen, 2014; Brooks, Carriot, & Cullen, 2015). Specifically, a cancellation signal is generated only in conditions where the activation of proprioceptors matches the motor-generated expectation (i.e., an internal model of the brain’s expectation of the sensory consequences of the motor behavior). As a result, this cancellation mechanism ensures the selective encoding of passive self-motion at the earliest central stage of vestibular processing when both occur simultaneously. Importantly, however, in the specific condition where simultaneous active motion (termed reafference) and passive motion (termed exafference) result in activation of the same muscle proprioceptors, neurons robustly encoded the total vestibular input (i.e., responses to vestibular reafference and exafference were equally strong), rather than exafference alone (Brooks & Cullen, 2014). This is because cancellation of vestibular reafference in early vestibular processing requires an explicit match between expected and actual proprioceptive feedback.
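A minimal sketch of this gated cancellation, assuming a simple linear rate model (the gain, tolerance, and signal names are illustrative choices, not measured values): the reafferent component is subtracted only when actual proprioceptive feedback matches the motor-based prediction.

```python
def vo_response(total_head_velocity, active_component,
                predicted_proprioception, actual_proprioception,
                gain=0.4, tol=1e-6):
    """Toy VO-neuron response with gated subtraction of reafference."""
    if abs(actual_proprioception - predicted_proprioception) < tol:
        # Proprioceptive feedback matches the motor-based prediction: the
        # self-generated (reafferent) component is canceled, leaving only
        # passively applied (exafferent) motion.
        return gain * (total_head_velocity - active_component)
    # Mismatch: no cancellation signal; the total vestibular input is encoded.
    return gain * total_head_velocity

# Purely active head turn: the response is canceled.
print(vo_response(30.0, 30.0, 30.0, 30.0))   # 0.0
# Purely passive rotation: robustly encoded.
print(vo_response(20.0, 0.0, 0.0, 0.0))      # 8.0
# Active and passive motion driving the same neck proprioceptors: feedback
# (50) exceeds the prediction (30), the match fails, and the total input is
# encoded rather than exafference alone.
print(vo_response(50.0, 30.0, 30.0, 50.0))   # 20.0
```

The third case mirrors the experimental condition of Brooks and Cullen (2014) described above, in which responses to combined reafference and exafference were equally strong.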

Cerebellar Mechanisms: Internal Models of Active Self-Motion Allow the Preferential Encoding of Passive Self-Motion

As reviewed, the suppression of active motion occurs at the level of the vestibular nuclei. While both canal (Cullen & Minor, 2002; Sadeghi, Chacron, et al., 2007) and otolith (Jamali et al., 2009) afferents similarly encode active and passive rotations and translations, respectively, their target neurons in the vestibular nuclei—VO neurons—selectively respond to passive self-motion. This raises the question: What is the source of the cancellation signal that suppresses the vestibular-nerve input to vestibular-only neurons during active self-motion? To date, the available evidence suggests that the suppression of actively generated vestibular responses is mediated by a mechanism that (i) compares the actual activation of proprioceptors with a motor-generated expectation (i.e., an internal model of the expected proprioceptor activation; for review, see Cullen, 2011, 2012) and (ii) involves the cerebellum (Brooks & Cullen, 2013; Brooks et al., 2015).

It has long been known that the vestibular nuclei, deep cerebellar nuclei, and anterior vermis of the cerebellum are reciprocally interconnected (Batton, Jayaraman, Ruggiero, & Carpenter, 1977). In response to passively applied stimulation, these regions of the cerebellum also integrate vestibular and proprioceptive-related signals. For example, as reviewed, neurons in the most medial of the deep cerebellar nuclei (i.e., the rostral fastigial nucleus) integrate these two inputs such that they encode externally applied head and body-in-space motion in two distinct streams (Brooks & Cullen, 2009). In addition, single neurons in the anterior region of the cerebellar vermis (lobules I–V), which sends strong descending projections to the rostral fastigial nucleus, also integrate these two sensory inputs (Manzoni, Pompeiano, & Andre, 1998a, 1998b; Manzoni, Pompeiano, Bruschini, & Andre, 1999; Manzoni, Andre, & Bruschini, 2004).


Figure 6. The sensitivities of cerebellar neurons dynamically track the comparison of predictive and feedback signals. (A) Single unit recordings were made in the rostral fastigial nucleus, and the relationship between the motor command and resultant movement was altered during active head movements. (B) Head movements (top row) and activity of an example rostral fastigial neuron (middle row) during a motor learning task. Head movements and neuronal responses were first recorded before learning in passive and active conditions. Second, a load was applied and held constant for the learning phase. Third, after the learning phase, the motor was randomly turned off for single “catch” trials. Bottom row: Neurons’ normalized sensitivity to head movement during different phases of the learning sequence. During the learning phase, neuronal sensitivity gradually decreased from that measured during passive head motion to the suppressed response observed during active motion. Note that neuronal sensitivity during catch trials is comparable to the neuronal sensitivity during early learning and passive head movements.

During voluntary self-motion, however, individual rostral fastigial nucleus neurons encode an explicit and selective representation of unexpected self-motion (Brooks & Cullen, 2013). Specifically, neurons with vestibular or proprioceptive responses to passive externally applied head and body movements are unresponsive when the same motion is self-generated. Experiments, in which the relationship between the motor command and resultant movement was altered, have provided further insight into the underlying mechanism. In particular, trial-by-trial analyses have established that neuronal sensitivities dynamically track the comparison of predictive and feedback signals (Figure 6; Brooks et al., 2015), indicating that the output of the cerebellum reflects a fast and elegant computation in which the brain’s internal model of the sensory consequences of active self-motion is rapidly updated.

The rostral fastigial nucleus is a critical component of the descending pathway controlling postural reflexes and orienting behaviors; it projects to brainstem structures that control these behaviors, including vestibular-only neurons of the vestibular nuclei and the spinal cord. In addition, the rostral fastigial and vestibular nuclei send important ascending projections to the vestibular thalamus (reviewed in Lopez & Blanke, 2011) and thus also play a vital role in carrying information required to compute our perception of self-motion and spatial orientation during everyday activities. The implications of these findings, namely that vestibular responses to active motion are canceled early in vestibular processing, are considered further below.

Cerebellar Mechanisms: Internal Models of Tilt Versus Translation During Self-Motion


Figure 7. Cerebellar regions that receive vestibular input. Four main cerebellar regions that receive vestibular input include the (i) lobules I–V of the anterior lobe (green) and deep cerebellar nucleus (light red), (ii) nodulus and ventral uvula (lobules IX–X, blue), (iii) flocculus and ventral paraflocculus (yellow), and (iv) oculomotor vermis of posterior lobe (lobules VI–VII, purple). Flocc., flocculus; Lob p.m., paramedian lobule; Nod., nodulus; Paraflocc., paraflocculus; S. intercrur, intercrural sulcus. Adapted from Brodal (1979).

In addition to the anterior lobe and deep cerebellar nuclei, three other primary regions of the cerebellum receive either primary (i.e., from afferents) or secondary (i.e., from the vestibular nuclei) vestibular input (Figure 7). These regions include the nodulus and ventral uvula (lobules IX–X), the flocculus and ventral paraflocculus, and the oculomotor vermis of the posterior lobe (lobules VI–VII). Of these, the nodulus/uvula of the cerebellum (Wearne, Raphan, & Cohen, 1998) likely plays the most critical role in the computations that underlie self-motion perception. The nodulus/uvula has strong reciprocal connections with the vestibular nuclei, and lesion studies in monkeys have shown that damage to these cerebellar regions produces deficits in the spatial processing of vestibular information (Angelaki & Hess, 1995; Wearne et al., 1998).

The computations performed within this cerebellar network are thought to play an important role in allowing the brain to distinguish head tilt from translation. Notably, based on the physics of the vestibular sensory organs, the otolith organs cannot distinguish linear accelerations that are due to head tilts (relative to gravity) from those that are the result of translational self-motion. On the other hand, the activation of the semicircular canals differs in these two conditions, because the semicircular canals are stimulated by the rotations associated with head tilts but are not stimulated during translation. Accordingly, in order to distinguish between tilt and translation, the brain must theoretically integrate otolith and canal inputs (Mayne, 1974; Merfeld, 1995; Angelaki, McHenry, Dickman, Newlands, & Hess, 1999; Merfeld, Zupan, & Peterka, 1999; Merfeld, Zupan, & Gifford, 2001; Merfeld, Park, & Gianna-Poulin, 2005; Bos & Bles, 2002; Green, Shaikh, & Angelaki, 2005; Green & Angelaki, 2007; Laurens & Droulez, 2007; Laurens & Angelaki, 2011; Laurens, Straumann, & Hess, 2011; Zupan, Merfeld, & Darlot, 2002). Consistent with this view, neurons in the monkey nodulus/uvula reflect a computation consistent with the integration of otolith and semicircular canal inputs. Specifically, when head motion is passively applied, some neurons appear to preferentially encode translation (Yakusheva et al., 2007), while others appear to better encode tilt (Laurens, Meng, & Angelaki, 2013). Note, however, that it is not yet known whether these same cerebellar neurons distinguish tilt from translation when activation of the vestibular system is self-generated during voluntary self-motion.
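The otolith/canal computation described above can be sketched with a toy internal model (a deliberate simplification of the cited models, restricted to a single pitch axis, with assumed signal names): the canal angular-velocity signal is integrated to track head tilt, and the predicted gravitational component is then subtracted from the otolith signal to estimate translation.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def disambiguate(otolith, canal_omega, dt):
    """Integrate the canal angular-velocity signal to track head tilt,
    then subtract the predicted gravity component from the otolith
    signal to recover translational acceleration."""
    tilt = 0.0
    translation_est = []
    for f, omega in zip(otolith, canal_omega):
        tilt += omega * dt                      # canals update the tilt estimate
        translation_est.append(f - G * math.sin(tilt))
    return translation_est

# Pure tilt: the otolith signal is entirely gravitational, the canals report
# the rotation, and the model attributes essentially no motion to translation.
dt = 0.01
omega = [0.1] * 100                             # 1 s of slow pitch rotation (rad/s)
true_tilt = [0.1 * dt * (i + 1) for i in range(100)]
otolith = [G * math.sin(t) for t in true_tilt]  # gravity component only
trans_est = disambiguate(otolith, omega, dt)
print(max(abs(a) for a in trans_est) < 1e-9)    # True: no spurious translation
```

During pure translation the canal signal is zero, the tilt estimate stays at zero, and the otolith signal is passed through unchanged as translation, capturing the complementary case.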

Vestibular Cortex and the Perception of Self-Motion

The vestibular system plays a vital role not only in ensuring gaze and postural stability, but also in essential cognitive functions including the accurate perception of self-movement (Büttner & Henn, 1981; Guedry, 1974; Mergner, Anastasopoulos, Becker, & Deecke, 1981), spatial perception and memory (Berthoz, 1996; Berthoz, Israël, Viéville, & Zee, 1987; Berthoz, Israël, Georges-François, Grasso, & Tsuzuku, 1995; Bloomberg, Jones, Segal, McFarlane, & Soul, 1988; Israël & Berthoz, 1989; Israël, André-Deshays, Charade, & Berthoz, 1993; Nakamura & Bronstein, 1995), and navigation (Berthoz & Israël, 1996; Wiener & Berthoz, 1993). While these functions require cortical processing, it is important to note that, in contrast to the visual and auditory systems, no specific region of cortex is dedicated to vestibular processing. Notably, single neurons in areas of cortex that respond to vestibular stimulation also encode visual, somatosensory, or motor-related signals.

Vestibular information is relayed from the vestibular and deep cerebellar nuclei to the cortex via the thalamus. Notably, most ascending inputs from the vestibular nuclei project to the ventroposterior (VP) complex of the thalamus, which is also the main somatosensory nucleus of the thalamus. In turn, this thalamic region projects to areas of cortex where somatosensory and proprioceptive sensory signals are both encoded (i.e., areas 3av and parieto-insular vestibular cortex [PIVC]; reviewed in Goldberg et al., 2012). Single unit recording experiments in awake monkeys have further shown that neurons in the VP thalamic complex can also respond to passive rotations and translations (Büttner & Henn, 1976; Büttner, Henn, & Oswald, 1977; Magnin & Fuchs, 1977; Marlinski & McCrea, 2008a; Meng, May, Dickman, & Angelaki, 2007), and that this self-motion encoding is highly nonlinear such that response gains decrease markedly with increasing stimulus amplitude (Marlinski & McCrea, 2008a). Moreover, recordings from single VP neurons reveal the integration of vestibular signals with other inputs including somatosensory, proprioceptive, or visual information as well as motor signals (Marlinski & McCrea, 2008a, 2008b; Meng & Angelaki, 2010). Thus, vestibular processing at the level of the vestibular thalamus is already strongly multisensory in nature.
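As an illustration of this amplitude-dependent nonlinearity, one simple possibility is a saturating tuning curve (the functional form and parameters below are assumptions chosen for illustration; the recordings establish only that response gain falls as stimulus amplitude increases):

```python
# Hypothetical saturating response: firing modulation grows sublinearly with
# stimulus amplitude, so gain (response / stimulus) decreases with amplitude.

def vp_response(amplitude, r_max=100.0, s50=20.0):
    """Toy VP-thalamus response (spikes/s) to a stimulus of given amplitude;
    r_max and s50 (half-saturation amplitude) are illustrative parameters."""
    return r_max * amplitude / (s50 + amplitude)

for s in (5.0, 20.0, 80.0):
    print(round(vp_response(s) / s, 2))   # gain shrinks: 4.0, 2.5, 1.0
```

A compressive code of this kind preserves sensitivity to small, behaviorally common stimuli while avoiding saturation of firing rates during large movements.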


Figure 8. Schematic representation of vestibular cortical areas in monkey. (A) Green shaded areas of cortex receive inputs from vestibular nuclei. (B) Blue shaded areas of cortex project back to the vestibular nuclei. FEF, frontal eye field; MST, medial superior temporal; VIP, ventral intraparietal; PIVC, parieto-insular vestibular cortex. Numbers refer to specific Brodmann areas of the cerebral cortex. Striped areas are deep cortical areas, and the gray shaded region in (B) denotes the corpus callosum.

At the next stage of processing in the cortex, there are four main areas in which single neurons respond robustly to vestibular stimulation in monkeys, namely: (i) the parieto-insular vestibular cortex (PIVC) and areas 2v and 3a; (ii) the ventral intraparietal area (area VIP); (iii) the middle superior temporal area of extrastriate visual cortex (area MST); and (iv) the visuomotor areas of frontal cortex (specifically, the frontal eye fields [FEFs] and supplementary eye fields [SEFs]). The schematic in Figure 8 illustrates the locations of each of these regions. The results of single-unit studies in alert monkeys have led to the view that each of these areas serves a distinct but complementary role in the integration of multisensory information for self-motion perception.

Parieto-Insular Vestibular Cortex (PIVC) and Areas 2v and 3a

Cortical area PIVC is commonly thought to be essential for shaping our perception of self-movement, spatial orientation, and body representation. In patients, stimulation of area PIVC produces vestibular sensations (Penfield, 1957). Correspondingly, PIVC lesions are associated with impairments in patients' perception of the subjective vertical (Brandt, Dieterich, & Danek, 1994) and with increased psychophysical thresholds for heading judgements based on vestibular cues in monkeys (Chen, Gu, Liu, DeAngelis, & Angelaki, 2016). As noted, PIVC receives direct input from the vestibular thalamus. Interestingly, single neurons within area PIVC respond not only to rotations and to optokinetic stimulation, but can also be sensitive to stimulation of somatosensory or proprioceptive receptors in the neck and, occasionally, the limbs (Grüsser, Pause, & Schreiter, 1990a, 1990b; Guldin, Akbarian, & Grüsser, 1992). Moreover, while PIVC neurons are not sensitive to eye movements, they do respond to external visual target motion (Shinder & Newlands, 2014).

Area PIVC, in turn, is strongly interconnected with areas 3a and 2v (Guldin et al., 1992). Neurons in the vestibular region of area 3a also respond to proprioceptive as well as vestibular stimulation (Guldin et al., 1992) and send projections to motor cortex (Zarzecki, Blum, Bakker, & Herman, 1983). The vestibular properties of area 2v neurons have been characterized (Guldin et al., 1992), but much more work is still needed to understand the sensory and motor convergence that occurs in this region. Based on the information available to date, the role of the PIVC-3a-2v network is not fully understood. Nevertheless, it has been suggested that the rich convergence of multisensory cues and external visual target motion information experienced during self-motion could be used to effectively monitor the motion of the head or body relative to that of an object of interest in external space (Akbarian, Grüsser, & Guldin, 1992; Guldin, Akbarian, & Grüsser, 1992; Shinder & Newlands, 2014). This in turn could be vital for shaping the commands that generate accurate tracking movements of the head and body, in order to successfully make contact with an object of interest as we move through our world.

The Ventral Intraparietal Area (Area VIP)

Neurons in area VIP also demonstrate considerable multisensory integration. First, neurons are modulated in response to vestibular stimulation produced by passive rotations and translations (Bremmer, Klam, Duhamel, Ben Hamed, & Graf, 2002; Klam & Graf, 2003; Schlack, Hoffmann, & Bremmer, 2002). Similar to PIVC neurons, VIP neurons also often encode somatosensory or proprioceptive information. Notably, experiments in monkeys have revealed neurons with tactile receptive fields corresponding to the head region (e.g., Avillac, Denève, Olivier, Pouget, & Duhamel, 2005; Duhamel, Colby, & Goldberg, 1998). In addition, some VIP neurons are activated in response to passive stimulation of neck proprioceptors by rotation of the body below an earth stationary head (Klam & Graf, 2006).

In addition to integrating vestibular, somatosensory, and proprioceptive inputs, VIP neurons can respond strongly to full-field visual information. Notably, neurons show tuning to optic flow stimulation that corresponds to their vestibular sensitivity (Bremmer et al., 2002; Chen, Henry, DeAngelis, & Angelaki, 2007). Overall, vestibular responses in VIP are generally stronger than optic flow responses (Chen et al., 2007). Moreover, the vestibular tuning observed in VIP neurons does not vary as a function of either eye or head position, suggesting that these neurons encode information in a body- (or world-) centered reference frame (Chen, DeAngelis, & Angelaki, 2013). Taken together, these observations have led to the common view that area VIP plays a role in the computation of heading direction during self-motion. However, somewhat surprisingly, a recent report concluded that inactivation of area VIP does not produce perceptual deficits in a heading discrimination task (Chen et al., 2016).

Future work will be required to understand if and how the visual-vestibular convergence observed in area VIP contributes to the perception of heading direction or instead some other feature of self-motion perception. Indeed, to date only a few studies have described the responses of neurons in area VIP (as well as neighboring area MIP) during active self-motion (Klam & Graf, 2003, 2006). While neurons in both areas respond to passive whole-body rotation, the magnitudes of their responses to active head movement are typically smaller than those to passive whole-body rotation (Klam & Graf, 2006). Interestingly, this observed attenuation of self-generated vestibular inputs at the cortical level is consistent with the suppression of vestibular reafference observed at earlier levels of vestibular processing in the vestibular and cerebellar nuclei. It has been suggested that the suppression of vestibular reafference at such high levels of cortical processing is vital for perceptual stability during active self-motion. Specifically, this distinction is likely to be critical for distinguishing movement through space that is self-generated from that which is externally applied and unexpected.

Middle Superior Temporal Area of Extrastriate Visual Cortex (Area MSTd)

The dorsal medial superior temporal area (MSTd) of extrastriate visual cortex is best known for its essential role in visuomotor processing. Specifically, lesions of MSTd impair optokinetic eye movements made to compensate for translational motion of the visual scene (Dürsteler & Wurtz, 1988; Takemura, Murata, Kawano, & Miles, 2007). In addition, recent single unit recording studies using translational full-field visual stimulation in combination with a smooth pursuit paradigm suggest that MSTd neurons contribute to the visuomotor transformation from retinal to head-centered stimulus velocity signals for the control of visually driven eye movements such as smooth pursuit and optokinetic responses (Inaba, Shinomoto, Yamane, Takemura, & Kawano, 2007; Inaba, Miura, & Kawano, 2011; Inaba & Kawano, 2010; Brostek, Büttner, Mustari, & Glasauer, 2015).

Single unit recording studies in monkeys have further revealed that MSTd neurons show modulation during self-motion in darkness (Duffy, 1998). This led to the proposal that area MSTd plays a role in computing a representation of heading direction (Duffy, 1998; Page & Duffy, 2003), in addition to its role in visuomotor transformations. In this context, a series of recent studies in monkeys by Angelaki and DeAngelis specifically focused on understanding how single neurons integrate visual and vestibular inputs in relation to heading perception during self-motion (reviewed in DeAngelis & Angelaki, 2012). However, while these neurophysiological studies found a correlative relationship between neuronal responses and heading perception based on vestibular signals (Fetsch, Wang, Gu, Deangelis, & Angelaki, 2007; Gu, DeAngelis, & Angelaki, 2007), it seems unlikely that MSTd plays a major role in self-motion perception. Indeed, these same investigators have shown that inactivation of MSTd has little influence on vestibular, as compared to visual, heading thresholds (Gu, DeAngelis, & Angelaki, 2012). Accordingly, the evidence to date is consistent with the original view that this cortical area plays a principal role in shaping the visuomotor transformations required for the generation of compensatory eye movements in response to visual motion. In this context, the responses observed during vestibular stimulation may be predominantly related to the generation of a smooth eye movement command through either a parametric or a predictive mechanism (Barnes, 1988; Barnes & Eason, 1988; Lanman, Bizzi, & Allum, 1978; Cullen, Belton, & McCrea, 1991; Lisberger, 1990) in order to suppress the vestibulo-ocular reflex responses evoked during head movements to fixate a target.

Visuomotor Frontal Cortex (the Frontal Eye Fields [FEFs] and Supplementary Eye Fields [SEFs])

Finally, it has long been appreciated that the frontal eye fields (FEFs), located within the frontal lobe of the brain, make important contributions to the control of voluntary eye movements and visual attention. The finding that electrical stimulation of the contralateral vestibular nerve can produce evoked potentials in the region of the arcuate sulcus within the FEF (Ebata, Sugiuchi, Izawa, Shinomiya, & Shinoda, 2004) led to the proposal that this area is also involved in visuo-vestibular interactions. In head-restrained monkeys, neurons within the fundus of the arcuate sulcus preferentially respond to pursuit eye movements. These same neurons are typically unresponsive to passive whole-body rotations or translations in the dark but do respond when monkeys suppress their vestibulo-ocular reflex by tracking a target that moves with the head (Fukushima, Sato, Fukushima, Shinmei, & Kaneko, 2000; Fukushima, Akao, Kurkin, & Fukushima, 2005). Thus, similar to area MSTd, the FEF may mostly play a role in gaze stabilization during self-motion, likely by fine-tuning pursuit-vestibular interactions, rather than motion perception per se.

Pursuit responsive neurons can also be found within the supplementary eye fields (SEFs) of frontal cortex. These neurons have relatively complex responses and can be driven by passive self-motion in the dark (Fukushima et al., 2004), suggesting that the SEF may have a different functional role than the FEF. Patients with SEF lesions can show impaired performance in a vestibular-memory-contingent saccade task (Israël, Rivaud, Berthoz, & Pierrot-Deseilligny, 1992; Israël, Rivaud, Gaymard, Berthoz, & Pierrot-Deseilligny, 1995; Pierrot-Deseilligny, Israël, Berthoz, Rivaud, & Gaymard, 1993). In this task, subjects are asked to estimate the amplitude of a prior passive head/body rotation by producing a saccadic eye movement (Bloomberg, Jones, Segal, McFarlane, & Soul, 1988). This suggests that the integration of information about self-motion and target location in the SEF may contribute to our ability to accurately control eye movements during everyday activities.

Voluntary Behavior: Steering and Navigation

The ability to keep track of where we are in relation to where we are currently heading is essential for accurate motor control and perception. As reviewed, neurons in the earliest stages of vestibular processing—within the vestibular nuclei and deep cerebellar nuclei—show reduced responses to self-motion that is the result of voluntary behavior as compared to passively applied motion. A similar distinction is also made by vestibular-sensitive neurons at the level of the cerebellum and thalamus, which also encode other self-motion sensory cues including proprioceptive, somatosensory, and visual inputs.

The results of imaging studies in humans and of experiments in which the vestibular nerve of patients was directly activated (De Waele, Baudonnière, Lepecq, Tran Ba Huy, & Vidal, 2001) or activated via caloric vestibular stimulation (Bottini et al., 1994; Friberg, Olsen, Roland, Paulson, & Lassen, 1985; Fasold et al., 2002; Lobel et al., 1998, 1999; Suzuki et al., 2001; Vitte et al., 1996) have established that the cortical areas activated are similar to those that have been identified in monkey single-unit recordings. In addition, the results of human studies further suggest that the cortical representation of vestibular information may be more widespread than is commonly assumed. For example, regions of the prefrontal cortex and anterior portions of the supplementary motor area also show relatively short activation latencies (De Waele et al., 2001). To date, single-unit studies in monkeys have been performed in only a limited range of cortical areas, and thus whether vestibular information influences the activity of neurons in a wider range of cortical areas, as suggested by the human studies, remains an open question.

In particular, there has been much recent emphasis on understanding the representation of self-motion information within areas of the medial temporal lobe including the entorhinal/perirhinal cortices and hippocampus—areas that likely play a critical role in spatial cognition and navigation (reviewed in Hitier, Besnard, & Smith, 2014). In this context, a specific class of neurons, called head direction cells, has been identified in rodents; these neurons are thought to represent the neural substrate of perceived directional heading in the environment and to enable accurate navigation. These neurons are found in numerous brain areas including the presubiculum, anterodorsal thalamus, lateral mammillary nuclei, retrosplenial cortex, lateral dorsal thalamus, striatum, and entorhinal cortex (reviewed in Taube, 2007). Common wisdom is that angular head velocity information from the vestibular nuclei plays a principal role in the generation of the head direction signal. Specifically, the head direction cell system is often thought of as an attractor network, which is updated by information of vestibular origin (McNaughton, Chen, & Markus, 1991; Skaggs, Knierim, Kudrimoti, & McNaughton, 1995; reviewed in Clark & Taube, 2012). A schematic of the proposed pathway linking the vestibular system to the head direction cell network is shown in Figure 9.


Figure 9. Prevailing view of the pathway linking the vestibular system to the head direction cell network. Structures comprising neurons considered to respond during angular head motion are shaded in gray, while those comprising neurons that encode head direction are shaded in light red. Note that although the nucleus prepositus hypoglossi (NPH) has been proposed to encode angular head motion, this proposal is complicated by the fact that its neurons are known to predominantly encode eye-position signals. ADN, anterodorsal thalamus; AVN, anteroventral thalamus; DTN, dorsal tegmental nucleus; LDN, laterodorsal thalamus; LMN, lateral mammillary nuclei; MEC, medial entorhinal cortex; MPF, medial prefrontal cortex; MVN, medial vestibular nuclei; PaS, parasubiculum; PoS, postsubiculum; PPC, posterior parietal cortex; RSP, retrosplenial cortex; SgN, supragenual nucleus. Adapted from Hitier, Besnard, and Smith (2014).
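The attractor-network account of head direction cells can be caricatured in a few lines of code. In the sketch below, every parameter and the function name `hd_ring_step` are illustrative assumptions, not fitted to data: a bump of activity on a ring of model neurons is sustained by von Mises-shaped recurrent excitation with divisive normalization (standing in for global inhibition), and a vestibular angular-velocity input shifts the bump by rotating the recurrent kernel.

```python
import numpy as np

def hd_ring_step(r, theta, omega, dt=0.01, kappa=8.0, gain=1.0):
    # Recurrent kernel: von Mises-shaped excitation whose peak is offset
    # in proportion to the angular-velocity input omega (rad/s).
    diff = theta[:, None] - (theta[None, :] + gain * omega * dt)
    w = np.exp(kappa * (np.cos(diff) - 1.0))
    # Squaring sharpens the bump (so it neither dies nor spreads), and
    # divisive normalization stands in for global inhibition.
    inp = (w @ r) ** 2
    return inp / inp.sum()

n = 180
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)  # preferred directions
r = np.exp(8.0 * (np.cos(theta - np.pi) - 1.0))           # activity bump at pi rad
r /= r.sum()

for _ in range(100):                 # 1 s of simulated rotation at 1 rad/s
    r = hd_ring_step(r, theta, omega=1.0)

# Population-vector decode of the represented head direction.
decoded = np.angle(np.sum(r * np.exp(1j * theta))) % (2.0 * np.pi)
```

After one second of simulated rotation at 1 rad/s, the decoded bump position has advanced by roughly 1 rad from its starting point: the network has integrated the angular-velocity input, which is the core idea behind the attractor models cited above.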

As reviewed above, however, self-generated vestibular inputs are canceled at the first central stage of vestibular processing in rodents as well as monkeys. Thus, a more accurate view of head direction cells is that their responses reflect the integration of multiple self-motion cues. In fact, recent studies have shown that early vestibular pathways in rodents encode strong proprioceptive as well as vestibular signals (Medrea & Cullen, 2013), which could contribute to the generation of the head direction signal. This idea is consistent with reports that motor and proprioceptive influences shape head direction cell responses (Taube & Bassett, 2003; Wiener, Berthoz, & Zugaro, 2002). Thus, the integration of multimodal sensory input and motor-based anticipation, a common feature of early vestibular processing, likely underlies the robust representation of directional heading in head direction cells.
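The cancellation of self-generated vestibular inputs can be summarized by a toy computation. This is a deliberate simplification, not the actual circuit: the function and its unit gains are hypothetical, but the logic follows the findings of Roy and Cullen (2004) and Brooks and Cullen (2014) that the motor-based prediction of the active component is subtracted only when proprioceptive feedback matches that prediction.

```python
def central_vestibular_response(total_head_vel, predicted_active_vel,
                                proprio_matches_prediction=True):
    # Vestibular afferents encode total head velocity (active + passive).
    # Centrally, the predicted active component is subtracted when
    # sensory feedback matches the motor-based prediction; otherwise
    # the full signal is passed on. Units and unit gains are arbitrary
    # simplifying assumptions.
    if proprio_matches_prediction:
        return total_head_vel - predicted_active_vel
    return total_head_vel

# A purely active head turn (50 deg/s) is canceled centrally.
active_only = central_vestibular_response(50.0, 50.0)

# A passive rotation (20 deg/s) superimposed on the same active turn:
# only the unexpected, passive component survives.
passive_during_active = central_vestibular_response(70.0, 50.0)
```

In this caricature, the central neuron behaves like the recordings it is modeled on: silent for predictable reafference, but responsive to the externally applied component of motion.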

To date, the coding of self-motion (i.e., vestibular) information during active locomotion has been well studied in rodents but not in primates. In primates, experiments have instead largely focused on virtual navigation tasks in which monkeys control their movement through a virtual environment. These studies have recorded the responses of neurons in cortical areas such as MSTd, a region that has also been implicated in the generation of pursuit tracking movements. Notably, in monkeys trained to perform a virtual navigation task with a steering paradigm, MSTd neurons show task-dependent changes in their responses to visual stimulation, including enhanced responses to the current heading direction (Page & Duffy, 2008; Jacob & Duffy, 2015). How the brain combines task-related (i.e., steering motor/motor preparation) signals with self-motion (i.e., vestibular, proprioceptive, and visual) information is not yet understood. Studies of early vestibular processing show that, in contrast to their attenuated responses during active head movements, neurons responded robustly when monkeys performed a “voluntary driving” task in which they steered a wheel to produce goal-directed head rotations (Roy & Cullen, 2001, 2002). This task, like virtual navigation, differs from natural navigation in that the proprioceptive and motor-related signals that would normally be present during natural locomotion are absent. Thus, it remains to be determined how neurons at different levels of the self-motion pathways respond when self-motion cues and motor signals are congruent, as is the case during natural navigation (i.e., locomotion).

Open Questions and Conclusions

Overall, the finding that early vestibular pathways preferentially encode passive self-motion has important implications for self-motion perception. When visual information is not available, the perception of self-motion can be achieved via path integration, or dead reckoning (for review, see Loomis, Blascovich, & Beall, 1999; McNaughton, Battaglia, Jensen, Moser, & Moser, 2006). Path integration does not require identification of visual features in the external environment (e.g., viewing a landmark) but instead takes advantage of the integration of actively generated signals, including motor efference copy, proprioceptive cues, and vestibular information (Frissen, Campos, Souman, & Ernst, 2011; Jürgens & Becker, 2006). The same neurons within the vestibular nuclei and deep cerebellar nuclei that show reduced responses to self-motion generated by voluntary behavior, relative to passively applied motion, are also the origin of vestibular input to the vestibular thalamus. Accordingly, it seems likely that the self-motion signals relayed to the higher-level cortical areas responsible for self-motion perception are attenuated during active motion (for review, see Cullen, 2012).
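Path integration itself is computationally simple, which is part of its appeal: heading is the running integral of angular velocity, and position is the running integral of linear velocity rotated into world coordinates. A minimal dead-reckoning sketch (the function name, sampling scheme, and units are assumptions of the illustration, not a model of any specific neural circuit):

```python
import math

def path_integrate(samples, dt=0.01):
    # Dead reckoning: integrate angular velocity into heading, then
    # integrate forward speed (rotated into world coordinates) into
    # position. No landmarks are consulted at any point.
    x = y = heading = 0.0
    for forward_speed, angular_vel in samples:
        heading += angular_vel * dt
        x += forward_speed * math.cos(heading) * dt
        y += forward_speed * math.sin(heading) * dt
    return x, y, heading

# 1 s straight ahead at 1 m/s, a quarter turn in place, then 1 s
# straight again: the estimate ends one meter out and one meter over,
# facing the +y direction.
path = [(1.0, 0.0)] * 100 + [(0.0, math.pi / 2)] * 100 + [(1.0, 0.0)] * 100
x, y, heading = path_integrate(path)
```

The inputs here could equally be read as motor efference copy, podokinesthetic, or vestibular velocity estimates; the behavioral studies cited above suggest that all three contribute to the integrated quantity.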

If the self-motion signals relayed to the higher-level cortical areas are suppressed during active motion, how does the brain compute an internal estimate of self-motion? Behavioral studies have shown that subjects perceive changes in heading during active motion even in the absence of vision (Mittelstaedt & Glasauer, 1991, 1992). It is likely that, during active self-motion, the brain integrates head motion information derived from extravestibular inputs, including proprioceptive and motor efference copy signals (Blouin, Labrousse, Simoneau, Vercher, & Gauthier, 1998; Blouin, Okada, Wolsley, & Bronstein, 1998; Blouin, Amade, Vercher, & Gauthier, 1999). While these extravestibular inputs play a critical role in our ability to estimate head orientation, further experiments will be required to establish how vestibular-sensitive neurons at subsequent stages of processing (i.e., thalamo-cortical pathways) actually encode the rotational and translational components of natural self-motion during actively generated versus passive self-motion.
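One standard formalization of such multisensory integration is maximum-likelihood fusion, in which each cue is weighted by its reliability; behavioral work such as Jürgens and Becker (2006) suggests the brain combines self-motion cues in roughly this way. A minimal sketch, with purely illustrative numbers:

```python
def fuse_estimates(estimates):
    # Maximum-likelihood fusion of independent Gaussian estimates:
    # each cue is weighted by its inverse variance, and the fused
    # estimate is more reliable (lower variance) than any single cue.
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    mean = sum(m / var for m, var in estimates) / total
    return mean, 1.0 / total

# Hypothetical vestibular and proprioceptive heading estimates,
# expressed as (mean in deg, variance in deg^2).
fused_mean, fused_var = fuse_estimates([(12.0, 4.0), (18.0, 2.0)])
```

The fused mean lies between the two cues but closer to the more reliable one, and the fused variance is smaller than either input variance; whether thalamo-cortical neurons implement anything like this weighting during active motion is exactly the open question raised above.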

To date, studies describing the information encoded by cortical areas that likely contribute to the perception of self-motion (e.g., parieto-insular vestibular cortex, VIP, MSTd, areas 2v and 3a; Grüsser, Pause, & Schreiter, 1990a, 1990b; Fasold, Heinau, Trenner, Villringer, & Wenzel, 2008; for review, see Angelaki, Gu, & DeAngelis, 2011; Lopez & Blanke, 2011) have for the most part considered only passive stimuli (for exceptions, see Klam & Graf, 2003, 2006; Shinder & Newlands, 2014). Thus, it remains to be determined whether and how these areas distinguish actively generated from passive head movements. If vestibular inputs are attenuated, further experiments will also be required to understand the mechanism by which the brain integrates extravestibular cues with the attenuated vestibular signal to compute a robust estimate of heading during active self-motion.


This research was supported by the Canadian Institutes of Health Research and the National Institutes of Health (DC2390).


References
Akbarian, S., Grüsser, O. J., & Guldin, W. O. (1992). Thalamic connections of the vestibular cortical fields in the squirrel monkey (Saimiri sciureus). Journal of Comparative Neurology, 326(3), 423–441.

Anastasopoulos, D., & Mergner, T. (1982). Canal-neck interaction in vestibular nuclear neurons of the cat. Experimental Brain Research, 46, 269–280.

Angelaki, D. E., & Cullen, K. E. (2008). Vestibular system: The many facets of a multimodal sense. Annual Review of Neuroscience, 31, 125–150.

Angelaki, D. E., Gu, Y., & DeAngelis, G. C. (2011). Visual and vestibular cue integration for heading perception in extrastriate visual cortex. Journal of Physiology, 589(4), 825–833.

Angelaki, D. E., & Hess, B. J. (1995). Inertial representation of angular motion in the vestibular system of rhesus monkeys: II. Otolith-controlled transformation that depends on an intact cerebellar nodulus. Journal of Neurophysiology, 73(5), 1729–1751.

Angelaki, D. E., McHenry, M. Q., Dickman, J. D., Newlands, S. D., & Hess, B. J. (1999). Computation of inertial motion: Neural strategies to resolve ambiguous otolith information. Journal of Neuroscience, 19(1), 316–327.

Armand, M., & Minor, L. B. (2001). Relationship between time- and frequency-domain analyses of angular head movements in the squirrel monkey. Journal of Computational Neuroscience, 11, 217–239.

Arshian, M. S., Hobson, C. E., Catanzaro, M. F., Miller, D. J., Puterbaugh, S. R., Cotter, L. A., . . . McCall, A. A. (2014). Vestibular nucleus neurons respond to hindlimb movement in the decerebrate cat. Journal of Neurophysiology, 111(12), 2423–2432.

Avillac, M., Denève, S., Olivier, E., Pouget, A., & Duhamel, J.-R. (2005). Reference frames for representing visual and tactile locations in parietal cortex. Nature Neuroscience, 8(7), 941–949.

Barnes, G. R. (1988). Head-eye co-ordination: Visual and nonvisual mechanisms of vestibulo-ocular reflex slow-phase modification. Progress in Brain Research, 76, 319–328.

Barnes, G. R., & Eason, R. D. (1988). Effects of visual and non-visual mechanisms on the vestibulo-ocular reflex during pseudo-random head movements in man. Journal of Physiology, 395, 383–400.

Batton, R. R., Jayaraman, A., Ruggiero, D., & Carpenter, M. B. (1977). Fastigial efferent projections in the monkey: An autoradiographic study. Journal of Comparative Neurology, 174(2), 281–305.

Becker, W., Nasios, G., Raab, S., & Jürgens, R. (2002). Fusion of vestibular and podokinesthetic information during self-turning towards instructed targets. Experimental Brain Research, 144(4), 458–474.

Bell, C. C., & Maler, L. (2005). Central neuroanatomy of electrosensory systems in fish. In T. H. Bullock (Ed.), Electroreception (pp. 68–111). New York: Springer.

Benson, A. J., Hutt, C. E. B., & Brown, S. F. (1989). Thresholds for the perception of whole body angular movement about a vertical axis. Aviation, Space, and Environmental Medicine, 60, 205–213.

Benson, A. J., Kass, J. R., & Vogel, H. (1986). European vestibular experiments on the Spacelab-1 mission: 4. Thresholds of perception of whole-body linear oscillation. Experimental Brain Research, 64(2), 264–271.

Beraneck, M., & Cullen, K. E. (2007). Activity of vestibular nuclei neurons during vestibular and optokinetic stimulation in the alert mouse. Journal of Neurophysiology, 98(3), 1549–1565.

Berthoz, A. (1996). The role of inhibition in the hierarchical gating of executed and imagined movements. Cognitive Brain Research, 3(2), 101–113.

Berthoz, A., & Israël, I. (1996). Vestibular contribution to spatial memory. In T. Ono (Ed.), Perception, memory and emotion: Frontiers in neuroscience (pp. 270–290). Oxford: Pergamon.

Berthoz, A., Israël, I., Georges-François, P., Grasso, R., & Tsuzuku, T. (1995). Spatial memory of body linear displacement: What is being stored? Science, 269(5220), 95–98.

Berthoz, A., Israël, I., Viéville, T., & Zee, D. (1987). Linear head displacement measured by the otoliths can be reproduced through the saccadic system. Neuroscience Letters, 82(3), 285–290.

Bloomberg, J., Jones, G. M., Segal, B., McFarlane, S., & Soul, J. (1988). Vestibular-contingent voluntary saccades based on cognitive estimates of remembered vestibular information. In E. Pirodda & O. Pompeiano (Eds.), Neurophysiology of the vestibular system: Selected papers of the Bárány Society Meeting, Bologna, June 1–4, 1987 (pp. 71–75). Basel, Switzerland: Karger.

Blouin, J., Amade, N., Vercher, J. L., & Gauthier, G. (1999). Opposing resistance to the head movement does not affect space perception during head rotations. In W. Becker, H. Deubel, & T. Mergner (Eds.), Current oculomotor research: Proceedings of the Ninth European Conference on Eye Movements, held September 23–26, 1997, in Ulm, Germany (pp. 193–201). Boston: Springer.

Blouin, J., Labrousse, L., Simoneau, M., Vercher, J. L., & Gauthier, G. M. (1998). Updating visual space during passive and voluntary head-in-space movements. Experimental Brain Research, 122(1), 93–100.

Blouin, J., Okada, T., Wolsley, C., & Bronstein, A. (1998). Encoding target-trunk relative position: Cervical versus vestibular contribution. Experimental Brain Research, 122(1), 101–107.

Bos, J. E., & Bles, W. (2002). Theoretical considerations on canal-otolith interaction and an observer model. Biological Cybernetics, 86(3), 191–207.

Bottini, G., Corcoran, R., Sterzi, R., Paulesu, E., Schenone, P., Scarpa, P., . . . Frith, C. D. (1994). The role of the right hemisphere in the interpretation of figurative aspects of language: A positron emission tomography activation study. Brain, 117(6), 1241–1253.

Boyle, R., Büttner, U., & Markert, G. (1985). Vestibular nuclei activity and eye movements in the alert monkey during sinusoidal optokinetic stimulation. Experimental Brain Research, 57(2), 362–369.

Boyle, R., & Pompeiano, O. (1981). Responses of vestibulospinal neurons to neck and macular vestibular inputs in the presence or absence of the paleocerebellum. Annals of the New York Academy of Sciences, 374, 373–394.

Brandt, T., & Dieterich, M. (2015). Does the vestibular system determine the lateralization of brain functions? Journal of Neurology, 262(1), 214–215.

Brandt, T., Dieterich, M., & Danek, A. (1994). Vestibular cortex lesions affect the perception of verticality. Annals of Neurology, 35(4), 403–412.

Bremmer, F., Klam, F., Duhamel, J. R., Ben Hamed, S., & Graf, W. (2002). Visual-vestibular interactive responses in the macaque ventral intraparietal area (VIP). European Journal of Neuroscience, 16(8), 1569–1586.

Brodal, P. (1979). The pontocerebellar projections in the rhesus monkey: An experimental study with retrograde axonal transport of horseradish peroxydase. Neuroscience, 4, 193–208.

Brooks, J. X., Carriot, J., & Cullen, K. E. (2015). Learning to expect the unexpected: Rapid updating in primate cerebellum during voluntary self-motion. Nature Neuroscience, 18, 1310–1317.

Brooks, J. X., & Cullen, K. E. (2009). Multimodal integration in rostral fastigial nucleus provides an estimate of body movement. Journal of Neuroscience, 29(34), 10499–10511.

Brooks, J. X., & Cullen, K. E. (2013). The primate cerebellum selectively encodes unexpected self-motion. Current Biology, 23(11), 947–955.

Brooks, J. X., & Cullen, K. E. (2014). Early vestibular processing does not discriminate active from passive self-motion if there is a discrepancy between predicted and actual proprioceptive feedback. Journal of Neurophysiology, 111(12), 2465–2478.

Brostek, L., Büttner, U., Mustari, M. J., & Glasauer, S. (2015). Eye velocity gain fields in MSTd during optokinetic stimulation. Cerebral Cortex, 25(8), 2181–2190.

Bryan, A. S., & Angelaki, D. E. (2009). Optokinetic and vestibular responsiveness in the macaque rostral vestibular and fastigial nuclei. Journal of Neurophysiology, 101(2), 714–720.

Büttner, U., & Henn, V. (1976). Thalamic unit activity in the alert monkey during natural vestibular stimulation. Brain Research, 103(1), 127–132.

Büttner, U., & Henn, V. (1981). Circularvection: Psychophysics and single-unit recordings in the monkey. Annals of the New York Academy of Sciences, 374, 274–283.

Büttner, U., Henn, V., & Oswald, H. P. (1977). Vestibular-related neuronal activity in the thalamus of the alert monkey during sinusoidal rotation in the dark. Experimental Brain Research, 30(2), 435–444.

Buettner, U. W., & Büttner, U. (1979). Vestibular nuclei activity in the alert monkey during suppression of vestibular and optokinetic nystagmus. Experimental Brain Research, 37(3), 581–593.

Buettner, U. W., Büttner, U., & Henn, V. (1978). Transfer characteristics of neurons in vestibular nuclei of the alert monkey. Journal of Neurophysiology, 41(6), 1614–1628.

Carriot, J., Brooks, J. X., & Cullen, K. E. (2013). Multimodal integration of self-motion cues in the vestibular system: Active versus passive translations. Journal of Neuroscience, 33(50), 19555–19566.

Carriot, J., Cian, C., Paillard, A., Denise, P., & Lackner, J. R. (2011). Influence of multisensory graviceptive information on the apparent zenith. Experimental Brain Research, 208(4), 569–579.

Carriot, J., Jamali, M., Brooks, J. X., & Cullen, K. E. (2015). Integration of canal and otolith inputs by central vestibular neurons is sub-additive for both active and passive self-motion: Implication for perception. Journal of Neuroscience, 35(8), 3555–3565.

Carriot, J., Jamali, M., Chacron, M. J., & Cullen, K. E. (2014). Statistics of the vestibular input experienced during natural self-motion: Implications for neural processing. Journal of Neuroscience, 34(24), 8347–8357.

Chen, A., DeAngelis, G. C., & Angelaki, D. E. (2013). Functional specializations of the ventral intraparietal area for multisensory heading discrimination. Journal of Neuroscience, 33(8), 3567–3581.

Chen, A., Gu, Y., Liu, S., DeAngelis, G. C., & Angelaki, D. E. (2016). Evidence for a causal contribution of macaque vestibular, but not intraparietal, cortex to heading perception. Journal of Neuroscience, 36(13), 3789–3798.

Chen, A., Henry, E., DeAngelis, G. C., & Angelaki, D. E. (2007). Comparison of responses to three-dimensional rotation and translation in the ventral intraparietal (VIP) and medial superior temporal (MST) areas of rhesus monkey. Society for Neuroscience Abstracts, 33, 715–719.

Clark, B. J., & Taube, J. S. (2012). Vestibular and attractor network basis of the head direction cell signal in subcortical circuits. Frontiers in Neural Circuits, 6, 7.

Cullen, K. E. (2011). The neural encoding of self-motion. Current Opinion in Neurobiology, 21(4), 587–595.

Cullen, K. E. (2012). The vestibular system: Multimodal integration and encoding of self-motion for motor control. Trends in Neurosciences, 35(3), 185–196.

Cullen, K. E., Belton, T., & McCrea, R. A. (1991). A non-visual mechanism for voluntary cancellation of the vestibulo-ocular reflex. Experimental Brain Research, 83(2), 237–252.

Cullen, K. E., & Minor, L. B. (2002). Semicircular canal afferents similarly encode active and passive head-on-body rotations: Implications for the role of vestibular efference. Journal of Neuroscience, 22(11), RC226.

Cullen, K. E., & Roy, J. E. (2004). Signal processing in the vestibular system during active versus passive head movements. Journal of Neurophysiology, 91, 1919–1933.

Dale, A., & Cullen, K. E. (2015). Local population synchrony and the encoding of eye position in the primate neural integrator. Journal of Neuroscience, 35(10), 4287–4295.

DeAngelis, G. C., & Angelaki, D. E. (2012). Visual–vestibular integration for self-motion perception. In M. M. Murray & M. T. Wallace (Eds.), The neural bases of multisensory processes (pp. 629–651). Boca Raton, FL: CRC.

De Waele, C., Baudonnière, P. M., Lepecq, J. C., Tran Ba Huy, P., & Vidal, P. P. (2001). Vestibular projections in the human cortex. Experimental Brain Research, 141(4), 541–551.

Dichgans, J., Bizzi, E., Morasso, P., & Tagliasco, V. (1973). Mechanisms underlying recovery of eye-head coordination following bilateral labyrinthectomy in monkeys. Experimental Brain Research, 18(5), 548–562.

Dickman, J. D., & Angelaki, D. E. (2002). Vestibular convergence patterns in vestibular nuclei neurons of alert primates. Journal of Neurophysiology, 88(6), 3518–3533.

Duffy, C. J. (1998). MST neurons respond to optic flow and translational movement. Journal of Neurophysiology, 80(4), 1816–1827.

Duhamel, J. R., Colby, C. L., & Goldberg, M. E. (1998). Ventral intraparietal area of the macaque: Congruent visual and somatic response properties. Journal of Neurophysiology, 79, 126–136.

Dürsteler, M. R., & Wurtz, R. H. (1988). Pursuit and optokinetic deficits following chemical lesions of cortical areas MT and MST. Journal of Neurophysiology, 60(3), 940–965.

Ebata, S., Sugiuchi, Y., Izawa, Y., Shinomiya, K., & Shinoda, Y. (2004). Vestibular projection to the periarcuate cortex in the monkey. Neuroscience Research, 49(1), 55–68.

Fasold, O., Heinau, J., Trenner, M. U., Villringer, A., & Wenzel, R. (2008). Proprioceptive head posture-related processing in human polysensory cortical areas. NeuroImage, 40(3), 1232–1242.

Fasold, O., von Brevern, M., Kuhberg, M., Ploner, C. J., Villringer, A., Lempert, T., & Wenzel, R. (2002). Human vestibular cortex as identified with caloric stimulation in functional magnetic resonance imaging. NeuroImage, 17(3), 1384–1393.

Fetsch, C. R., Wang, S., Gu, Y., DeAngelis, G. C., & Angelaki, D. E. (2007). Spatial reference frames of visual, vestibular, and multimodal heading signals in the dorsal subdivision of the medial superior temporal area. Journal of Neuroscience, 27(3), 700–712.

Friberg, L., Olsen, T. S., Roland, P. E., Paulson, O. B., & Lassen, N. A. (1985). Focal increase of blood flow in the cerebral cortex of man during vestibular stimulation. Brain, 108(3), 609–623.

Frissen, I., Campos, J. L., Souman, J. L., & Ernst, M. O. (2011). Integration of vestibular and proprioceptive signals for spatial updating. Experimental Brain Research, 212(2), 163–176.

Fukushima, K., Akao, T., Kurkin, S., & Fukushima, J. (2005). Role of vestibular signals in the caudal part of the frontal eye fields in pursuit eye movements in three-dimensional space. Annals of the New York Academy of Sciences, 1039(1), 272–282.

Fukushima, J., Akao, T., Takeichi, N., Kurkin, S., Kaneko, C. R., & Fukushima, K. (2004). Pursuit-related neurons in the supplementary eye fields: Discharge during pursuit and passive whole body rotation. Journal of Neurophysiology, 91(6), 2809–2825.

Fukushima, K., Sato, T., Fukushima, J., Shinmei, Y., & Kaneko, C. R. (2000). Activity of smooth pursuit-related neurons in the monkey periarcuate cortex during pursuit and passive whole-body rotation. Journal of Neurophysiology, 83(1), 563–587.

Furuya, N., Kawano, K., & Shimazu, H. (1975). Functional organization of vestibulofastigial projection in the horizontal semicircular canal system in the cat. Experimental Brain Research, 24, 75–87.

Gdowski, G. T., & McCrea, R. A. (2000). Neck proprioceptive inputs to primate vestibular nucleus neurons. Experimental Brain Research, 135, 511–526.

Gelfand, S. A. (2004). Hearing: An introduction to psychological and physiological acoustics (4th ed.). New York: Marcel Dekker.

Gibson, J. J. (1950). The perception of visual surfaces. American Journal of Psychology, 63(3), 367–384.

Glasauer, S., Amorim, M.-A., Viaud-Delmon, I., & Berthoz, A. (2002). Differential effects of labyrinthine dysfunction on distance and direction during blindfolded walking of a triangular path. Experimental Brain Research, 145(4), 489–497.

Glasauer, S., Amorim, M.-A., Vitte, E., & Berthoz, A. (1994). Goal-directed linear locomotion in normal and labyrinthine-defective subjects. Experimental Brain Research, 98(2), 323–335.

Goldberg, J. M. (2000). Afferent diversity and the organization of central vestibular pathways. Experimental Brain Research, 130(3), 277–297.

Goldberg, J. M., & Cullen, K. E. (2011). Vestibular control of the head: Possible functions of the vestibulocollic reflex. Experimental Brain Research, 210, 331–345.

Goldberg, J. M., Smith, C. E., & Fernandez, C. (1984). Relation between discharge regularity and responses to externally applied galvanic currents in vestibular nerve afferents of the squirrel monkey. Journal of Neurophysiology, 51(6), 1236–1256.

Goldberg, J. M., Wilson, V. J., Cullen, K. E., Angelaki, D. E., Broussard, D. M., Buttner-Ennever, J., . . . Minor, L. B. (2012). The vestibular system: A sixth sense. New York: Oxford University Press.

Grabherr, L., Nicoucar, K., Mast, F. W., & Merfeld, D. M. (2008). Vestibular thresholds for yaw rotation about an earth-vertical axis as a function of frequency. Experimental Brain Research, 186(4), 677–681.

Grasso, R., Ivanenko, Y., & Lacquaniti, F. (1999). Time course of gaze influences on postural responses to neck proprioceptive and galvanic vestibular stimulation in humans. Neuroscience Letters, 273(2), 121–124.

Green, A. M., & Angelaki, D. E. (2007). Coordinate transformations and sensory integration in the detection of spatial orientation and self-motion: From models to experiments. Progress in Brain Research, 165, 155–180.

Green, A. M., Shaikh, A. G., & Angelaki, D. E. (2005). Sensory vestibular contributions to constructing internal models of self-motion. Journal of Neural Engineering, 2(3), S164–S179.

Grüsser, O. J., Pause, M., & Schreiter, U. (1990a). Localization and responses of neurones in the parieto-insular vestibular cortex of awake monkeys (Macaca fascicularis). Journal of Physiology, 430, 537–557.

Grüsser, O. J., Pause, M., & Schreiter, U. (1990b). Vestibular neurones in the parieto-insular cortex of monkeys (Macaca fascicularis): Visual and neck receptor responses. Journal of Physiology, 430, 537–557.

Gu, Y., DeAngelis, G. C., & Angelaki, D. E. (2007). A functional link between area MSTd and heading perception based on vestibular signals. Nature Neuroscience, 10(8), 1038–1047.

Gu, Y., DeAngelis, G. C., & Angelaki, D. E. (2012). Causal links between MSTd neurons and multisensory heading perception. Journal of Neuroscience, 32(7), 2299–2313.

Guedry, F. E. (1974). Psychophysics of vestibular sensation. In H. H. Kornhuber (Ed.), Handbook of sensory physiology (Vol. 6, pp. 1–154). New York: Springer.

Guedry, F. E., & Harris, C. S. (1963, September 25). Labyrinthine function related to experiments on the parallel swing. Project MR005.13–6001, subtask 1, Rep No. 86. Research Report. Naval School of Aviation Medicine (U.S.), 1–32.

Guldin, W. O., Akbarian, S., & Grüsser, O. J. (1992). Cortico-cortical connections and cytoarchitectonics of the primate vestibular cortex: A study in squirrel monkeys (Saimiri sciureus). Journal of Comparative Neurology, 326(3), 375–401.

Guldin, W. O., & Grüsser, O. J. (1998). Is there a vestibular cortex? Trends in Neurosciences, 21, 254–259.

Haque, A., Angelaki, D. E., & Dickman, J. D. (2004). Spatial tuning and dynamics of vestibular semicircular canal afferents in rhesus monkeys. Experimental Brain Research, 155, 81–90.

Hitier, M., Besnard, S., & Smith, P. F. (2014). Vestibular pathways involved in cognition. Frontiers in Integrative Neuroscience, 8, 59.

Hullar, T. E., Della Santina, C. C., Hirvonen, T., Lasker, D. M., Carey, J. P., & Minor, L. B. (2005). Responses of irregularly discharging chinchilla semicircular canal vestibular-nerve afferents during high-frequency head rotations. Journal of Neurophysiology, 93, 2777–2786.

Huterer, M., & Cullen, K. E. (2002). Vestibuloocular reflex dynamics during high-frequency and high-acceleration rotations of the head on body in rhesus monkey. Journal of Neurophysiology, 88, 13–28.

Inaba, N., & Kawano, K. (2010). Responses of MSTd and MT neurons during smooth pursuit exhibit similar temporal frequency dependence on retinal image motion. Cerebral Cortex, 20(7), 1708–1718.

Inaba, N., Miura, K., & Kawano, K. (2011). Direction and speed tuning to visual motion in cortical areas MT and MSTd during smooth pursuit eye movements. Journal of Neurophysiology, 105(4), 1531–1545.

Inaba, N., Shinomoto, S., Yamane, S., Takemura, A., & Kawano, K. (2007). MST neurons code for visual motion in space independent of pursuit eye movements. Journal of Neurophysiology, 97(5), 3473–3483.

Israël, I., André-Deshays, C., Charade, O., & Berthoz, A. (1993). Gaze control: II. Sequences of saccades toward memorized visual targets. Journal of Vestibular Research: Equilibrium & Orientation, 3(3), 345–360.

Israël, I., & Berthoz, A. (1989). Contribution of the otoliths to the calculation of linear displacement. Journal of Neurophysiology, 62(1), 247–263.

Israël, I., Grasso, R., Georges-Francois, P., Tsuzuku, T., & Berthoz, A. (1997). Spatial memory and path integration studied by self-driven passive linear displacement: I. Basic properties. Journal of Neurophysiology, 77(6), 3180–3192.

Israël, I., Rivaud, S., Berthoz, A., & Pierrot-Deseilligny, C. (1992). Cortical control of vestibular memory-guided saccades. Annals of the New York Academy of Sciences, 656, 472–484.

Israël, I., Rivaud, S., Gaymard, B., Berthoz, A., & Pierrot-Deseilligny, C. (1995). Cortical control of vestibular-guided saccades in man. Brain, 118(5), 1169–1183.

Ivanenko, Y. P., & Grasso, R. (1997). Integration of somatosensory and vestibular inputs in perceiving the direction of passive whole-body motion. Cognitive Brain Research, 5(4), 323–327.

Ivanenko, Y. P., Grasso, R., Israël, I., & Berthoz, A. (1997). The contribution of otoliths and semicircular canals to the perception of two-dimensional passive whole-body motion in humans. Journal of Physiology, 502(1), 223–233.

Jacob, M. S., & Duffy, C. J. (2015). Steering transforms the cortical representation of self-movement from direction to destination. Journal of Neuroscience, 35(49), 16055–16063.

Jamali, M., Carriot, J., Chacron, M. J., & Cullen, K. E. (2013). Strong correlations between sensitivity and variability give rise to constant discrimination thresholds across the otolith afferent population. Journal of Neuroscience, 33(27), 11302–11313.

Jamali, M., Chacron, M. J., & Cullen, K. E. (2016). Self-motion evokes precise spike timing in the primate vestibular system. Nature Communications, 7, 13229.

Jamali, M., Sadeghi, S. G., & Cullen, K. E. (2009). Response of vestibular nerve afferents innervating utricle and saccule during passive and active translations. Journal of Neurophysiology, 101(1), 141–149.

Jürgens, R., & Becker, W. (2006). Perception of angular displacement without landmarks: Evidence for Bayesian fusion of vestibular, optokinetic, podokinesthetic, and cognitive information. Experimental Brain Research, 174(3), 528–543.

Kammermeier, S., Kleine, J., & Büttner, U. (2009). Vestibular-neck interaction in cerebellar patients. Annals of the New York Academy of Sciences, 1164, 394–399.

Kasper, J., Schor, R. H., & Wilson, V. J. (1988). Response of vestibular neurons to head rotations in vertical planes: I. Response to vestibular stimulation. Journal of Neurophysiology, 60(5), 1753–1764.

Kawasaki, M. (2005). Central neuroanatomy of electrosensory systems in fish. In T. H. Bullock (Ed.), Electroreception (pp. 154–194). New York: Springer.

Keller, E. L., & Precht, W. (1979). Visual-vestibular responses in vestibular nuclear neurons in the intact and cerebellectomized, alert cat. Neuroscience, 4(11), 1599–1613.

Kennedy, P. M., & Inglis, J. T. (2002). Interaction effects of galvanic vestibular stimulation and head position on the soleus H reflex in humans. Clinical Neurophysiology, 113, 1709–1714.

Klam, F., & Graf, W. (2003). Vestibular signals of posterior parietal cortex neurons during active and passive head movements in macaque monkeys. Annals of the New York Academy of Sciences, 1004, 271–282.

Klam, F., & Graf, W. (2006). Discrimination between active and passive head movements by macaque ventral and medial intraparietal cortex neurons. Journal of Physiology, 574(2), 367–386.

Lackner, J. R., & DiZio, P. (2005). Vestibular, proprioceptive, and haptic contributions to spatial orientation. Annual Review of Psychology, 56, 115–147.

Lang, W., Büttner-Ennever, J. A., & Büttner, U. (1979). Vestibular projections to the monkey thalamus: An autoradiographic study. Brain Research, 177, 3–17.

Lanman, J., Bizzi, E., & Allum, J. (1978). The coordination of eye and head movement during smooth pursuit. Brain Research, 153(1), 39–53.

Lannou, J., Cazin, L., Precht, W., & Toupet, M. (1982). Optokinetic, vestibular, and optokinetic-vestibular responses in albino and pigmented rats. Pflügers Archiv: European Journal of Physiology, 393(1), 42–44.

Laurens, J., & Angelaki, D. E. (2011). The functional significance of velocity storage and its dependence on gravity. Experimental Brain Research, 210(3–4), 407–422.

Laurens, J., & Droulez, J. (2007). Bayesian processing of vestibular information. Biological Cybernetics, 96(4), 389–404.

Laurens, J., Meng, H., & Angelaki, D. E. (2013). Neural representation of orientation relative to gravity in the macaque cerebellum. Neuron, 80(6), 1508–1518.

Laurens, J., Straumann, D., & Hess, B. J. (2011). Spinning versus wobbling: How the brain solves a geometry problem. Journal of Neuroscience, 31(22), 8093–8101.

Lisberger, S. G. (1990). Visual tracking in monkeys: Evidence for short-latency suppression of the vestibuloocular reflex. Journal of Neurophysiology, 63(4), 676–688.

Livingstone, M. S., & Hubel, D. H. (1987). Psychophysical evidence for separate channels for the perception of form, color, movement, and depth. Journal of Neuroscience, 7(11), 3416–3468.

Lobel, E., Kleine, J. F., Bihan, D. L., Leroy-Willig, A., & Berthoz, A. (1998). Functional MRI of galvanic vestibular stimulation. Journal of Neurophysiology, 80(5), 2699–2709.

Lobel, E., Kleine, J. F., Leroy-Willig, A., Van de Moortele, P. F., Le Bihan, D., Grüsser, O. J., & Berthoz, A. (1999). Cortical areas activated by bilateral galvanic vestibular stimulation. Annals of the New York Academy of Sciences, 871, 313–323.

Loomis, J. M., Blascovich, J. J., & Beall, A. C. (1999). Immersive virtual environment technology as a basic research tool in psychology. Behavior Research Methods, Instruments, & Computers, 31(4), 557–564.

Lopez, C., & Blanke, O. (2011). The thalamocortical vestibular system in animals and humans. Brain Research Reviews, 67(1–2), 119–146.

Luan, H., Gdowski, M. J., Newlands, S. D., & Gdowski, G. T. (2013). Convergence of vestibular and neck proprioceptive sensory signals in the cerebellar interpositus. Journal of Neuroscience, 33(3), 1198–1210.

MacNeilage, P. R., Turner, A. H., & Angelaki, D. E. (2010). Canal-otolith interactions and detection thresholds of linear and angular components during curved-path self-motion. Journal of Neurophysiology, 104(2), 765–773.

Magnin, M., & Fuchs, A. F. (1977). Discharge properties of neurons in the monkey thalamus tested with angular acceleration, eye movement and visual stimuli. Experimental Brain Research, 28(3–4), 293–299.

Manzoni, D. (2007). The cerebellum and sensorimotor coupling: Looking at the problem from the perspective of vestibular reflexes. Cerebellum, 6, 24–37.

Manzoni, D., Andre, P., & Bruschini, L. (2004). Coupling sensory inputs to the appropriate motor responses: New aspects of cerebellar function. Archives Italiennes de Biologie, 142(3), 199–215.

Manzoni, D., Pompeiano, O., & Andre, P. (1998a). Convergence of directional vestibular and neck signals on cerebellar Purkinje cells. Pflügers Archiv: European Journal of Physiology, 435(5), 617–630.

Manzoni, D., Pompeiano, O., & Andre, P. (1998b). Neck influences on the spatial properties of vestibulospinal reflexes in decerebrate cats: Role of the cerebellar anterior vermis. Journal of Vestibular Research: Equilibrium & Orientation, 8(4), 283–297.

Manzoni, D., Pompeiano, O., Bruschini, L., & Andre, P. (1999). Neck input modifies the reference frame for coding labyrinthine signals in the cerebellar vermis: A cellular analysis. Neuroscience, 93(3), 1095–1107.

Marlinski, V., & McCrea, R. A. (2008a). Activity of ventroposterior thalamus neurons during rotation and translation in the horizontal plane in the alert squirrel monkey. Journal of Neurophysiology, 99(5), 2533–2545.

Marlinski, V., & McCrea, R. A. (2008b). Coding of self-motion signals in ventro-posterior thalamus neurons in the alert squirrel monkey. Experimental Brain Research, 189(4), 463–472.

Marr, D. (1982). Vision: A computational investigation into the human representation and processing of visual information. San Francisco: Henry Holt.

Massot, C., Chacron, M. J., & Cullen, K. E. (2011). Information transmission and detection thresholds in the vestibular nuclei: Single neurons vs. population encoding. Journal of Neurophysiology, 105(4), 1798–1814.

Massot, C., Schneider, A. D., Chacron, M. J., & Cullen, K. E. (2012). The vestibular system implements a linear-nonlinear transformation in order to encode self-motion. PLoS Biology, 10(7), e1001365.

Mayne, R. (1974). A systems concept of the vestibular organs. In H. H. Kornhuber (Ed.), Handbook of sensory physiology (Vol. 6.2, pp. 493–580). Berlin: Springer.

McCall, A. A., Miller, D. J., Catanzaro, M. F., Cotter, L. A., & Yates, B. J. (2015). Hindlimb movement modulates the activity of rostral fastigial nucleus neurons that process vestibular input. Experimental Brain Research, 233(8), 2411–2419.

McCall, A. A., Miller, D. M., DeMayo, W. M., Bourdages, G. H., & Yates, B. J. (2016). Vestibular nucleus neurons respond to hindlimb movement in the conscious cat. Journal of Neurophysiology, 116(4), 1785–1794.

McNaughton, B. L., Battaglia, F. P., Jensen, O., Moser, E. I., & Moser, M. B. (2006). Path integration and the neural basis of the “cognitive map.” Nature Reviews Neuroscience, 7(8), 663–678.

McNaughton, B. L., Chen, L. L., & Markus, E. J. (1991). “Dead reckoning,” landmark learning, and the sense of direction: A neurophysiological and computational hypothesis. Journal of Cognitive Neuroscience, 3(2), 190–202.

Medrea, I., & Cullen, K. E. (2013). Multisensory integration in early vestibular processing in mice: The encoding of passive vs. active motion. Journal of Neurophysiology, 110(12), 2704–2717.

Meng, H., & Angelaki, D. E. (2010). Responses of ventral posterior thalamus neurons to three-dimensional vestibular and optic flow stimulation. Journal of Neurophysiology, 103(2), 817–826.

Meng, H., May, P. J., Dickman, J. D., & Angelaki, D. E. (2007). Vestibular signals in primate thalamus: Properties and origins. Journal of Neuroscience, 27(50), 13590–13602.

Merfeld, D. M. (1995). Modeling human vestibular responses during eccentric rotation and off vertical axis rotation. Acta Otolaryngologica, S520(2), 354–359.

Merfeld, D. M., Park, S., Gianna-Poulin, C., Black, F. O., & Wood, S. (2005). Vestibular perception and action employ qualitatively different mechanisms: I. Frequency response of VOR and perceptual responses during translation and tilt. Journal of Neurophysiology, 94(1), 186–198.

Merfeld, D. M., Zupan, L. H., & Gifford, C. A. (2001). Neural processing of gravito-inertial cues in humans: II. Influence of the semicircular canals during eccentric rotation. Journal of Neurophysiology, 85(4), 1648–1660.

Merfeld, D. M., Zupan, L., & Peterka, R. J. (1999). Humans use internal models to estimate gravity and linear acceleration. Nature, 398(6728), 615–618.

Mergner, T., Anastasopoulos, D., Becker, W., & Deecke, L. (1981). Discrimination between trunk and head rotation: A study comparing neuronal data from the cat with human psychophysics. Acta Psychologica, 48(1), 291–302.

Mergner, T., Siebold, C., Schweigart, G., & Becker, W. (1991). Human perception of horizontal trunk and head rotation in space during vestibular and neck stimulation. Experimental Brain Research, 85, 389–404.

Merigan, W. H., & Maunsell, J. H. (1993). How parallel are the primate visual pathways? Annual Review of Neuroscience, 16, 369–402.

Miller, W. L., Maffei, V., Bosco, G., Iosa, M., Zago, M., Macaluso, E., & Lacquaniti, F. (2008). Vestibular nuclei and cerebellum put visual gravitational motion in context. Journal of Neurophysiology, 99(4), 1969–1982.

Mittelstaedt, M. L., & Glasauer, S. (1991). Idiothetic navigation in gerbils and humans. Zoologische Jahrbücher Physiologie, 95, 427–435.

Mittelstaedt, M. L., & Glasauer, S. (1992). The contribution of internal and substratal information to the perception of linear displacement. In H. Kreijcova & J. Jerabek (Eds.), Proceedings of the XVIIth Barany Society Meeting, Dobris, Czechoslovakia, July 2–5 (pp. 102–105). Dobříš: Barany Society.

Nakamura, T., & Bronstein, A. M. (1995). The perception of head and neck angular displacement in normal and labyrinthine-defective subjects. Brain, 118(5), 1157–1168.

Oertel, D. (1999). The role of timing in the brain stem auditory nuclei of vertebrates. Annual Review of Physiology, 61(1), 497–519.

Page, W. K., & Duffy, C. J. (2003). Heading representation in MST: Sensory interactions and population encoding. Journal of Neurophysiology, 89(4), 1994–2013.

Page, W. K., & Duffy, C. J. (2008). Cortical neuronal responses to optic flow are shaped by visual strategies for steering. Cerebral Cortex, 18, 727–739.

Penfield, W. (1957). Vestibular sensation and the cerebral cortex. Annals of Otology, Rhinology, and Laryngology, 66(3), 691–698.

Pierrot-Deseilligny, C., Israël, I., Berthoz, A., Rivaud, S., & Gaymard, B. (1993). Role of the different frontal lobe areas in the control of the horizontal component of memory-guided saccades in man. Experimental Brain Research, 95(1), 166–171.

Ramachandran, R., & Lisberger, S. G. (2006). Transformation of vestibular signals into motor commands in the vestibuloocular reflex pathways of monkeys. Journal of Neurophysiology, 96, 1061–1074.

Reisine, H., & Raphan, T. (1992). Unit activity in the vestibular nuclei of monkeys during off-vertical axis rotation. Annals of the New York Academy of Sciences, 656, 954–956.

Roy, J. E., & Cullen, K. E. (2001). Selective processing of vestibular reafference during self-generated head motion. Journal of Neuroscience, 21(6), 2131–2142.

Roy, J. E., & Cullen, K. E. (2002). Vestibuloocular reflex signal modulation during voluntary and passive head movements. Journal of Neurophysiology, 87(5), 2337–2357.

Roy, J. E., & Cullen, K. E. (2004). Dissociating self-generated from passively applied head motion: Neural mechanisms in the vestibular nuclei. Journal of Neuroscience, 24(9), 2102–2111.

Sadeghi, S. G., Chacron, M. J., Taylor, M. C., & Cullen, K. E. (2007). Neural variability, detection thresholds, and information transmission in the vestibular system. Journal of Neuroscience, 27(4), 771–781.

Sadeghi, S. G., Minor, L. B., & Cullen, K. E. (2006). Dynamics of the horizontal vestibuloocular reflex after unilateral labyrinthectomy: Response to high frequency, high acceleration, and high velocity rotations. Experimental Brain Research, 175, 471–484.

Sadeghi, S. G., Minor, L. B., & Cullen, K. E. (2007). Response of vestibular-nerve afferents to active and passive rotations under normal conditions and after unilateral labyrinthectomy. Journal of Neurophysiology, 97(2), 1503–1514.

Sadeghi, S. G., Minor, L. B., & Cullen, K. E. (2010). Neural correlates of motor learning in the vestibulo-ocular reflex: Dynamic regulation of multimodal integration in the macaque vestibular system. Journal of Neuroscience, 30(30), 10158–10168.

Sadeghi, S. G., Minor, L. B., & Cullen, K. E. (2011). Multimodal integration after unilateral labyrinthine lesion: Single vestibular nuclei neuron responses and implications for postural compensation. Journal of Neurophysiology, 105(2), 661–673.

Sadeghi, S. G., Minor, L. B., & Cullen, K. E. (2012). Neural correlates of sensory substitution in vestibular pathways following complete vestibular loss. Journal of Neuroscience, 32(42), 14685–14695.

Sadeghi, S. G., Mitchell, D. E., & Cullen, K. E. (2009). Different neural strategies for multimodal integration: Comparison of two macaque monkey species. Experimental Brain Research, 195(1), 45–57.

Schlack, A., Hoffmann, K.-P., & Bremmer, F. (2002). Interaction of linear vestibular and visual stimulation in the macaque ventral intraparietal area (VIP). European Journal of Neuroscience, 16(10), 1877–1886.

Schneider, A. D., Cullen, K. E., & Chacron, M. J. (2015). The increased sensitivity of irregular peripheral canal and otolith vestibular afferents optimizes their encoding of natural stimuli. Journal of Neuroscience, 35(14), 5522–5536.

Shimazu, H., & Smith, C. M. (1971). Cerebellar and labyrinthine influences on single vestibular neurons identified by natural stimuli. Journal of Neurophysiology, 34, 493–508.

Shinder, M. E., & Newlands, S. D. (2014). Sensory convergence in the parieto-insular vestibular cortex. Journal of Neurophysiology, 111(12), 2445–2464.

Shinder, M. E., & Taube, J. S. (2010). Differentiating ascending vestibular pathways to the cortex involved in spatial cognition. Journal of Vestibular Research: Equilibrium & Orientation, 20(1), 3–23.

Simoncelli, E. P., & Olshausen, B. A. (2001). Natural image statistics and neural representation. Annual Review of Neuroscience, 24, 1193–1216.

Skaggs, W. E., Knierim, J. J., Kudrimoti, H. S., & McNaughton, B. L. (1995). A model of the neural basis of the rat’s sense of direction. Advances in Neural Information Processing Systems, 7, 173–180.

Soyka, F., Giordano, P. R., Beykirch, K., & Bülthoff, H. H. (2011). Predicting direction detection thresholds for arbitrary translational acceleration profiles in the horizontal plane. Experimental Brain Research, 209(1), 95–107.

Suzuki, M., Kitano, H., Ito, R., Kitanishi, T., Yazawa, Y., Ogawa, T., . . . Kitajima, K. (2001). Cortical and subcortical vestibular response to caloric stimulation detected by functional magnetic resonance imaging. Cognitive Brain Research, 12(3), 441–449.

Takahashi, T., Moiseff, A., & Konishi, M. (1984). Time and intensity cues are processed independently in the auditory system of the owl. Journal of Neuroscience, 4(7), 1781–1786.

Takemura, K., & King, W. M. (2005). Vestibulo-colic reflex (VCR) in mice. Experimental Brain Research, 167(1), 103–107.

Takemura, A., Murata, Y., Kawano, K., & Miles, F. A. (2007). Deficits in short-latency tracking eye movements after chemical lesions in monkey cortical areas MT and MST. Journal of Neuroscience, 27, 529–541.

Taube, J. S. (2007). The head direction signal: Origins and sensory-motor integration. Annual Review of Neuroscience, 30, 181–207.

Taube, J. S., & Bassett, J. P. (2003). Persistent neural activity in head direction cells. Cerebral Cortex, 13, 1162–1172.

Tokita, T., Ito, Y., & Takagi, K. (1989). Modulation by head and trunk positions of the vestibulo-spinal reflexes evoked by galvanic stimulation of the labyrinth: Observations by labyrinthine evoked EMG. Acta Otolaryngologica, 107, 327–332.

Tokita, T., Miyata, H., Takagi, K., & Ito, Y. (1991). Studies on vestibulo-spinal reflexes by examination of labyrinthine-evoked EMGs of lower limbs. Acta Otolaryngologica, S481, 328–332.

Valko, Y., Lewis, R. F., Priesol, A. J., & Merfeld, D. M. (2012). Vestibular labyrinth contributions to human whole-body motion discrimination. Journal of Neuroscience, 32(39), 13537–13542.

Vitte, E., Derosier, C., Caritu, Y., Berthoz, A., Hasboun, D., & Soulié, D. (1996). Activation of the hippocampal formation by vestibular stimulation: A functional magnetic resonance imaging study. Experimental Brain Research, 112(3), 523–526.

Waespe, W., & Henn, V. (1977a). Neuronal activity in the vestibular nuclei of the alert monkey during vestibular and optokinetic stimulation. Experimental Brain Research, 27(5), 523–538.

Waespe, W., & Henn, V. (1977b). Vestibular nuclei activity during optokinetic after-nystagmus (OKAN) in the alert monkey. Experimental Brain Research, 30(2–3), 323–330.

Wearne, S., Raphan, T., & Cohen, B. (1998). Control of spatial orientation of the angular vestibuloocular reflex by the nodulus and uvula. Journal of Neurophysiology, 79(5), 2690–2715.

Wiener, S. I., & Berthoz, A. (1993). Forebrain structures mediating the vestibular contribution during navigation. Multisensory Control of Movement, 1, 427–456.

Wiener, S. I., Berthoz, A., & Zugaro, M. B. (2002). Multisensory processing in the elaboration of place and head direction responses by limbic system neurons. Brain Research: Cognitive Brain Research, 14, 75–90.

Wilson, V. J., & Schor, R. H. (1999). The neural substrate of the vestibulocollic reflex: What needs to be learned. Experimental Brain Research, 129, 483–493.

Wilson, V. J., Zarzecki, P., Schor, R. H., Isu, N., Rose, P. K., Sato, H., . . . Umezaki, T. (1999). Cortical influences on the vestibular nuclei of the cat. Experimental Brain Research, 125, 1–13.

Worchel, P. (1952). The role of the vestibular organs in space orientation. Journal of Experimental Psychology, 44(1), 4–10.

Yakusheva, T. A., Shaikh, A. G., Green, A. M., Blazquez, P. M., Dickman, J. D., & Angelaki, D. E. (2007). Purkinje cells in posterior cerebellar vermis encode motion in an inertial reference frame. Neuron, 54(6), 973–985.

Zarzecki, P., Blum, P. S., Bakker, D. A., & Herman, D. (1983). Convergence of sensory inputs upon projection neurons of somatosensory cortex: Vestibular, neck, head, and forelimb inputs. Experimental Brain Research, 50(2–3), 408–414.

Zupan, L. H., Merfeld, D. M., & Darlot, C. (2002). Using sensory weighting to model the influence of canal, otolith and visual cues on spatial orientation and eye movements. Biological Cybernetics, 86(3), 209–230.