Jeffrey R. Holt and Gwenaëlle S.G. Géléoc
The organs of the vertebrate inner ear respond to a variety of mechanical stimuli: semicircular canals are sensitive to angular velocity, the saccule and utricle respond to linear acceleration (including gravity), and the cochlea is sensitive to airborne vibration, or sound. The ontogenetically related lateral line organs, spaced along the sides of aquatic vertebrates, sense water movement. All these organs share a common receptor cell type, the hair cell, named for the bundle of enlarged microvilli protruding from its apical surface. In each organ, specialized accessory structures collect, filter, and deliver these physical stimuli to the hair bundles. The proximal stimulus for all hair cells is deflection of the mechanosensitive hair bundle. Hair cells convert the mechanical information contained in the temporal pattern of hair bundle deflections into electrical signals, which they transmit to the brain for interpretation.
Age-related hearing loss affects over half of the elderly population, yet it remains poorly understood. Natural aging progressively compromises the input from the cochlea to the brain in most individuals, but in many cases the cochlea retains relatively normal sensitivity while people have increasing difficulty processing complex auditory stimuli. The two main deficits are in sound localization and temporal processing, which lead to poor speech perception. Animal models have shown that there are multiple changes in the brainstem, midbrain, and thalamic auditory areas as a function of age, giving rise to an alteration in the excitatory/inhibitory balance of these neurons. This alteration is manifest in the cerebral cortex as higher spontaneous and driven firing rates, as well as broader spatial and temporal tuning. These alterations in cortical responses could underlie the hearing and speech processing deficits that are common in the aged population.
Paul E. Nachtigall
Toothed whales and dolphins, odontocete cetaceans, produce very loud biosonar sounds in order to navigate and to locate and catch their prey of fish and squid. Underwater biosonar was not discovered until after 1950, but the initial experiments demonstrated a unique sensory modality that could find small targets far away and distinguish between objects buried in mud that differed only in the metal from which they were made. Dolphins determine the distance to their prey by evaluating very small time differences between the outgoing signal and the echo return. The type of outgoing signal varies greatly, from low-frequency, explosively loud sperm whale clicks, to frequency-modulated mid-frequency beaked whale sounds, to very high frequency (over 100 kHz) harbor porpoise signals. All appear to be made by specialized pneumatic phonic lips closely connected to sound-projecting fatty melons that focus sound before sending out narrow echolocation sound beams. The frequency range of best hearing is matched to echolocation, with the animals hearing best at the principal frequencies of their outgoing signals. The sensation levels of hearing are under the animal’s control, with “automatic gain control” operating to ensure the best hearing of the echo returns. Angular localization in bottlenose dolphins is remarkably precise: minimum audible angles for clicks are less than one degree in both the horizontal and vertical directions. This localization performance has yet to be fully explained, but new hypotheses of gular pathways, shaded receiver models, and internal pinnae may provide some explanations as a theory of auditory localization in the odontocetes develops.
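The echo-delay ranging described above reduces to a simple two-way travel-time calculation. The sketch below illustrates the arithmetic; the speed of sound in seawater is approximate, and the delay value is purely illustrative.

```python
# Sketch of two-way travel-time ranging, the principle behind
# echo-delay distance estimation. Sound travels out to the target
# and back, so range is half the total path length.

SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s, approximate nominal value


def target_range(echo_delay_s: float) -> float:
    """Range (m) to a target from the round-trip echo delay (s)."""
    return SPEED_OF_SOUND_SEAWATER * echo_delay_s / 2.0


# An illustrative 100 ms round-trip delay corresponds to a target
# about 75 m away.
print(target_range(0.100))  # 75.0
```

The halving of the travel path is the key step: a biosonar system measuring only total delay must implicitly divide by two to recover range.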
Yaniv Cohen, Emmanuelle Courtiol, Regina M. Sullivan, and Donald A. Wilson
Odorants, inhaled through the nose or exhaled from the mouth through the nose, bind to receptors on olfactory sensory neurons. Olfactory sensory neurons project in a highly stereotyped fashion into the forebrain to a structure called the olfactory bulb, where odorant-specific spatial patterns of neural activity are evoked. These patterns appear to reflect the molecular features of the inhaled stimulus. The olfactory bulb, in turn, projects to the olfactory cortex, which is composed of multiple subregions, including the anterior olfactory nucleus, the olfactory tubercle, the cortical nucleus of the amygdala, the anterior and posterior piriform cortex, and the lateral entorhinal cortex. Due to differences in olfactory bulb inputs, local circuitry, and other factors, each of these cortical subregions appears to contribute to different aspects of the overall odor percept. For example, there appears to be some spatial organization of olfactory bulb inputs to the cortical nucleus of the amygdala, and this region may be involved in the expression of innate odor hedonic preferences. In contrast, the olfactory bulb projection to the piriform cortex is highly distributed and not spatially organized, allowing the piriform cortex to function as a combinatorial, associative array, producing the emergence of experience-dependent odor objects (e.g., strawberry) from the molecular features extracted in the periphery. Thus, the full perceptual experience of an odor requires involvement of a large, highly dynamic cortical network.
Many mammals, including humans, rely primarily on vision to sense the environment. While a large proportion of the brain is devoted to vision in highly visual animals, there are not enough neurons in the visual system to support a neuron-per-object look-up table. Instead, visual animals evolved ways to rapidly and dynamically encode an enormous diversity of visual information using minimal numbers of neurons (merely hundreds of millions of neurons and billions of connections!). In the mammalian visual system, a visual image is essentially broken down into simple elements that are reconstructed through a series of processing stages, most of which occur beneath consciousness. Importantly, visual information processing is not simply a serial progression along the hierarchy of visual brain structures (e.g., retina to visual thalamus to primary visual cortex to secondary visual cortex, etc.). Instead, connections within and between visual brain structures exist in all possible directions: feedforward, feedback, and lateral. Additionally, many mammalian visual systems are organized into parallel channels, presumably to enable efficient processing of information about different and important features in the visual environment (e.g., color, motion). The overall operations of the mammalian visual system are to (1) combine unique groups of feature detectors in order to generate object representations and (2) integrate visual sensory information with cognitive and contextual information from the rest of the brain. Together, these operations enable individuals to perceive, plan, and act within their environment.
Tyler S. Manning and Kenneth H. Britten
The ability to see motion is critical to survival in a dynamic world. Decades of physiological research have established that motion perception is a distinct sub-modality of vision supported by a network of specialized structures in the nervous system. These structures are arranged hierarchically according to the spatial scale of the calculations they perform, with more local operations preceding those that are more global. The different operations serve distinct purposes, from the interception of small moving objects to the calculation of self-motion from image motion spanning the entire visual field. Each cortical area in the hierarchy has an independent representation of visual motion. These representations, together with computational accounts of their roles, provide clues to the functions of each area. Comparisons between neural activity in these areas and psychophysical performance can identify which representations are sufficient to support motion perception. Experimental manipulation of this activity can also define which areas are necessary for motion-dependent behaviors like self-motion guidance.
Understanding the principles by which sensory systems represent natural stimuli is one of the holy grails of neuroscience. In the auditory system, the study of the coding of natural sounds has a particular prominence. Indeed, the relationships between neural responses to the simple stimuli (usually pure-tone bursts) often used to characterize auditory neurons and responses to complex sounds, in particular natural sounds, can be far from straightforward. Many different classes of natural sounds have been used to study the auditory system. Sound families that researchers have used to good effect in this endeavor include human speech, species-specific vocalizations, an “acoustic biotope” selected in one way or another, and sets of artificial sounds that mimic important features of natural sounds.
Peripheral and brainstem representations of natural sounds are relatively well understood. The properties of the peripheral auditory system play a dominant role, and further processing occurs mostly within the frequency channels determined by these properties. At the level of the inferior colliculus, the highest brainstem station, representational complexity increases substantially due to the convergence of multiple processing streams. Undoubtedly, the most explored part of the auditory system, in terms of responses to natural sounds, is the primary auditory cortex. In spite of over 50 years of research, there is still no commonly accepted view of the nature of the population code for natural sounds in the auditory cortex. Neurons in the auditory cortex are believed by some to be primarily linear spectro-temporal filters, by others to respond to conjunctions of important sound features, or even to encode perceptual concepts such as “auditory objects.” Whatever the exact mechanism is, many studies consistently report a substantial increase in the variability of the response patterns of cortical neurons to natural sounds. The generation of such variation may be the main contribution of auditory cortex to the coding of natural sounds.
Simon Potier, Mindaugas Mitkus, Olivier Duriez, Almut Kelber, and Graham Martin
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Neuroscience.
Diurnal raptors (birds of the orders Accipitriformes and Falconiformes), renowned for their extraordinarily sharp eyesight, have fascinated humans for centuries. The high visual acuity of raptors is made possible by their unusually large eyes, in both relative and absolute terms, and a very high density of cone photoreceptors. Some large raptors, such as wedge-tailed eagles and the Old World vultures, have visual acuity twice that of humans, and six times that of ostriches, the animals with the largest terrestrial eyes. The highest density of cones occurs in one or two specialized retinal areas, the foveae, where, at least in some species, rods are lacking. The central deep fovea allows for the highest acuity in the lateral visual field and is probably used for detecting prey from a large distance. Actively hunting raptors have a second, shallower, temporal fovea that provides sharp vision in the frontal binocular visual field. Scavenging carrion eaters do not possess a temporal fovea, which might indicate different needs in foraging behavior.
Diurnal raptors, like most birds, have tetrachromatic color vision, based on four spectral types of cones sensitive to violet, blue, green, and red light. However, unlike most birds, their eyes are not very sensitive to ultraviolet light because it is strongly absorbed by the cornea and lens. Four cone types are present in the central fovea; thus, diurnal raptors might possess high-resolution tetrachromatic vision. However, because cones are narrow and densely packed and because rods are absent in the central fovea, the visual acuity of diurnal raptors drops dramatically as light levels decrease. These and other visual properties underpin prey detection and pursuit and reveal the ways in which these birds’ vision is highly tuned to make them successful diurnal predators.
Jose M. Alonso and Harvey A. Swadlow
The thalamocortical pathway is the main route of sensory information to the cerebral cortex. Vision, touch, hearing, taste, and balance all depend on the integrity of this pathway that connects the thalamic structures receiving sensory input with the cortical areas specialized in each sensory modality. Only the ancient sense of smell is independent of the thalamus, gaining access to cortex through more anterior routes. While the thalamocortical pathway targets different layers of the cerebral cortex, its main stream projects to the middle layers and has axon terminals that are dense, spatially restricted, and highly specific in their connections. The remarkable specificity of these thalamocortical connections allows for a precise reconstruction of the sensory dimensions that need to be most finely sampled, such as spatial acuity in vision and sound frequency in hearing. The thalamic axon terminals also segregate topographically according to their stimulus preferences, providing a simple principle to build cortical sensory maps: neighboring values in sensory space are represented by neighboring points within the cortex.
Thalamocortical processing is not static. It is continuously modulated by brainstem and corticothalamic feedback according to the level of attention and alertness, and during sleep or general anesthesia. When alert, visual thalamic responses become stronger, more reliable, more sustained, more effective at sampling fast changes in the scene, and more linearly related to the stimulus. The high firing rates of the alert state keep thalamocortical synapses chronically depressed and make excitatory synaptic potentials less dependent on temporal history, improving even further the linear relation between stimulus and response. In turn, when alertness wanes, the thalamus reduces its firing rate and starts generating spike bursts that drive large postsynaptic responses and keep the cortex responsive to sudden stimulus changes.
Sabine Kastner and Timothy J. Buschman
Natural scenes are cluttered and contain many objects that cannot all be processed simultaneously. Due to this limited processing capacity, neural mechanisms are needed to selectively enhance the information that is most relevant to one’s current behavior and to filter out unwanted information. We refer to these mechanisms as “selective attention.” Attention has been studied extensively at the behavioral level in a variety of paradigms, most notably Treisman’s visual search and Posner’s cueing paradigms. These paradigms have also provided the basis for studies directed at understanding the neural mechanisms underlying attentional selection, both in the form of neuroimaging studies in humans and intracranial electrophysiology in non-human primates. The selection of behaviorally relevant information is mediated by a large-scale network that includes regions in all major lobes as well as subcortical structures. Attending to a visual stimulus modulates processing across the visual processing hierarchy, with stronger effects in higher-order areas. Current research is aimed at characterizing the functions of the different network nodes as well as the dynamics of their functional connectivity.