Synaptic connections in the brain can change their strength in response to patterned activity, an ability known as synaptic plasticity. Long-lasting forms of synaptic plasticity, such as long-term potentiation (LTP) and long-term depression (LTD), are thought to mediate the storage of information about stimuli, or features of stimuli, in a neural circuit. Since its discovery in the early 1970s, synaptic plasticity has become a central subject of neuroscience, and many studies have centered on understanding its mechanisms as well as its functional implications.
Cynthia M. Harley and Mark K. Asplen
Annelid worms are simultaneously an interesting and difficult model system for understanding the evolution of animal vision. On the one hand, a wide variety of photoreceptor cells and eye morphologies are exhibited within a single phylum; on the other, annelid phylogenetics has been substantially re-envisioned within the last decade, suggesting the possibility of considerable convergent evolution. This article reviews the comparative anatomy of annelid visual systems within the context of the specific behaviors exhibited by these animals. Each of the major classes of annelid visual systems is examined, including both simple photoreceptor cells (including leech body eyes) and photoreceptive cells with pigment (trochophore larval eyes, ocellar tubes, complex eyes); meanwhile, behaviors examined include differential mobility and feeding strategies, similarities (or differences) in larval versus adult visual behaviors within a species, visual signaling, and depth sensing. Based on our review, several major trends in the comparative morphology and ethology of annelid vision are highlighted: (1) eye complexity tends to increase with mobility and higher-order predatory behavior; (2) even simple photosensors can relay complex information when present in large numbers or combined with other sensory modalities; (3) polychaete larval and adult eye morphology can differ strongly in many mobile species, but not in many sedentary species; and (4) annelids exhibiting visual signaling possess even more complex visual systems than expected, suggesting the possibility that complex eyes can be simultaneously well adapted to multiple visual tasks.
Anitha Pasupathy, Yasmine El-Shamayleh, and Dina V. Popovkina
Humans and other primates rely on vision. Our visual system endows us with the ability to perceive, recognize, and manipulate objects, to avoid obstacles and dangers, to choose foods appropriate for consumption, to read text, and to interpret facial expressions in social interactions. To support these visual functions, the primate brain captures a high-resolution image of the world in the retina and, through a series of intricate operations in the cerebral cortex, transforms this representation into a percept that reflects the physical characteristics of objects and surfaces in the environment. To construct a reliable and informative percept, the visual system discounts the influence of extraneous factors such as illumination, occlusions, and viewing conditions. This perceptual “invariance” can be thought of as the brain’s solution to an inverse inference problem in which the physical factors that gave rise to the retinal image are estimated. While the processes of perception and recognition seem fast and effortless, they pose a challenging computational problem that engages a substantial proportion of the primate brain.
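The inverse inference problem mentioned above can be made concrete with a toy example of my own (not from the article): retinal luminance is roughly the product of surface reflectance and illumination, so a single luminance value is consistent with many (reflectance, illumination) pairs, and a prior over illumination is one way to "discount" it. All names and numbers below are illustrative assumptions.

```python
# Toy sketch: recovering surface reflectance from luminance is ill-posed,
# because luminance = reflectance * illumination. A prior over illumination
# disambiguates, illustrating how extraneous factors can be discounted.
import numpy as np

def infer_reflectance(luminance, illum_prior_mean=0.5, illum_prior_sd=0.15):
    """MAP-style reflectance estimate under a Gaussian prior on illumination."""
    illum = np.linspace(0.01, 1.0, 1000)           # candidate illumination levels
    refl = np.clip(luminance / illum, 0.0, 1.0)    # reflectance implied by each
    prior = np.exp(-0.5 * ((illum - illum_prior_mean) / illum_prior_sd) ** 2)
    return float(refl[np.argmax(prior)])           # pick most probable illuminant

# The same luminance (0.3) is perceived differently under different
# assumed illuminants:
bright = infer_reflectance(0.3, illum_prior_mean=0.9)   # dark surface, bright light
dim = infer_reflectance(0.3, illum_prior_mean=0.35)     # lighter surface, dim light
```

The point of the sketch is only that the estimate depends on the assumed illuminant, which is the essence of the inverse problem the abstract describes.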
Kathleen E. Cullen
As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was a result of active self-motion through the world or whether it was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Our current level of understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities is reviewed.
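The distinction between actively generated and externally applied motion is often modeled as subtraction of an efference-copy prediction from the sensory inflow; the following is a minimal sketch under that common assumption (the function, gain, and numbers are hypothetical, not taken from the review).

```python
# Sketch of efference-copy cancellation: a forward model predicts the sensory
# consequence of the brain's own motor command, and subtracting that prediction
# leaves only externally generated motion in the residual.
def residual_motion(sensed_velocity, motor_command, forward_model_gain=1.0):
    """Sensory inflow minus the prediction from a (hypothetical) forward model."""
    predicted = forward_model_gain * motor_command  # efference-copy prediction
    return sensed_velocity - predicted

# Active head turn at 40 deg/s: fully predicted, so the residual is ~0.
active = residual_motion(sensed_velocity=40.0, motor_command=40.0)
# Same sensed velocity applied passively (no motor command): residual is 40.
passive = residual_motion(sensed_velocity=40.0, motor_command=0.0)
```

The residual signal, not the raw inflow, is what would support perceptual stability in this scheme.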
Tatyana O. Sharpee
Sensory systems exist to provide an organism with information about the state of the environment that can be used to guide future actions and decisions. Remarkably, two conceptually simple yet general theorems from information theory can be used to evaluate the performance of any sensory system. One theorem states that there is a minimal amount of energy that an organism has to spend in order to capture a given amount of information about the environment. The second theorem states that the maximum rate at which the organism can acquire resources from the environment, relative to its competitors, is limited by the information this organism collects about the environment, also relative to its competitors.
These two theorems provide a scaffold for formulating and testing general principles of sensory coding but leave unanswered many important practical questions of implementation in neural circuits. These implementation questions have guided thinking in entire subfields of sensory neuroscience, and include: What features in the sensory environment should be measured? Given that we make decisions on a variety of time scales, how should one resolve the trade-off between simpler measurements that guide minimal decisions and more elaborate sensory systems that must overcome multiple delays between sensation and action? Once we agree on the types of features that are important to represent, how should they be represented? How should resources be allocated between different stages of processing, and where is the impact of noise most damaging? Finally, one should consider trade-offs between implementing a fixed strategy and an adaptive scheme that readjusts resources based on current needs. Where adaptation is considered, under what conditions does it become optimal to switch strategies? Research over the past 60 years has provided answers to almost all of these questions, but primarily in early sensory systems. Joining these answers into a comprehensive framework is a challenge that will help us understand who we are and how we can make better use of limited natural resources.
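The two theorems are not stated formally in the abstract; one common reading identifies the first with Landauer's bound on the energy cost of information and the second with Kelly-type results on the growth value of information. A hedged gloss, with symbols that are mine rather than the article's:

```latex
% (1) Energy cost of information (Landauer's bound): acquiring and
% ultimately erasing one bit at temperature T dissipates at least
E_{\min} \ge k_B T \ln 2
% (2) Growth value of information (a Kelly-type bound): the gain
% \Delta W in long-term resource growth rate afforded by a sensory
% signal S is limited by its mutual information with the
% environmental state E:
\Delta W \le I(S; E)
```

Both inequalities are standard results; whether they are exactly the theorems the author has in mind is my inference from the wording.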
Quentin Gaudry and Jonathan Schenk
Olfactory systems are tasked with converting the chemical environment into electrical signals that the brain can use to optimize behaviors such as navigating towards resources, finding mates, or avoiding danger. Drosophila melanogaster has long served as a model system for several attributes of olfaction, including sensory coding, development, and efforts to link sensory perception to behavior. The strength of Drosophila as a model system for neurobiology lies in the myriad genetic tools available to the experimentalist and, equally importantly, in the small number of cells within the olfactory circuit. Modern techniques have recently made it possible to target nearly all cell types in the antennal lobe to directly monitor their physiological activity or to alter their expression of endogenous proteins or transgenes.
Cynthia F. Moss
Echolocating bats have evolved an active sensing system, which supports 3D perception of objects in the surroundings and permits spatial navigation in complete darkness. Echolocating animals produce high frequency sounds and use the arrival time, intensity, and frequency content of echo returns to determine the distance, direction, and features of objects in the environment. Over 1,000 species of bats echolocate with signals produced in their larynges. They use diverse sonar signal designs, operate in habitats ranging from tropical rain forest to desert, and forage for different foods, including insects, fruit, nectar, small vertebrates, and even blood. Specializations of the mammalian auditory system, coupled with high frequency hearing, enable spatial imaging by echolocation in bats. Specifically, populations of neurons in the bat central nervous system respond selectively to the direction and delay of sonar echoes. In addition, premotor neurons in the bat brain are implicated in the production of sonar calls, along with movement of the head and ears. Audio-motor circuits, within and across brain regions, lay the neural foundation for acoustic orientation by echolocation in bats.
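The ranging computation described above, recovering target distance from the pulse-to-echo delay, can be sketched in a few lines; the speed-of-sound value is a standard figure for air at room temperature, not a number from the article.

```python
# Sonar ranging: a sound pulse travels to the target and back, so the
# one-way distance is half the round-trip path length.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C

def target_distance(echo_delay_s):
    """Distance to a reflecting object, given the pulse-to-echo delay in seconds."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# A 10 ms echo delay corresponds to a target about 1.7 m away.
d = target_distance(0.010)
```

Delay-tuned neurons in the bat midbrain and cortex are thought to represent exactly this delay-to-distance mapping.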
Douglas K. Reilly and Jagan Srinivasan
To survive, animals must properly sense their surrounding environment. The senses that allow animals to detect changes in their environment can be categorized as tactile, thermal, aural, or olfactory. Olfaction is one of the most primitive senses, involving the detection of environmental chemical cues. Organisms must sense and discriminate between abiotic and biogenic cues, necessitating a system that can react and respond to changes quickly. The nematode Caenorhabditis elegans offers a unique set of tools for studying the biology of olfactory sensation.
The olfactory system in C. elegans comprises 14 pairs of amphid neurons in the head and two pairs of phasmid neurons in the tail. The male nervous system contains an additional 89 neurons, many of which are exposed to the environment and contribute to olfaction. The cues sensed by these olfactory neurons initiate a multitude of responses, ranging from developmental changes to behavioral responses. Environmental cues might initiate entry into or exit from a long-lived alternative larval developmental stage (dauer), or pheromonal stimuli may attract sexually mature mates or repel conspecifics in crowded environments. C. elegans is also capable of sensing abiotic stimuli, exhibiting attraction and repulsion to diverse classes of chemicals. Unlike canonical mammalian olfactory neurons, C. elegans chemosensory neurons express more than one receptor per cell. This enables detection of hundreds of chemical structures and concentrations by a chemosensory nervous system with few cells. However, each neuron detects certain classes of olfactory cues and, in combination with its synaptic pathways, elicits similar responses (i.e., aversive behaviors). The functional architecture of this chemosensory system supports the development and behavior of nematodes efficiently enough to have allowed the genus a cosmopolitan distribution.
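A back-of-the-envelope illustration (mine, not the article's) of why multi-receptor neurons make such a small chemosensory system so expressive: if cues are read out from the pattern of active neurons rather than from single labeled lines, the number of distinguishable binary activation patterns grows exponentially with the number of neurons.

```python
# Toy combinatorial-coding count: n neurons, each either active or silent,
# can in principle take on 2**n distinct joint activation patterns.
from itertools import product

def distinct_patterns(n_neurons):
    """Number of distinct binary activation patterns across n neurons."""
    return len(set(product([0, 1], repeat=n_neurons)))

# 14 amphid neuron pairs could in principle support thousands of patterns,
# far more than the number of cells.
patterns = distinct_patterns(14)
```

Real coding is of course graded and constrained by noise, so this is an upper bound on the pattern count, not a claim about actual discrimination capacity.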
Much progress has been made in unraveling the mechanisms that underlie the transition from acute to chronic pain. Traditional beliefs are being replaced by novel, more powerful concepts that consider the mutual interplay of neuronal and non-neuronal cells in the nervous system during the pathogenesis of chronic pain. The new focus is on the role of neuroinflammation for neuroplasticity in nociceptive pathways and for the generation, amplification, and mislocation of pain. The latest insights are reviewed here and provide a basis for understanding the interdependence of chronic pain and its comorbidities. The new concepts will guide the search for future therapies to prevent and reverse chronic pain.
Long-term changes in the properties and functions of nerve cells, including changes in synaptic strength, membrane excitability, and the effects of inhibitory neurotransmitters, can result from a wide variety of conditions. In the nociceptive system, painful stimuli, peripheral inflammation, nerve injuries, and the use of or withdrawal from opioids can all lead to enhanced pain sensitivity, to the generation of pain, and/or to the spread of pain to unaffected sites of the body. Non-neuronal cells, especially microglia and astrocytes, contribute to changes in nociceptive processing. Recent studies have revealed not only that glial cells support neuroplasticity but also that their activation can trigger long-term changes in the nociceptive system.
Angel Ariel Caputi
American gymnotiformes and African mormyriformes have evolved an active sensory system that uses a self-generated electric field as a carrier of signals. Objects polarized by the discharge of a specialized electric organ project their images onto the skin, where electroreceptors tuned to the time course of the self-generated field transduce local signals carrying information about the impedance, shape, size, and location of objects, as well as electrocommunication messages, and encode them as trains of spikes in primary afferents. This system is integrated with other cutaneous systems (passive electroreception and mechanoreception) as well as with proprioceptive signals reporting the shape of the fish’s body. Primary afferents project to the electrosensory lobe, where electrosensory signals are compared with expectation signals resulting from the integration of recent past electrosensory input, other sensory input, and, in the case of mormyriformes, electro- and skeleton-motor corollary discharges. This ensemble of signals converges on the apical dendrites of the principal cells, where a working memory of the recent past, and therefore of the predictable input, is continuously built up and updated as a pattern of synaptic weights. The efferent neurons of the electrosensory lobe also project to the torus and indirectly to other brainstem nuclei that implement automatic electro- and skeleton-motor behaviors. Finally, the torus projects via the preglomerular nucleus to the telencephalon, where cognitive functions, including “electroperception” of shape-, size-, and impedance-related features of objects, recognition of conspecifics, perception-based decisions, learning, and abstraction, are organized.
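The buildup of a "working memory" of predictable input as a pattern of synaptic weights is commonly modeled with an anti-Hebbian learning rule that learns a negative image of the reafferent signal; the following is a minimal sketch under that assumption (function names, learning rate, and trial count are mine, not the article's).

```python
# Negative-image learning sketch: an anti-Hebbian rule adjusts the weight on
# a corollary-discharge input until it cancels whatever sensory input reliably
# follows the fish's own discharge, leaving novel stimuli to drive responses.
def learn_negative_image(reafference, n_trials=200, lr=0.05):
    """Weight on the corollary-discharge input after repeated pairing."""
    w = 0.0
    for _ in range(n_trials):
        output = reafference + w   # sensory input plus learned prediction
        w -= lr * output           # anti-Hebbian update driven by the residual
    return w

w = learn_negative_image(reafference=1.0)
residual = 1.0 + w  # post-learning response to the predictable input (near zero)
```

After learning, the predictable component is cancelled while an unexpected stimulus of the same size would still evoke a full response, which is the computational point of the expectation-comparison circuitry described above.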