Many organisms possess multiple sensory systems, such as vision, hearing, touch, smell, and taste. Possessing multiple ways of sensing the world offers many benefits. These benefits arise not only because each modality can sense different aspects of the environment, but also because different senses can respond jointly to the same external object or event, thus enriching the overall experience: for example, looking at an individual while listening to them speak. However, combining information from different senses also poses many challenges for the nervous system. In recent years there has been dramatic progress in understanding how information from different sensory modalities is integrated to construct useful representations of external space, and how such multimodal representations constrain spatial attention. This progress has involved numerous disciplines, including neurophysiology, experimental psychology, neurological work with brain-damaged patients, neuroimaging, and computational modelling.
This volume brings together leading researchers from all these approaches to present the first integrative overview of this central topic in cognitive neuroscience.