COMBINING THE SENSES
How does the brain combine inputs from the different sensory modalities? We have studied this question using behavioural methods, EEG and brain imaging.
One of our main behavioural findings is that spatial and motion congruency (same hemifield, same motion direction) is a crucial factor when dynamic signals from the auditory and visual modalities are combined: integration is most effective (closest to linear summation) when the signals are co-localised and move in the same direction (H+D+). When the signals are inconsistent in motion direction or location, an independent-decisions model accounts best for the data (Meyer et al., 2005).
In many instances, auditory and visual inputs carry not only information about motion direction or location, but also semantic information. A good example is lip reading, where both the auditory and visual signals carry semantic information, namely speech. Another example is biological motion, often studied with a point-light walker; the human brain is good at extracting body motion from reduced visual and auditory signals (e.g. a small set of moving dots). The question is how sensitive the brain is to semantic incongruences between the auditory and visual modalities. To investigate the semantic selectivity of these integration mechanisms, we used three different categories of stimuli: speech (SP), body actions (BM) and random patterns (SCR).
When we measured brain activity for these bimodal stimuli (either semantically matched or non-matched), we identified an extensive brain network involved in auditory-visual integration, including the pSTS, IFG, IPL, SMA, the anterior insula and premotor areas (Meyer et al., 2011). However, semantic incongruency (a mismatch between meaningful stimuli, i.e. speech and body motion) primarily modulated the pSTS and frontal areas, but not premotor areas. Our findings are consistent with the hypothesis that the pSTS and frontal areas form a recognition network that combines sensory categorical representations (in pSTS) with action hypothesis generation in inferior frontal gyrus/premotor areas. We argue that the same neural networks process speech and body actions.
In addition to semantic and spatial congruency, temporal alignment is an important heuristic for sensory integration and for whether the input from different senses is perceived as a single event. The perceived simultaneity of auditory and visual events depends to some extent on the intensities of the unimodal stimuli, partly because of different processing speeds in the auditory and visual systems. We have shown that the intensity-dependence of perceived synchrony is explained by early intensity-dependent processing latencies of the unimodal signals (Horsfall et al., 2020a). Furthermore, perceived synchrony is plastic and can be altered by training; we have shown that this perceptual training is specific to the intensity of the stimuli and does not generalise across intensities (Horsfall et al., 2020b).
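The latency account can be sketched with a simple Piéron-style model: visual processing latency shrinks as stimulus intensity grows, and the predicted point of subjective simultaneity (PSS) is simply the residual audio-visual latency difference. All parameter values below are invented for illustration, not the fitted values from Horsfall et al. (2020a).

```python
def visual_latency(intensity, base=60.0, gain=120.0, exponent=0.5):
    """Piéron-style latency function: brighter stimuli are processed faster.
    Parameter values are illustrative, not fitted to the published data."""
    return base + gain * intensity ** (-exponent)

def predicted_pss(intensity, auditory_latency=40.0):
    """Point of subjective simultaneity: the onset asynchrony that equates
    the internal arrival times. Positive = vision must lead the sound."""
    return visual_latency(intensity) - auditory_latency

# Dimmer visual stimuli need a larger visual lead to appear simultaneous.
for i in (1.0, 4.0, 16.0):
    print(i, predicted_pss(i))
```

This reproduces the qualitative pattern in the data: as visual intensity increases, the visual lead required for perceived simultaneity shrinks.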
Smelling Sensations: interactions between odours and other senses
Olfaction is ingrained in the fabric of our daily life and constitutes an integral part of our perceptual reality. We investigated crossmodal correspondences between ten olfactory stimuli and other modalities (angularity of shapes, smoothness of texture, pleasantness, pitch, colours, musical genres and emotional dimensions). Robust associations were found for most pairings, apart from musical genres (Ward et al., 2020a). Applications in virtual reality will be explored, using odour stimulation as a tool to enhance immersiveness (Ward et al., 2020b).
Georg Meyer, University of Liverpool
Mark Greenlee, University of Regensburg
Neil Harrison, Hope University
Ryan Horsfall, University of Manchester
Ryan Ward, Electrical Engineering, University of Liverpool
Alan Marshall, Electrical Engineering, University of Liverpool
Meyer, G.F., & Wuerger, S.M. (2001). Crossmodal integration of auditory and visual motion signals. NeuroReport, 12, 2557-2560.
Wuerger, S.M., Hofbauer, M., & Meyer, G. (2003). The integration of auditory and visual motion signals at threshold. Perception & Psychophysics, 65(8), 1188-1196.
Meyer, G.F., Mulligan, J., & Wuerger, S.M. (2004). Continuous audio-visual digit recognition using N-best decision fusion. Information Fusion, 5, 91-101.
Hofbauer, M., Wuerger, S. M., Meyer, G. F., Roehrbein, M., Schill, K., & Zetzsche, C. (2004). Catching audio-visual mice: Predicting the arrival time of auditory-visual motion signals. Cognitive, Affective & Behavioral Neuroscience, 4(2), 241–250
Meyer, G.F., Wuerger, S.M., Roehrbein, M., & Zetzsche, C. (2005). Low-level integration of auditory and visual motion signals requires spatial co-localisation. Experimental Brain Research, 166(3-4), 538-547.
Wuerger, S.M., Meyer, G., Hofbauer, M., Schill, K., & Zetzsche, C. (2010). Motion extrapolation of auditory-visual targets. Information Fusion, 11, 45-50.
Harrison, N. R., Wuerger, S. M., & Meyer, G. F. (2011). Reaction time facilitation for horizontally moving auditory-visual stimuli. Journal of Vision, 10(14), 1-21.
Meyer, G., Greenlee, M., & Wuerger, S. (2011) Interactions between Auditory and Visual Semantic Stimulus Classes: Evidence for Common Processing Networks for Speech and Body Actions. Journal of Cognitive Neuroscience, 23(9), 2271-2288.
Wuerger, S.M., Crocker-Buque, A., & Meyer, G. (2011). Evidence for auditory-visual processing specific to biological motion. Seeing and Perceiving, 25, 15-28.
Wuerger, S., Parkes, L., Lewis, P.A., Crocker-Buque, A., Rutschmann, R., & Meyer, G. F. (2012). Premotor Cortex Is Sensitive to Auditory–Visual Congruence for Biological Motion. Journal of Cognitive Neuroscience, 24(3), pp. 575-587. doi:10.1162/jocn_a_00173
Meyer, G. F., Harrison, N. R., & Wuerger, S. M. (2013). The time course of auditory–visual processing of speech and body actions: Evidence for the simultaneous activation of an extended neural network for semantic processing. Neuropsychologia, 51(9), 1716-1725. doi: http://dx.doi.org/10.1016/j.neuropsychologia.2013.05.014
Harrison, N.R., Witheridge, S., Makin, A., Wuerger, S., Pegna, A.J., & Meyer, G. (2015). The effects of stereo disparity on the behavioural and electrophysiological correlates of perception of audio-visual motion in depth. Neuropsychologia. DOI: 10.1016/j.neuropsychologia.2015.09.023
Horsfall, R., Wuerger, S., & Meyer, G. (2020a). Visual intensity-dependent response latencies predict perceived audio-visual simultaneity. Journal of Mathematical Psychology.
Horsfall, R.P., Wuerger, S.M., & Meyer, G.F. (2020b). Narrowing of the audio-visual temporal binding window due to perceptual training is specific to high visual intensity stimuli. i-Perception (accepted for publication).
Ward, R.J., Wuerger, S., & Marshall, A. (2020a). Smelling sensations: olfactory crossmodal correspondences. Journal of Perceptual Imaging (under review). bioRxiv 2020.04.15.042630.
Ward, R.J., Jjunju, P.M., Griffith, E.J., Wuerger, S.M., & Marshall, A. (2020b). Artificial odour-vision synaesthesia via olfactory sensory augmentation. IEEE Sensors Journal (accepted).
Acknowledgement of Support
The Wellcome Trust
The Royal Society