Neil Roach (neil.roach@nottingham.ac.uk)
Professor of Vision Science
Asynchrony adaptation reveals neural population code for audio-visual timing
Roach, Neil W.; Heron, James; Whitaker, David; McGraw, Paul V.
Abstract
The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible—adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from neural processes analogous to those underlying well-known perceptual after-effects.
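The model's logic can be illustrated with a minimal sketch. All parameters below (number of neurons, Gaussian tuning widths, gain-suppression profile) are hypothetical, and the readout is simplified to a response-weighted average of preferred delays — a stand-in for the paper's readout algorithm, not a reproduction of it. The sketch shows the qualitative prediction: reducing the gain of neurons tuned near an adapted delay repels the decoded delay of a subsequent test stimulus away from the adapted value.

```python
import numpy as np

def tuning_responses(delay_ms, preferred, sigma=80.0, gains=None):
    """Responses of a small bank of delay-tuned neurons with Gaussian tuning.

    delay_ms : audio-visual delay of the test stimulus (ms)
    preferred : array of each neuron's preferred delay (ms)
    gains : per-neuron response gain (1.0 = unadapted)
    """
    if gains is None:
        gains = np.ones_like(preferred, dtype=float)
    return gains * np.exp(-0.5 * ((delay_ms - preferred) / sigma) ** 2)

def decode(delay_ms, preferred, gains=None):
    """Read out perceived delay as the response-weighted average of
    preferred delays (a simple population-vector-style readout)."""
    r = tuning_responses(delay_ms, preferred, gains=gains)
    return np.sum(r * preferred) / np.sum(r)

# Hypothetical bank of 9 neurons spanning -400 to +400 ms.
preferred = np.linspace(-400.0, 400.0, 9)

# Adaptation to a +100 ms asynchrony: suppress gain of neurons
# tuned near the adapted delay (assumed Gaussian suppression profile).
adapt_delay = 100.0
gains = 1.0 - 0.5 * np.exp(-0.5 * ((preferred - adapt_delay) / 80.0) ** 2)

# Decode a +50 ms test stimulus before and after adaptation.
baseline = decode(50.0, preferred)            # ~= 50 ms
adapted = decode(50.0, preferred, gains=gains)

# The decoded delay is repelled away from the adapted delay,
# mimicking a repulsive perceptual after-effect.
print(f"baseline: {baseline:.1f} ms, adapted: {adapted:.1f} ms")
```

With these illustrative numbers, the adapted estimate shifts away from +100 ms relative to baseline, the same repulsive pattern the paper reports for biases induced by asynchrony adaptation.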
Citation
Roach, N. W., Heron, J., Whitaker, D., & McGraw, P. V. (2011). Asynchrony adaptation reveals neural population code for audio-visual timing. Proceedings of the Royal Society B: Biological Sciences, 278(1710). https://doi.org/10.1098/rspb.2010.1737
| Journal Article Type | Article |
| --- | --- |
| Publication Date | May 1, 2011 |
| Deposit Date | Mar 26, 2014 |
| Publicly Available Date | Mar 26, 2014 |
| Journal | Proceedings of the Royal Society B: Biological Sciences |
| Print ISSN | 0962-8452 |
| Electronic ISSN | 1471-2954 |
| Publisher | The Royal Society |
| Peer Reviewed | Peer Reviewed |
| Volume | 278 |
| Issue | 1710 |
| DOI | https://doi.org/10.1098/rspb.2010.1737 |
| Keywords | auditory-visual timing; multisensory; population coding |
| Public URL | https://nottingham-repository.worktribe.com/output/707422 |
| Publisher URL | http://rspb.royalsocietypublishing.org/content/278/1710/1314 |
Files
McGraw_Asynchrony.pdf (944 KB, PDF)
Copyright Statement
Copyright information regarding this work can be found at the following address: http://creativecommons.org/licenses/by/4.0