Research Repository

All Outputs (18)

Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies (2021)
Journal Article
Watson, D. M., Akeroyd, M. A., Roach, N. W., & Webb, B. S. (2021). Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies. PLoS ONE, 16(5), 1-21. https://doi.org/10.1371/journal.pone.0251827

In dynamic multisensory environments, the perceptual system corrects for discrepancies arising between modalities. For instance, in the ventriloquism aftereffect (VAE), spatial disparities introduced between visual and auditory stimuli lead to a perc...

Learning to silence saccadic suppression (2021)
Journal Article
Scholes, C., McGraw, P. V., & Roach, N. W. (2021). Learning to silence saccadic suppression. Proceedings of the National Academy of Sciences, 118(6), Article e2012937118. https://doi.org/10.1073/pnas.2012937118

Perceptual stability is facilitated by a decrease in visual sensitivity during rapid eye movements, called saccadic suppression. While a large body of evidence demonstrates that saccadic programming is plastic, little is known about whether the perce...

Adaptation reveals multi-stage coding of visual duration (2019)
Journal Article
Heron, J., Fulcher, C., Collins, H., Whitaker, D., & Roach, N. W. (2019). Adaptation reveals multi-stage coding of visual duration. Scientific Reports, 9, 1-11. https://doi.org/10.1038/s41598-018-37614-3

In conflict with historically dominant models of time perception, recent evidence suggests that the encoding of our environment’s temporal properties may not require a separate class of neurons whose raison d'être is the dedicated processing of tempor...

Estimation of contrast sensitivity from fixational eye movements (2018)
Journal Article
Denniss, J., Scholes, C., McGraw, P. V., Nam, S.-H., & Roach, N. W. (2018). Estimation of contrast sensitivity from fixational eye movements. Investigative Ophthalmology & Visual Science, 59(13), 5408-5416. https://doi.org/10.1167/iovs.18-24674

Purpose: Even during steady fixation, people make small eye movements such as microsaccades, whose rate is altered by presentation of salient stimuli. Our goal was to develop a practical method for objectively and robustly estimating contrast sensiti...

Visual crowding is unaffected by adaptation-induced spatial compression (2018)
Journal Article
Chambers, A. L., Roach, N. W., & Johnston, A. (2018). Visual crowding is unaffected by adaptation-induced spatial compression. Journal of Vision, 18(3), Article 12. https://doi.org/10.1167/18.3.12

It has recently been shown that adapting to a densely textured stimulus alters the perception of visual space, such that the distance between two points subsequently presented in the adapted region appears reduced (Hisakata, Nishida, & Johnston, 2016...

Selective modulation of visual sensitivity during fixation (2018)
Journal Article
Scholes, C. D., McGraw, P. V., & Roach, N. W. (2018). Selective modulation of visual sensitivity during fixation. Journal of Neurophysiology, 119(6). https://doi.org/10.1152/jn.00819.2017

During periods of steady fixation, we make small-amplitude ocular movements, termed microsaccades, at a rate of 1-2 per second. Early studies provided evidence that visual sensitivity is reduced during microsaccades - akin to the well-established s...

Rate after-effects fail to transfer cross-modally: evidence for distributed sensory timing mechanisms (2018)
Journal Article
Motola, A., Heron, J., McGraw, P. V., Roach, N. W., & Whitaker, D. (2018). Rate after-effects fail to transfer cross-modally: evidence for distributed sensory timing mechanisms. Scientific Reports, 8, Article 924. https://doi.org/10.1038/s41598-018-19218-z

Accurate time perception is critical for a number of human behaviours, such as understanding speech and the appreciation of music. However, it remains unresolved whether sensory time perception is mediated by a central timing component regulating all...

New insights into the role of motion and form vision in neurodevelopmental disorders (2017)
Journal Article
Johnston, R., Pitchford, N. J., Roach, N. W., & Ledgeway, T. (2017). New insights into the role of motion and form vision in neurodevelopmental disorders. Neuroscience and Biobehavioral Reviews, 83. https://doi.org/10.1016/j.neubiorev.2017.09.031

A selective deficit in processing the global (overall) motion, but not form, of spatially extensive objects in the visual scene is frequently associated with several neurodevelopmental disorders, including preterm birth. Existing theories that propos...

Generalization of prior information for rapid Bayesian time estimation (2016)
Journal Article
Roach, N. W., McGraw, P. V., Whitaker, D., & Heron, J. (2017). Generalization of prior information for rapid Bayesian time estimation. Proceedings of the National Academy of Sciences, 114(2), 412-417. https://doi.org/10.1073/pnas.1610706114

To enable effective interaction with the environment, the brain combines noisy sensory information with expectations based on prior experience. There is ample evidence showing that humans can learn statistical regularities in sensory input and exploi...
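
As a brief illustrative aside, the following is a minimal sketch of the standard Gaussian observer model that this kind of Bayesian account builds on; it is not reproduced from the article, and the symbols are generic notation rather than the paper's. Combining a noisy sensory measurement m of a duration d (likelihood variance \sigma_m^2) with a Gaussian prior of mean \mu_p and variance \sigma_p^2 gives a posterior whose mean is a reliability-weighted average:

\[
\hat{d} = \frac{\sigma_p^{2}}{\sigma_p^{2} + \sigma_m^{2}}\, m + \frac{\sigma_m^{2}}{\sigma_p^{2} + \sigma_m^{2}}\, \mu_p
\]

The noisier the measurement (larger \sigma_m^2), the more the estimate is drawn toward the prior mean \mu_p.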

Object size determines the spatial spread of visual time (2016)
Journal Article
Fulcher, C., McGraw, P. V., Roach, N. W., Whitaker, D., & Heron, J. (2016). Object size determines the spatial spread of visual time. Proceedings of the Royal Society B: Biological Sciences, 283(1835), Article 20161024. https://doi.org/10.1098/rspb.2016.1024

A key question for temporal processing research is how the nervous system extracts event duration, despite a notable lack of neural structures dedicated to duration encoding. This is in stark contrast to the orderly arrangement of neurons tasked with...

Why is the processing of global motion impaired in adults with developmental dyslexia? (2016)
Journal Article
Johnston, R., Pitchford, N. J., Roach, N. W., & Ledgeway, T. (2016). Why is the processing of global motion impaired in adults with developmental dyslexia? Brain and Cognition, 108, 20-31. https://doi.org/10.1016/j.bandc.2016.07.004

Individuals with dyslexia are purported to have a selective dorsal stream impairment that manifests as a deficit in perceiving visual global motion relative to global form. However, the underlying nature of the visual deficit in readers with dyslexia...

Perceptual learning shapes multisensory causal inference via two distinct mechanisms (2016)
Journal Article
McGovern, D. P., Roudaia, E., Newell, F. N., & Roach, N. W. (2016). Perceptual learning shapes multisensory causal inference via two distinct mechanisms. Scientific Reports, 6, Article 24673. https://doi.org/10.1038/srep24673

To accurately represent the environment, our brains must integrate sensory signals from a common source while segregating those from independent sources. A reasonable strategy for performing this task is to restrict integration to cues that coincide...

Fixational eye movements predict visual sensitivity (2015)
Journal Article
Scholes, C., McGraw, P. V., Nyström, M., & Roach, N. W. (2015). Fixational eye movements predict visual sensitivity. Proceedings of the Royal Society B: Biological Sciences, 282(1817), Article 20151568. https://doi.org/10.1098/rspb.2015.1568

During steady fixation, observers make small fixational saccades at a rate of around 1–2 per second. Presentation of a visual stimulus triggers a biphasic modulation in fixatio...

Adaptation to implied tilt: extensive spatial extrapolation of orientation gradients (2013)
Journal Article
Roach, N. W., & Webb, B. S. (2013). Adaptation to implied tilt: extensive spatial extrapolation of orientation gradients. Frontiers in Psychology, 4, Article 438. https://doi.org/10.3389/fpsyg.2013.00438

To extract the global structure of an image, the visual system must integrate local orientation estimates across space. Progress is being made toward understanding this integration process, but very little is known about whether the presence of struc...

Perceptual learning reconfigures the effects of visual adaptation (2012)
Journal Article
McGovern, D. P., Roach, N. W., & Webb, B. S. (2012). Perceptual learning reconfigures the effects of visual adaptation. Journal of Neuroscience, 32(39). https://doi.org/10.1523/JNEUROSCI.1363-12.2012

Our sensory experiences over a range of different timescales shape our perception of the environment. Two particularly striking short-term forms of plasticity with manifestly different time courses and perceptual consequences are those caused by visu...

Visual motion induces a forward prediction of spatial pattern (2011)
Journal Article
Roach, N. W., McGraw, P. V., & Johnston, A. (2011). Visual motion induces a forward prediction of spatial pattern. Current Biology, 21(9). https://doi.org/10.1016/j.cub.2011.03.031

Cortical motion analysis continuously encodes image velocity but might also be used to predict future patterns of sensory input along the motion path. We asked whether this predictive aspect of motion is exploited by the human visual system. Targets...

Asynchrony adaptation reveals neural population code for audio-visual timing (2011)
Journal Article
Roach, N. W., Heron, J., Whitaker, D., & McGraw, P. V. (2011). Asynchrony adaptation reveals neural population code for audio-visual timing. Proceedings of the Royal Society B: Biological Sciences, 278(1710). https://doi.org/10.1098/rspb.2010.1737

The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships re...