Estimation of contrast sensitivity from fixational eye movements
Denniss, Jonathan; Scholes, Chris; McGraw, Paul V.; Nam, Se-Ho; Roach, Neil W.
Authors
Dr Chris Scholes (Chris.Scholes@nottingham.ac.uk), Assistant Professor in Psychology
Professor Paul McGraw (paul.mcgraw@nottingham.ac.uk), Professor of Visual Neuroscience
Se-Ho Nam
Professor Neil Roach (neil.roach@nottingham.ac.uk), Professor of Vision Science
Abstract
Purpose: Even during steady fixation, people make small eye movements such as microsaccades, whose rate is altered by presentation of salient stimuli. Our goal was to develop a practical method for objectively and robustly estimating contrast sensitivity from microsaccade rates in a diverse population.
Methods: Participants, recruited to cover a range of contrast sensitivities, were visually normal (n = 19), amblyopic (n = 10), or had cataract (n = 9). Monocular contrast sensitivity was estimated behaviorally while binocular eye movements were recorded during interleaved passive trials. A probabilistic inference approach was used to establish the likelihood of observed microsaccade rates given the presence or absence of a salient stimulus. Contrast sensitivity was then estimated from a function fitted to the scaled log-likelihood ratios across a range of stimulus contrasts (see the sketch following the abstract).
Results: Microsaccade rate signatures were heterogeneous in shape; nevertheless, estimates of contrast sensitivity could be obtained in all participants. Microsaccade-based estimates of contrast sensitivity were unbiased relative to behavioral estimates (mean bias 1.2%), with which they were strongly correlated (Spearman's ρ = 0.74, P < 0.001; median absolute difference 7.6%). Measurement precision of the microsaccade-based estimates was worse than that of behavioral estimates, with more than 20 times as many stimulus presentations required to equate precision.
Conclusions: Microsaccade rate signatures are heterogeneous in shape when measured across populations with a broad range of contrast sensitivities. Contrast sensitivity can be robustly estimated from rate signatures by probabilistic inference, but achieving estimates as precise as those from behavioral techniques currently requires many more stimulus presentations.
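The estimation pipeline described in the Methods can be illustrated with a minimal sketch. The version below assumes Gaussian-distributed microsaccade rates under stimulus-present and stimulus-absent hypotheses, min-max scaling of the log-likelihood ratios, and a logistic function of log contrast; all variable names, numbers, and distributional choices are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical rate models (illustrative numbers, not from the paper):
# post-stimulus microsaccade rates (per second) modelled as Gaussian
# under each hypothesis. Salient stimuli transiently suppress the rate.
ABSENT_MU, ABSENT_SD = 1.4, 0.4    # baseline rate during fixation
PRESENT_MU, PRESENT_SD = 0.6, 0.4  # suppressed rate after a visible stimulus

def log_likelihood_ratio(rate):
    """log P(rate | stimulus present) - log P(rate | stimulus absent)."""
    return (norm.logpdf(rate, PRESENT_MU, PRESENT_SD)
            - norm.logpdf(rate, ABSENT_MU, ABSENT_SD))

def psychometric(log_c, log_thresh, slope):
    """Logistic function of log contrast fitted to the scaled LLRs."""
    return 1.0 / (1.0 + np.exp(-slope * (log_c - log_thresh)))

# Simulated observed rates at each stimulus contrast: rates move toward
# the "present" model as contrast rises above threshold (illustrative only).
contrasts = np.array([0.005, 0.01, 0.02, 0.04, 0.08, 0.16, 0.32])
rng = np.random.default_rng(1)
visibility = psychometric(np.log10(contrasts), np.log10(0.03), 6.0)
rates = (ABSENT_MU + (PRESENT_MU - ABSENT_MU) * visibility
         + rng.normal(0.0, 0.1, contrasts.size))

# Scale the LLRs to [0, 1] and fit the function across contrast; the
# contrast at the fitted threshold gives sensitivity as its reciprocal.
llr = log_likelihood_ratio(rates)
scaled = (llr - llr.min()) / (llr.max() - llr.min())
(log_thresh, slope), _ = curve_fit(psychometric, np.log10(contrasts), scaled,
                                   p0=[np.log10(0.04), 3.0])

print(f"estimated contrast threshold:   {10 ** log_thresh:.3f}")
print(f"estimated contrast sensitivity: {1 / 10 ** log_thresh:.1f}")
```

In the study itself, the likelihoods were derived from the observed microsaccade rate signatures rather than from fixed Gaussian models; the Gaussian stand-in here simply keeps the example self-contained and runnable.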
Citation
Denniss, J., Scholes, C., McGraw, P. V., Nam, S.-H., & Roach, N. W. (2018). Estimation of contrast sensitivity from fixational eye movements. Investigative Ophthalmology & Visual Science, 59(13), 5408-5416. https://doi.org/10.1167/iovs.18-24674
Journal Article Type | Article |
---|---|
Acceptance Date | Oct 1, 2018 |
Publication Date | Nov 14, 2018 |
Deposit Date | Nov 27, 2018 |
Publicly Available Date | Nov 27, 2018 |
Journal | Investigative Ophthalmology & Visual Science |
Print ISSN | 0146-0404 |
Electronic ISSN | 1552-5783 |
Publisher | Association for Research in Vision and Ophthalmology |
Peer Reviewed | Peer Reviewed |
Volume | 59 |
Issue | 13 |
Pages | 5408-5416 |
DOI | https://doi.org/10.1167/iovs.18-24674 |
Keywords | microsaccades; fixational eye movements; contrast sensitivity; objective |
Public URL | https://nottingham-repository.worktribe.com/output/1311152 |
Publisher URL | https://iovs.arvojournals.org/article.aspx?articleid=2715095#203285609 |
Files
Estimation of Contrast Sensitivity From Fixational Eye Movements (PDF, 1 MB)
Publisher Licence URL: https://creativecommons.org/licenses/by/4.0/