Measuring nonlinear signal combination using EEG
Cunningham, Darren G.M.; Baker, Daniel H.; Peirce, Jonathan W.
Authors
Daniel H. Baker
Professor Jonathan Peirce (jonathan.peirce@nottingham.ac.uk)
Professor of Psychology Research Methods
Abstract
Relatively little is known about the processes, both linear and nonlinear, by which signals are combined beyond V1. By presenting two stimulus components simultaneously, flickering at different temporal frequencies (frequency tagging), while measuring steady-state visual evoked potentials, we can assess responses to the individual components, including direct measurements of the suppression they exert on each other, and various nonlinear responses to their combination found at intermodulation frequencies. The result is a rather rich dataset of frequencies at which responses can be found. We presented pairs of sinusoidal gratings at different temporal frequencies, forming plaid patterns that were "coherent" (looking like a checkerboard) and "noncoherent" (looking like a pair of transparently overlaid gratings), and found clear intermodulation responses to compound stimuli, indicating nonlinear summation. This might have been attributed to cross-orientation suppression, except that the pattern of intermodulation responses differed for coherent and noncoherent patterns, whereas the effects of suppression (measured at the component frequencies) did not. A two-stage model of nonlinear summation involving conjunction detection with a logical AND gate described the data well, capturing the difference between coherent and noncoherent plaids over a wide array of possible response frequencies. Multistimulus frequency-tagged EEG, in combination with computational modeling, may be a very valuable tool for studying the conjunction of these signals. In the current study, the results suggest a second-order mechanism responding selectively to coherent plaid patterns.
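For readers unfamiliar with the technique, the sketch below illustrates why intermodulation responses are a signature of nonlinear combination. It is a toy example only, not the authors' analysis code or their two-stage model: the sampling rate, tagging frequencies, and the multiplicative nonlinearity are assumptions chosen for clarity. A purely linear sum of two frequency-tagged responses contains energy only at the tagging frequencies (and their harmonics), whereas a multiplicative interaction also produces peaks at the difference and sum frequencies f1 ± f2.

```python
import numpy as np

# Illustrative toy example: two "frequency-tagged" responses and the
# spectral signature of combining them nonlinearly. A linear sum has
# energy only at the tagging frequencies f1 and f2; a multiplicative
# interaction adds intermodulation peaks at f1 - f2 and f1 + f2.

fs = 1000.0                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)     # 10 s of signal
f1, f2 = 7.0, 5.0                # example tagging frequencies (not from the paper)

a = np.sin(2 * np.pi * f1 * t)   # response driven by component 1
b = np.sin(2 * np.pi * f2 * t)   # response driven by component 2

nonlinear = a + b + 0.5 * a * b  # linear terms plus a multiplicative interaction

freqs = np.fft.rfftfreq(t.size, 1 / fs)
amps = np.abs(np.fft.rfft(nonlinear)) / t.size

for f in (f1 - f2, f2, f1, f1 + f2):
    idx = np.argmin(np.abs(freqs - f))
    print(f"{f:4.1f} Hz amplitude: {amps[idx]:.3f}")
```

Running this prints nonzero amplitudes at 2, 5, 7, and 12 Hz; dropping the multiplicative term leaves peaks only at 5 and 7 Hz, which is the linear-summation case.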
Citation
Cunningham, D. G., Baker, D. H., & Peirce, J. W. (2017). Measuring nonlinear signal combination using EEG. Journal of Vision, 17(5), Article 10. https://doi.org/10.1167/17.5.10
| Field | Value |
| --- | --- |
| Journal Article Type | Article |
| Acceptance Date | Mar 3, 2017 |
| Online Publication Date | May 24, 2017 |
| Publication Date | Jun 30, 2017 |
| Deposit Date | Jun 30, 2017 |
| Publicly Available Date | Jun 30, 2017 |
| Journal | Journal of Vision |
| Electronic ISSN | 1534-7362 |
| Publisher | Association for Research in Vision and Ophthalmology |
| Peer Reviewed | Peer Reviewed |
| Volume | 17 |
| Issue | 5 |
| Article Number | 10 |
| DOI | https://doi.org/10.1167/17.5.10 |
| Keywords | Vision, EEG, VEP, ssVEP, Neuroscience |
| Public URL | https://nottingham-repository.worktribe.com/output/870279 |
| Publisher URL | http://jov.arvojournals.org/article.aspx?articleid=2628973 |
| Related Public URLs | https://osf.io/4s9z3/; http://www.peirce.org.uk; http://jov.arvojournals.org/article.aspx?articleid=2628973 |
| Contract Date | Jun 30, 2017 |
Files
i1534-7362-17-5-10.pdf (PDF, 819 KB)
Copyright Statement
Copyright information regarding this work can be found at the following address: http://creativecommons.org/licenses/by/4.0