Universal automated classification of the acoustic startle reflex using machine learning
Authors
Timothy J. Fawcett
Ryan J. Longenecker
Dimitri L. Brunelle
Joel I. Berger
Mark N. Wallace mark.wallace@nottingham.ac.uk
Alex V. Galazyuk
Merri J. Rosen
Richard J. Salvi
Joseph P. Walton
Abstract
The startle reflex (SR), a robust motor response elicited by an intense auditory, visual, or somatosensory stimulus, has been used for almost a century as a tool to assess psychophysiology in humans and animals, in research areas as diverse as schizophrenia, bipolar disorder, hearing loss, and tinnitus. Previously, SR waveforms have either been ignored or assessed with basic statistical techniques and/or simple template-matching paradigms, leading to considerable variability in SR studies across laboratories and species. In an effort to standardize SR assessment methods, we developed a machine learning algorithm and workflow that automatically classifies SR waveforms from virtually any animal model, including mice, rats, guinea pigs, and gerbils, obtained with various paradigms and stimulus modalities in several laboratories. The universal features common to SR waveforms across species and paradigms are examined and discussed in the context of each animal model. The procedure describes typical results obtained with the SR across species and how to deploy the open-source R implementation in full. Because the SR is widely used to investigate toxicological and pharmaceutical efficacy, a detailed and universal SR waveform classification protocol is needed to standardize SR assessment procedures across laboratories and species. This machine learning-based method will improve data reliability and translatability between labs that use the startle reflex paradigm. [Abstract copyright: Copyright © 2022. Published by Elsevier B.V.]
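For readers who want a feel for the general approach before obtaining the authors' open-source R implementation, the sketch below is a minimal illustration, not the published pipeline: per-trial summary features are extracted from each SR waveform and passed to an ensemble classifier, here a random forest from the randomForest package. The simulated data, the feature set (peak, RMS, peak latency), the sampling rate, and the package choice are all assumptions made for illustration only.

```r
## Minimal, illustrative sketch of ensemble-based SR waveform classification.
## NOT the authors' published workflow: data, features, and model are assumed.

library(randomForest)  # install.packages("randomForest") if needed

# Toy feature extraction: each input is one trial's SR trace (numeric vector).
# fs = 20000 Hz is an assumed sampling rate for illustration.
extract_features <- function(trace, fs = 20000) {
  peak      <- max(abs(trace))             # peak magnitude of the response
  rms       <- sqrt(mean(trace^2))         # overall response energy
  latency_s <- which.max(abs(trace)) / fs  # time of peak, in seconds
  c(peak = peak, rms = rms, latency_s = latency_s)
}

# Simulated example data: 200 trials x 2000 samples (placeholder for real SR traces).
set.seed(1)
waveforms <- matrix(rnorm(200 * 2000, sd = 0.1), nrow = 200)
labels    <- factor(rep(c("startle", "no_startle"), each = 100))
waveforms[labels == "startle", 400:500] <-
  waveforms[labels == "startle", 400:500] + 1  # inject a "startle"-like deflection

features <- t(apply(waveforms, 1, extract_features))

# Ensemble classifier (random forest) trained on the per-trial summary features.
fit <- randomForest(x = as.data.frame(features), y = labels, ntree = 500)
print(fit)  # out-of-bag error and confusion matrix

# Classify a few trials with the fitted model.
pred <- predict(fit, as.data.frame(features[1:5, , drop = FALSE]))
print(pred)
```

In practice the features, classifier, and training data would come from the published R implementation and from each laboratory's own SR recordings; the random forest here simply stands in for the ensemble-model idea named in the keywords.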
Citation
Fawcett, T. J., Longenecker, R. J., Brunelle, D. L., Berger, J. I., Wallace, M. N., Galazyuk, A. V., …Walton, J. P. (2023). Universal automated classification of the acoustic startle reflex using machine learning. Hearing Research, 428, Article 108667. https://doi.org/10.1016/j.heares.2022.108667
| Field | Value |
| --- | --- |
| Journal Article Type | Article |
| Acceptance Date | Dec 12, 2022 |
| Online Publication Date | Dec 15, 2022 |
| Publication Date | Feb 2023 |
| Deposit Date | Jan 13, 2023 |
| Publicly Available Date | Dec 16, 2023 |
| Journal | Hearing Research |
| Electronic ISSN | 1878-5891 |
| Publisher | Elsevier |
| Peer Reviewed | Peer Reviewed |
| Volume | 428 |
| Article Number | 108667 |
| DOI | https://doi.org/10.1016/j.heares.2022.108667 |
| Keywords | Ensemble models, Waveform classification, Machine learning, Pre-pulse inhibition, Acoustic startle response |
| Public URL | https://nottingham-repository.worktribe.com/output/15924093 |
| Publisher URL | https://www.sciencedirect.com/science/article/pii/S0378595522002350?via%3Dihub |
Files
This file is under embargo until Dec 16, 2023 due to copyright restrictions.