Research Repository

All Outputs (31)

Event probabilities have a different impact on early and late electroencephalographic measures regarded as metrics of prediction (2023)
Journal Article
Saurels, B. W., Johnston, A., Yarrow, K., & Arnold, D. H. (2024). Event probabilities have a different impact on early and late electroencephalographic measures regarded as metrics of prediction. Journal of Cognitive Neuroscience, 36(1), 187-199. https://doi.org/10.1162/jocn_a_02076

The oddball protocol has been used to study the neural and perceptual consequences of implicit predictions in the human brain. The protocol involves presenting a sequence of identical repeated events that are eventually broken by a novel "oddball" pr...

Predictive extrapolation effects can have a greater impact on visual decisions, while visual adaptation has a greater impact on conscious visual experience (2023)
Journal Article
Bouyer, L. N., Arnold, D. H., Johnston, A., & Taubert, J. (2023). Predictive extrapolation effects can have a greater impact on visual decisions, while visual adaptation has a greater impact on conscious visual experience. Consciousness and Cognition, 115, Article 103583. https://doi.org/10.1016/j.concog.2023.103583

Human vision is shaped by historic and by predictive processes. The lingering impact of visual adaptation, for instance, can act to exaggerate differences between past and present inputs, whereas predictive processes can promote extrapolation effects...

fMRI evidence that hyper-caricatured faces activate object-selective cortex (2023)
Journal Article
Elson, R., Schluppeck, D., & Johnston, A. (2023). fMRI evidence that hyper-caricatured faces activate object-selective cortex. Frontiers in Psychology, 13, Article 1035524. https://doi.org/10.3389/fpsyg.2022.1035524

Many brain imaging studies have looked at the cortical responses to object categories and faces. A popular way to manipulate face stimuli is by using a “face space,” a high dimensional representation of individual face images, with the average face l...

The spatial properties of adaptation-induced distance compression (2022)
Journal Article
Jovanovic, L., McGraw, P. V., Roach, N. W., & Johnston, A. (2022). The spatial properties of adaptation-induced distance compression. Journal of Vision, 22(11), Article 7. https://doi.org/10.1167/jov.22.11.7

Exposure to a dynamic texture reduces the perceived separation between objects, altering the mapping between physical relations in the environment and their neural representations. Here we investigated the spatial tuning and spatial frame of referenc...

A PCA-Based Active Appearance Model for Characterising Modes of Spatiotemporal Variation in Dynamic Facial Behaviours (2022)
Journal Article
Watson, D. M., & Johnston, A. (2022). A PCA-Based Active Appearance Model for Characterising Modes of Spatiotemporal Variation in Dynamic Facial Behaviours. Frontiers in Psychology, 13, Article 880548. https://doi.org/10.3389/fpsyg.2022.880548

Faces carry key personal information about individuals, including cues to their identity, social traits, and emotional state. Much research to date has employed static images of faces taken under tightly controlled conditions, yet faces in the real wo...

Exploring the Common Mechanisms of Motion-Based Visual Prediction (2022)
Journal Article
Hu, D., Ison, M., & Johnston, A. (2022). Exploring the Common Mechanisms of Motion-Based Visual Prediction. Frontiers in Psychology, 13, Article 827029. https://doi.org/10.3389/fpsyg.2022.827029

Human vision supports prediction for moving stimuli. Here we take an individual differences approach to investigate whether there could be a common processing rate for motion-based visual prediction across diverse motion phenomena. Motion Induced Spa...

Synchronous facial action binds dynamic facial features (2021)
Journal Article
Johnston, A., Brown, B. B., & Elson, R. (2021). Synchronous facial action binds dynamic facial features. Scientific Reports, 11, Article 7191. https://doi.org/10.1038/s41598-021-86725-x

We asked how dynamic facial features are perceptually grouped. To address this question, we varied the timing of mouth movements relative to eyebrow movements, while measuring the detectability of a small temporal misalignment between a pair of oscil...

Understanding sensory induced hallucinations: From neural fields to amplitude equations (2021)
Journal Article
Nicks, R., Cocks, A., Avitabile, D., Johnston, A., & Coombes, S. (2021). Understanding sensory induced hallucinations: From neural fields to amplitude equations. SIAM Journal on Applied Dynamical Systems, 20(4), 1683-1714. https://doi.org/10.1137/20M1366885

Explorations of visual hallucinations, and in particular those of Billock and Tsou [V. A. Billock and B. H. Tsou, Proc. Natl. Acad. Sci. USA, 104 (2007), pp. 8490-8495], show that annular rings with a background flicker can induce visual hallucinatio...

The interrelationship between the face and vocal tract configuration during audiovisual speech (2020)
Journal Article
Scholes, C., Skipper, J. I., & Johnston, A. (2020). The interrelationship between the face and vocal tract configuration during audiovisual speech. Proceedings of the National Academy of Sciences, 117(51), 32791-32798. https://doi.org/10.1073/pnas.2006192117

It is well established that speech perception is improved when we are able to see the speaker talking along with hearing their voice, especially when the speech is noisy. While we have a good understanding of where speech integration occurs in the br...

A data-driven characterisation of natural facial expressions when giving good and bad news (2020)
Journal Article
Watson, D. M., Brown, B. B., & Johnston, A. (2020). A data-driven characterisation of natural facial expressions when giving good and bad news. PLoS Computational Biology, 16(10), Article e1008335. https://doi.org/10.1371/journal.pcbi.1008335

Facial expressions carry key information about an individual’s emotional state. Research into the perception of facial emotions typically employs static images of a small number of artificially posed expressions taken under tightly controlled experim...

Dynamic Facial Models for Video-Based Dimensional Affect Estimation (2019)
Conference Proceeding
Song, S., Sánchez-Lozano, E., Kumar Tellamekala, M., Shen, L., Johnston, A., & Valstar, M. (2019). Dynamic Facial Models for Video-Based Dimensional Affect Estimation. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW) (pp. 1608-1617). https://doi.org/10.1109/ICCVW.2019.00200

Dimensional affect estimation from a face video is a challenging task, mainly due to the large number of possible facial displays made up of a set of behaviour primitives including facial muscle actions. The displays vary not only in composition but...

Motion integration is anisotropic during smooth pursuit eye movements (2019)
Journal Article
Souto, D., Chudasama, J., Kerzel, D., & Johnston, A. (2019). Motion integration is anisotropic during smooth pursuit eye movements. Journal of Neurophysiology, 121(5), 1787-1797. https://doi.org/10.1152/jn.00591.2018

Smooth pursuit eye movements (pursuit) are used to minimize the retinal motion of moving objects. During pursuit, the pattern of motion on the retina carries not only information about the object movement but also reafferent information about the eye...

Suboptimal human multisensory cue combination (2019)
Journal Article
Arnold, D. H., Petrie, K., Murray, C., & Johnston, A. (2019). Suboptimal human multisensory cue combination. Scientific Reports, 9(1), Article 5155. https://doi.org/10.1038/s41598-018-37888-7

Information from different sensory modalities can interact, shaping what we think we have seen, heard, or otherwise perceived. Such interactions can enhance the precision of perceptual decisions, relative to those based on information from a single s...

Selective binding of facial features reveals dynamic expression fragments (2018)
Journal Article
Harrison, C., Binetti, N., Mareschal, I., & Johnston, A. (2018). Selective binding of facial features reveals dynamic expression fragments. Scientific Reports, 8(1), Article 9031. https://doi.org/10.1038/s41598-018-27242-2

The temporal correspondence between two arbitrarily chosen pairs of alternating features can generally be reported for rates up to 3–4 Hz. This limit is however surpassed for specialised visual mechanisms that encode conjunctions of features. Here we...

Visual crowding is unaffected by adaptation-induced spatial compression (2018)
Journal Article
Chambers, A. L., Roach, N. W., & Johnston, A. (2018). Visual crowding is unaffected by adaptation-induced spatial compression. Journal of Vision, 18(3), Article 12. https://doi.org/10.1167/18.3.12

It has recently been shown that adapting to a densely textured stimulus alters the perception of visual space, such that the distance between two points subsequently presented in the adapted region appears reduced (Hisakata, Nishida, & Johnston, 2016...

Individual differences in first- and second-order temporal judgment (2018)
Journal Article
Corcoran, A. W., Groot, C., Bruno, A., Johnston, A., & Cropper, S. J. (2018). Individual differences in first- and second-order temporal judgment. PLoS ONE, 13(2), Article e0191422. https://doi.org/10.1371/journal.pone.0191422

The ability of subjects to identify and reproduce brief temporal intervals is influenced by many factors whether they be stimulus-based, task-based or subject-based. The current study examines the role individual differences play in subsecond and sup...

Temporal order judgements of dynamic gaze stimuli reveal a postdictive prioritisation of averted over direct shifts (2017)
Journal Article
Binetti, N., Harrison, C., Mareschal, I., & Johnston, A. (2017). Temporal order judgements of dynamic gaze stimuli reveal a postdictive prioritisation of averted over direct shifts. i-Perception, 8(4). https://doi.org/10.1177/2041669517720808

We studied temporal order judgements (TOJs) of gaze shift behaviours and evaluated the impact of gaze direction (direct and averted gaze) and face context information (both eyes set within a single face or each eye within two adjacent hemifaces) on T...

Pupil response hazard rates predict perceived gaze durations (2017)
Journal Article
Binetti, N., Harrison, C., Mareschal, I., & Johnston, A. (2017). Pupil response hazard rates predict perceived gaze durations. Scientific Reports, 7, Article 3969. https://doi.org/10.1038/s41598-017-04249-9

We investigated the mechanisms for evaluating perceived gaze-shift duration. Timing relies on the accumulation of endogenous physiological signals. Here we focused on arousal, measured through pupil dilation, as a candidate timi...

Time-order errors in duration judgment are independent of spatial positioning (2017)
Journal Article
Harrison, C., Binetti, N., Mareschal, I., & Johnston, A. (2017). Time-order errors in duration judgment are independent of spatial positioning. Frontiers in Psychology, 8, Article 340. https://doi.org/10.3389/fpsyg.2017.00340

Time-order errors (TOEs) occur when the discriminability between two stimuli is affected by the order in which they are presented. While TOEs have been studied since the 1860s, it is unknown whether the spatial properties of a stimulus will affect t...

Multiple-stage ambiguity in motion perception reveals global computation of local motion directions (2016)
Journal Article
Rider, A. T., Nishida, S., & Johnston, A. (2016). Multiple-stage ambiguity in motion perception reveals global computation of local motion directions. Journal of Vision, 16(15), Article 7. https://doi.org/10.1167/16.15.7

The motion of a 1D image feature, such as a line, seen through a small aperture, or the small receptive field of a neural motion sensor, is underconstrained, and it is not possible to derive the true motion direction from a single local measurement....