
Research Repository


Outputs (29)

Event probabilities have a different impact on early and late electroencephalographic measures regarded as metrics of prediction (2024)
Journal Article
Saurels, B. W., Johnston, A., Yarrow, K., & Arnold, D. H. (2024). Event probabilities have a different impact on early and late electroencephalographic measures regarded as metrics of prediction. Journal of Cognitive Neuroscience, 36(1), 187-199. https://doi.org/10.1162/jocn_a_02076

The oddball protocol has been used to study the neural and perceptual consequences of implicit predictions in the human brain. The protocol involves presenting a sequence of identical repeated events that are eventually broken by a novel "oddball" pr...

Predictive extrapolation effects can have a greater impact on visual decisions, while visual adaptation has a greater impact on conscious visual experience (2023)
Journal Article
Bouyer, L. N., Arnold, D. H., Johnston, A., & Taubert, J. (2023). Predictive extrapolation effects can have a greater impact on visual decisions, while visual adaptation has a greater impact on conscious visual experience. Consciousness and Cognition, 115, Article 103583. https://doi.org/10.1016/j.concog.2023.103583

Human vision is shaped by historic and by predictive processes. The lingering impact of visual adaptation, for instance, can act to exaggerate differences between past and present inputs, whereas predictive processes can promote extrapolation effects...

fMRI evidence that hyper-caricatured faces activate object-selective cortex (2023)
Journal Article
Elson, R., Schluppeck, D., & Johnston, A. (2023). fMRI evidence that hyper-caricatured faces activate object-selective cortex. Frontiers in Psychology, 13, Article 1035524. https://doi.org/10.3389/fpsyg.2022.1035524

Many brain imaging studies have looked at the cortical responses to object categories and faces. A popular way to manipulate face stimuli is by using a “face space,” a high dimensional representation of individual face images, with the average face l...

A PCA-Based Active Appearance Model for Characterising Modes of Spatiotemporal Variation in Dynamic Facial Behaviours (2022)
Journal Article
Watson, D. M., & Johnston, A. (2022). A PCA-Based Active Appearance Model for Characterising Modes of Spatiotemporal Variation in Dynamic Facial Behaviours. Frontiers in Psychology, 13, Article 880548. https://doi.org/10.3389/fpsyg.2022.880548

Faces carry key personal information about individuals, including cues to their identity, social traits, and emotional state. Much research to date has employed static images of faces taken under tightly controlled conditions, yet faces in the real wo...

Exploring the Common Mechanisms of Motion-Based Visual Prediction (2022)
Journal Article
Hu, D., Ison, M., & Johnston, A. (2022). Exploring the Common Mechanisms of Motion-Based Visual Prediction. Frontiers in Psychology, 13, Article 827029. https://doi.org/10.3389/fpsyg.2022.827029

Human vision supports prediction for moving stimuli. Here we take an individual differences approach to investigate whether there could be a common processing rate for motion-based visual prediction across diverse motion phenomena. Motion Induced Spa...

Synchronous facial action binds dynamic facial features (2021)
Journal Article
Johnston, A., Brown, B. B., & Elson, R. (2021). Synchronous facial action binds dynamic facial features. Scientific Reports, 11, Article 7191. https://doi.org/10.1038/s41598-021-86725-x

We asked how dynamic facial features are perceptually grouped. To address this question, we varied the timing of mouth movements relative to eyebrow movements, while measuring the detectability of a small temporal misalignment between a pair of oscil...

Understanding sensory induced hallucinations: From neural fields to amplitude equations (2021)
Journal Article
Nicks, R., Cocks, A., Avitabile, D., Johnston, A., & Coombes, S. (2021). Understanding sensory induced hallucinations: From neural fields to amplitude equations. SIAM Journal on Applied Dynamical Systems, 20(4), 1683-1714. https://doi.org/10.1137/20M1366885

Explorations of visual hallucinations, and in particular those of Billock and Tsou [V. A. Billock and B. H. Tsou, Proc. Natl. Acad. Sci. USA, 104 (2007), pp. 8490-8495], show that annular rings with a background flicker can induce visual hallucinatio...

The interrelationship between the face and vocal tract configuration during audiovisual speech (2020)
Journal Article
Scholes, C., Skipper, J. I., & Johnston, A. (2020). The interrelationship between the face and vocal tract configuration during audiovisual speech. Proceedings of the National Academy of Sciences, 117(51), 32791-32798. https://doi.org/10.1073/pnas.2006192117

It is well established that speech perception is improved when we are able to see the speaker talking along with hearing their voice, especially when the speech is noisy. While we have a good understanding of where speech integration occurs in the br...

A data-driven characterisation of natural facial expressions when giving good and bad news (2020)
Journal Article
Watson, D. M., Brown, B. B., & Johnston, A. (2020). A data-driven characterisation of natural facial expressions when giving good and bad news. PLoS Computational Biology, 16(10), Article e1008335. https://doi.org/10.1371/journal.pcbi.1008335

Facial expressions carry key information about an individual’s emotional state. Research into the perception of facial emotions typically employs static images of a small number of artificially posed expressions taken under tightly controlled experim...

Dynamic Facial Models for Video-Based Dimensional Affect Estimation (2019)
Presentation / Conference Contribution
Song, S., Sánchez-Lozano, E., Kumar Tellamekala, M., Shen, L., Johnston, A., & Valstar, M. (2019). Dynamic Facial Models for Video-Based Dimensional Affect Estimation. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW) (pp. 1608-1617). https://doi.org/10.1109/ICCVW.2019.00200

Dimensional affect estimation from a face video is a challenging task, mainly due to the large number of possible facial displays made up of a set of behaviour primitives including facial muscle actions. The displays vary not only in composition but...

Motion integration is anisotropic during smooth pursuit eye movements (2019)
Journal Article
Souto, D., Chudasama, J., Kerzel, D., & Johnston, A. (2019). Motion integration is anisotropic during smooth pursuit eye movements. Journal of Neurophysiology, 121(5), 1787-1797. https://doi.org/10.1152/jn.00591.2018

Smooth pursuit eye movements (pursuit) are used to minimize the retinal motion of moving objects. During pursuit, the pattern of motion on the retina carries not only information about the object movement but also reafferent information about the eye...

Suboptimal human multisensory cue combination (2019)
Journal Article
Arnold, D. H., Petrie, K., Murray, C., & Johnston, A. (2019). Suboptimal human multisensory cue combination. Scientific Reports, 9(1), Article 5155. https://doi.org/10.1038/s41598-018-37888-7

Information from different sensory modalities can interact, shaping what we think we have seen, heard, or otherwise perceived. Such interactions can enhance the precision of perceptual decisions, relative to those based on information from a single s...

Selective binding of facial features reveals dynamic expression fragments (2018)
Journal Article
Harrison, C., Binetti, N., Mareschal, I., & Johnston, A. (2018). Selective binding of facial features reveals dynamic expression fragments. Scientific Reports, 8(1), Article 9031. https://doi.org/10.1038/s41598-018-27242-2

The temporal correspondence between two arbitrarily chosen pairs of alternating features can generally be reported for rates up to 3–4 Hz. This limit is however surpassed for specialised visual mechanisms that encode conjunctions of features. Here we...

Individual differences in first- and second-order temporal judgment (2018)
Journal Article
Corcoran, A. W., Groot, C., Bruno, A., Johnston, A., & Cropper, S. J. (2018). Individual differences in first- and second-order temporal judgment. PLoS ONE, 13(2), Article e0191422. https://doi.org/10.1371/journal.pone.0191422

The ability of subjects to identify and reproduce brief temporal intervals is influenced by many factors, whether they be stimulus-based, task-based, or subject-based. The current study examines the role individual differences play in subsecond and sup...

Temporal order judgements of dynamic gaze stimuli reveal a postdictive prioritisation of averted over direct shifts (2017)
Journal Article
Binetti, N., Harrison, C., Mareschal, I., & Johnston, A. (2017). Temporal order judgements of dynamic gaze stimuli reveal a postdictive prioritisation of averted over direct shifts. i-Perception, 8(4). https://doi.org/10.1177/2041669517720808

We studied temporal order judgements (TOJs) of gaze shift behaviours and evaluated the impact of gaze direction (direct and averted gaze) and face context information (both eyes set within a single face or each eye within two adjacent hemifaces) on T...

Pupil response hazard rates predict perceived gaze durations (2017)
Journal Article
Binetti, N., Harrison, C., Mareschal, I., & Johnston, A. (2017). Pupil response hazard rates predict perceived gaze durations. Scientific Reports, 7, Article 3969. https://doi.org/10.1038/s41598-017-04249-9

We investigated the mechanisms for evaluating perceived gaze-shift duration. Timing relies on the accumulation of endogenous physiological signals. Here we focused on arousal, measured through pupil dilation, as a candidate timi...

Time-order errors in duration judgment are independent of spatial positioning (2017)
Journal Article
Harrison, C., Binetti, N., Mareschal, I., & Johnston, A. (2017). Time-order errors in duration judgment are independent of spatial positioning. Frontiers in Psychology, 8, Article 340. https://doi.org/10.3389/fpsyg.2017.00340

Time-order errors (TOEs) occur when the discriminability between two stimuli is affected by the order in which they are presented. While TOEs have been studied since the 1860s, it is unknown whether the spatial properties of a stimulus will affect t...

Multiple-stage ambiguity in motion perception reveals global computation of local motion directions (2016)
Journal Article
Rider, A. T., Nishida, S., & Johnston, A. (2016). Multiple-stage ambiguity in motion perception reveals global computation of local motion directions. Journal of Vision, 16(15), Article 7. https://doi.org/10.1167/16.15.7

The motion of a 1D image feature, such as a line, seen through a small aperture, or the small receptive field of a neural motion sensor, is underconstrained, and it is not possible to derive the true motion direction from a single local measurement...

Face exploration dynamics differentiate men and women (2016)
Journal Article
Coutrot, A., Binetti, N., Harrison, C., Mareschal, I., & Johnston, A. (2016). Face exploration dynamics differentiate men and women. Journal of Vision, 16(14), Article 16. https://doi.org/10.1167/16.14.16

The human face is central to our everyday social interactions. Recent studies have shown that while gazing at faces, each one of us has a particular eye-scanning pattern, highly stable across time. Although variables such as culture or personality hav...

Temporal synchrony is an effective cue for grouping and segmentation in the absence of form cues (2016)
Journal Article
Rideaux, R., Badcock, D. R., Johnston, A., & Edwards, M. (2016). Temporal synchrony is an effective cue for grouping and segmentation in the absence of form cues. Journal of Vision, 16(11), 1-12. https://doi.org/10.1167/16.11.23

The synchronous change of a feature across multiple discrete elements, i.e., temporal synchrony, has been shown to be a powerful cue for grouping and segmentation. This has been demonstrated with both static and dynamic stimuli for a range of tasks...

An adaptable metric shapes perceptual space (2016)
Journal Article
Hisakata, R., Nishida, S., & Johnston, A. (2016). An adaptable metric shapes perceptual space. Current Biology, 26(14), R678-R680. https://doi.org/10.1016/j.cub.2016.05.047

How do we derive a sense of the separation of points in the world within a space-variant visual system? Visual directions are thought to be coded directly by a process referred to as local sign, in which a neuron acts as a labeled line for the percei...

Pupil dilation as an index of preferred mutual gaze duration (2016)
Journal Article
Binetti, N., Harrison, C., Coutrot, A., Johnston, A., & Mareschal, I. (2016). Pupil dilation as an index of preferred mutual gaze duration. Royal Society Open Science, 3(7). https://doi.org/10.1098/rsos.160086

Most animals look at each other to signal threat or interest. In humans, this social interaction is usually punctuated with brief periods of mutual eye contact. Deviations from this pattern of gazing behaviour generally make us feel uncomfortable and...

Difference magnitude is not measured by discrimination steps for order of point patterns (2016)
Journal Article
Protonotarios, E. D., Johnston, A., & Griffin, L. D. (2016). Difference magnitude is not measured by discrimination steps for order of point patterns. Journal of Vision, 16(9). https://doi.org/10.1167/16.9.2

We have shown in previous work that the perception of order in point patterns is consistent with an interval scale structure (Protonotarios, Baum, Johnston, Hunter, & Griffin, 2014). The psychophysical scaling method used relies on the confusion betw...

Time order reversals and saccades (2016)
Journal Article
Kresevic, J. L., Marinovic, W., Johnston, A., & Arnold, D. H. (2016). Time order reversals and saccades. Vision Research, 125. https://doi.org/10.1016/j.visres.2016.04.005

Ballistic eye movements, or saccades, present a major challenge to the visual system. They generate a rapid blur of movement across the surface of the retinae that is rarely consciously seen, as awareness of input is suppressed around the time of a s...

Visual motion induces a forward prediction of spatial pattern (2011)
Journal Article
Roach, N. W., McGraw, P. V., & Johnston, A. (2011). Visual motion induces a forward prediction of spatial pattern. Current Biology, 21(9). https://doi.org/10.1016/j.cub.2011.03.031

Cortical motion analysis continuously encodes image velocity but might also be used to predict future patterns of sensory input along the motion path. We asked whether this predictive aspect of motion is exploited by the human visual system. Targets...