Research

HCI + UX research

ACM CHI’23 paper 

Quantifying eye-cursor coordination during remote, online drag-and-drop user interactions

Jennifer Bertrand & Craig Chapman

In a fully remote, online study, we used cursor-tracking and webcam eye-tracking to explore the dynamics of eye-cursor coordination during digital drag-and-drop user interactions.

Read our full paper (PDF) or watch the ACM CHI’23 presentation video:

manuscript forthcoming (under submission)

Remotely assessing decision difficulty: complementing cursor-tracking with insights from webcam eye-tracking

Jennifer Bertrand, Alexandra Ouellette Zuk & Craig Chapman

This project extends our earlier work (cursor-tracking only; see below) by exploring how webcam eye-tracking can provide unique insights into decision difficulty that cursor-tracking alone cannot capture.

This project is currently under submission; read our preprint here!

in press at JEP: HPP

Reading player intentions from mouse-cursor movements in a two-player remote card game

Helen Ma, Jennifer Bertrand, Craig Chapman & Dana Hayward

For this work we built and deployed a remote, online two-player card game in which players could see each other’s mouse cursors throughout the game. Some games were competitive and some cooperative, and players moved their cursors differently between the two contexts.

This project has been accepted for publication and is in press at the Journal of Experimental Psychology: Human Perception and Performance (JEP: HPP).

manuscript forthcoming (under submission)

Decision difficulty reflected in both mouse cursor and touchscreen dynamics

Alexandra Ouellette Zuk, Jennifer Bertrand & Craig Chapman

Across three types of decisions, we remotely tested people on computers, tablets, or smartphones, measuring their mouse cursor or touchscreen interactions. The movements of both mouse and touch interactions revealed rich information about decision difficulty.

This project is currently under submission; read our preprint here!

Remote and Online Methods to Measure Human Behaviour

in prep

A Practical Guide to Remote Webcam Eye-tracking

Jennifer Bertrand & Craig Chapman 

When the pandemic shifted our research from the lab to the internet, we learned a lot of lessons (often the hard way) on the way to collecting high-quality webcam eye-tracking data from remote participants.

In one of the first papers of its kind for webcam eye-tracking, we offer a comprehensive guide to the practical aspects of running remote webcam eye-tracking experiments, along with data-backed insights about eye-tracking quality, experimental costs, and effective data collection strategies.

This project is in preparation for publication. Preprint forthcoming.

Decision-making, visual perception + cognitive neuroscience research

  • Brain rhythms and flicker: measuring phase coherence with EEG

    Using electroencephalography (EEG), we show that theta frequency brain rhythms are predictive of brightness enhancement from flickering stimuli.

    Read our Scientific Reports paper here.

  • Rapid reach decisions with dynamic probabilistic information

    Using motion capture recordings, we show that rapid reach decisions accurately reflect the future state of dynamic probabilistic information.

    Read our Cortex paper here.

  • Identifying how brightness enhancement varies with flicker frequency

    In a two-part behavioural experiment, we determined the frequencies of flicker that generate the greatest brightness enhancement.

    Read our Experimental Brain Research paper here.