Speaker
Description
Mental fatigue is a common challenge in human–robot collaboration (HRC), yet it often remains undetected until performance failures occur. To support adaptive systems that respond proactively to user states, this study examined behavioral and pupillometric indicators of mental fatigue in a controlled yet ecologically valid scenario mimicking HRC. Forty‑one participants completed either a low‑ or high‑load version of a fatigue‑inducing working‑memory task (two‑back vs. time‑load dual‑back), followed by a joystick‑based drag‑and‑drop task requiring continuous motor control, while pupil size was recorded.
The high‑load task induced stronger cognitive strain, reflected in steeper declines in accuracy and reduced task‑evoked pupil responses. In the subsequent motor task, previously fatigued participants moved objects faster but placed them less precisely, indicating a fatigue-related speed-accuracy trade-off. Pupil constriction and re-dilation amplitudes decreased over time, suggesting that processing demands declined as participants became familiar with the task. Notably, larger late-trial pupil re-dilations predicted faster reaction times, linking phasic arousal to improved performance once the task was learned.
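To make the pupillometric measures concrete, the minimal Python sketch below shows one common way to quantify baseline-corrected constriction and re-dilation amplitudes per trial. The 60 Hz sampling rate, window lengths, and data file name are illustrative assumptions, not the study's actual analysis pipeline.

# Illustrative sketch (not the study's analysis code): baseline-corrected
# task-evoked pupil response per trial, assuming a 60 Hz eye tracker and a
# (n_trials, n_samples) array of pupil diameters aligned to stimulus onset.
import numpy as np

FS = 60                      # assumed sampling rate in Hz
BASELINE_S = 0.5             # assumed pre-stimulus baseline window in seconds

def pupil_response(trial: np.ndarray) -> tuple[float, float]:
    """Return (constriction amplitude, re-dilation amplitude) for one trial."""
    n_base = int(BASELINE_S * FS)
    baseline = np.nanmean(trial[:n_base])        # mean pre-stimulus pupil size
    post = trial[n_base:] - baseline             # baseline-corrected trace
    trough_idx = int(np.nanargmin(post))         # sample of peak constriction
    constriction = -post[trough_idx]             # positive = stronger constriction
    redilation = np.nanmax(post[trough_idx:]) - post[trough_idx]
    return float(constriction), float(redilation)

# Example usage with a hypothetical data file:
# trials = np.load("pupil_trials.npy")           # shape (n_trials, n_samples)
# amplitudes = np.array([pupil_response(t) for t in trials])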
These findings highlight pupil dynamics as a marker of mental fatigue that precedes overt performance decline. Integrating pupillometry with fine‑grained behavioral measures can support adaptive HRC systems that monitor cognitive state in real time and adjust assistance to sustain safety in collaborative environments. By bridging neuroscience, psychology, and robotics, this interdisciplinary approach advances our understanding of how cognitive states influence, and are influenced by, dynamic human–machine interactions.
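As a simple illustration of this closing idea, the sketch below shows how a rolling pupil-derived index could gate a collaborative robot's assistance level in real time. The window size, threshold, and returned modes are hypothetical placeholders rather than a validated design.

# Minimal sketch of real-time state monitoring, not a validated system:
# a rolling average of re-dilation amplitudes (see the previous sketch)
# triggers a request for more assistance when phasic responses shrink.
from collections import deque
import statistics

WINDOW = 120                  # assumed number of recent trials tracked
REDILATION_FLOOR = 0.15       # hypothetical threshold in arbitrary units

recent_redilation = deque(maxlen=WINDOW)

def update(redilation_amplitude: float) -> str:
    """Add the latest amplitude and return a suggested assistance mode."""
    recent_redilation.append(redilation_amplitude)
    if len(recent_redilation) < WINDOW:
        return "nominal"                          # not enough data yet
    if statistics.mean(recent_redilation) < REDILATION_FLOOR:
        return "increase_assistance"              # shrinking phasic response
    return "nominal"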