Description
Decision making in social contexts requires integrating evidence over time from both stimulus and social sources. While prior work shows that social settings affect perceptual accuracy and confidence, it remains unclear how real-time social feedback modulates these factors.
We analyzed a continuous perceptual decision-making task in which participants reported the perceived direction of motion in a random dot pattern (RDP) (accuracy) and their confidence in that report. The signal-to-noise ratio (coherence) varied over time, and participants' payoffs were independent. Each participant had access to two stimulus features (motion direction, coherence) and two behavioral features of their partner (accuracy, confidence). Using Transfer Entropy (TE), we quantified the information flow from each of these sources to participants' behavior.
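For reference, TE from a source X to a target Y (with history length 1) is TE_{X→Y} = Σ p(y_{t+1}, y_t, x_t) log [ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ], i.e., how much the source's past reduces uncertainty about the target beyond the target's own past. The abstract does not specify the estimator used; the following is only a minimal plug-in sketch for discretized signals, with the function name, binning scheme, and history length all illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate of TE from x to y (history length 1), in bits.

    x, y: equal-length 1-D signals, discretized into `bins`
    equiprobable bins before counting joint occurrences.
    """
    # Discretize each signal into quantile bins (labels 0..bins-1)
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))

    # Joint counts over (y_{t+1}, y_t, x_t) and the marginals
    # needed for the two conditional probabilities
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))   # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))  # (y_t, x_t)
    singles_y = Counter(yd[:-1])               # y_t
    n = sum(triples.values())

    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]              # p(y_{t+1} | y_t, x_t)
        p_cond_self = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Toy check: y lags x, so TE(x -> y) should come out positive
x = np.random.randn(10_000)
y = np.roll(x, 1) + 0.5 * np.random.randn(10_000)
print(transfer_entropy(x, y))
```

In practice, analyses like this typically rely on dedicated toolboxes (e.g., IDTxl or JIDT), which provide bias-corrected estimators and significance testing against surrogate data.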
Subjects tended to interact via the same behavioral dimension: in 90% of participants, accuracy was influenced by the partner's accuracy, and in 38%, confidence by the partner's confidence. From the stimulus sources, accuracy received information from the RDP's motion direction in all participants, and confidence from coherence in 99%. To link these patterns to task performance, we applied Bayesian linear regression.
This analysis showed that greater TE from motion direction (94% CI [0.421, 0.766]), coherence (94% CI [0.005, 0.343]), and partners' accuracy (94% CI [0.49, 0.78]) predicted better performance. Model comparison showed that a model including both stimulus and social information outperformed stimulus-only models, as indicated by the expected log pointwise predictive density (ELPD) estimated with leave-one-out cross-validation (LOO-CV).
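To make the regression and model-comparison step concrete, here is a hedged sketch using PyMC and ArviZ (ArviZ reports 94% highest-density intervals by default, consistent with the intervals above, though the abstract does not name the software). The predictors and outcome (te_motion, te_coherence, te_partner_acc, performance) are simulated placeholders, not the study's data.

```python
import numpy as np
import pymc as pm
import arviz as az

# Placeholder data: per-participant TE values and a performance score
rng = np.random.default_rng(0)
n = 40
te_motion, te_coherence, te_partner_acc = rng.random((3, n))
performance = (0.6 * te_motion + 0.2 * te_coherence
               + 0.6 * te_partner_acc + rng.normal(0, 0.1, n))

def fit(predictors):
    """Bayesian linear regression of performance on the given TE predictors."""
    with pm.Model():
        betas = [pm.Normal(f"b_{name}", 0, 1) for name in predictors]
        mu = pm.Normal("intercept", 0, 1) + sum(
            b * x for b, (_, x) in zip(betas, predictors.items()))
        sigma = pm.HalfNormal("sigma", 1)
        pm.Normal("obs", mu, sigma, observed=performance)
        # Store pointwise log-likelihoods so LOO-CV can be computed later
        return pm.sample(idata_kwargs={"log_likelihood": True},
                         progressbar=False)

stim_only = fit({"motion": te_motion, "coherence": te_coherence})
full = fit({"motion": te_motion, "coherence": te_coherence,
            "partner_acc": te_partner_acc})

print(az.summary(full, hdi_prob=0.94))                     # 94% credible intervals
print(az.compare({"stimulus": stim_only, "full": full}))   # ELPD via LOO-CV
```

az.compare ranks the models by ELPD; under the abstract's result, the "full" model (stimulus plus social predictors) would sit above the stimulus-only model.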
These results suggest that even without enforced cooperation or competition, participants integrate both stimulus and social cues in perceptual decisions. Moreover, real-time social feedback selectively modulates accuracy and confidence via matching behavioral dimensions between partners.