Computer vision provides powerful tools for studying primate behavior in videos recorded in the wild by enabling automatic tracking of individuals and detection of their behaviors. While much of the existing work has focused on identifying individual actions, relatively little attention has been given to detecting social interactions among nonhuman primates.
In this talk, we present a deep-learning approach for detecting social interactions in naturalistic settings, using field experiments with redfronted lemurs as a case study. We show how individuals can be tracked and identified with a bounding-box detection model, and how our custom video annotation interface enables efficient labeling of actions and interactions. Using the annotated data, we train computer vision models for detecting social interactions, including gaze target detection models and dynamic scene graph generation.
Together, these components form a pipeline for automatically detecting social interactions among nonhuman primates in natural environments, with potential applicability across species and research contexts.