Pelikan will present data at MOBSIN 2019, Leuven, 15-17 May 2019.
Encounters with a social robot: Data session.
The data for this session come from a small corpus of video recordings of four pairs of people interacting with a Cozmo toy robot, a small robot inspired by Pixar's Wall-E and Eve. Encountering a social robot for the first time, participants only gradually discover the robot's capabilities. Over the course of the interaction, they learn to make sense of the robot's utterances, movements and facial gestures. In exploring what the robot can and cannot do, participants display basic assumptions about how interactions are opened. While the robot is fully mobile and drives around on the dinner table, its head cannot turn sideways and only moves up and down. These constraints on mobility shape how the addressee of an utterance can be selected, typically assigning one participant the role of a bystander. In the data session, I will show two examples of how people actively reorient the robot toward the other participant, who then takes their turn in (re-)opening the interaction with the robot. I collected these data with the larger goal of exploring non-lexical vocalizations in human-robot interaction, focusing on how humans orient to robotic vocalizations as sequentially organized.