Title: Learning Two-Person Interaction Models for Responsive Synthetic Humanoids

Authors: David Vogt, Heni Ben Amor, Erik Berger, Bernhard Jung

Abstract: Imitation learning is a promising approach for generating life-like behaviors of virtual humans and humanoid robots. So far, however, imitation learning has been mostly restricted to single-agent settings, where observed motions are adapted to new environmental conditions but not to the dynamic behavior of interaction partners. In this paper, we introduce a new imitation learning approach that is based on the simultaneous motion capture of two human interaction partners. From the observed interactions, low-dimensional motion models are extracted and a mapping between these motion models is learned. This interaction model allows the real-time generation of agent behaviors that are responsive to the body movements of an interaction partner. The interaction model can be applied both to the animation of virtual characters and to behavior generation for humanoid robots.

Keywords: humanoid robots; imitation learning; interaction learning; motion adaptation; motor learning; virtual characters

DDC: 004
Type: periodical, academic journal
Journal: Journal of Virtual Reality and Broadcasting, 11(2014), no. 1
Year: 2014
ISSN: 1860-2037
URN: urn:nbn:de:0009-6-38565
DOI: 10.20385/1860-2037/11.2014.1
URL: http://nbn-resolving.de/urn:nbn:de:0009-6-38565
Citation key: vogt2014
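The abstract describes a pipeline of three steps: extract low-dimensional motion models from each partner's captured motion, learn a mapping between the two models, and use that mapping at run time to generate a responsive pose from an observed pose. The sketch below illustrates one way such a pipeline could look, using PCA for dimensionality reduction and a least-squares linear map between the latent spaces. This is not the authors' actual method; all data, dimensions, and model choices here are hypothetical placeholders for illustration.

```python
import numpy as np

# Hypothetical stand-in for motion-capture data: each row is one frame,
# each column one joint-angle dimension. Partner B's motion is generated
# as a (noisy) response to partner A's motion.
rng = np.random.default_rng(0)
T, D, k = 200, 30, 5                      # frames, joint dims, latent dims (assumed)
A = rng.normal(size=(T, k)) @ rng.normal(size=(k, D))          # partner A poses
B = (A @ rng.normal(size=(D, D))) * 0.5 + 0.1 * rng.normal(size=(T, D))

def pca(X, k):
    """Return the mean and the top-k principal directions of X."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

# Step 1: low-dimensional motion model for each partner.
muA, Wa = pca(A, k)
muB, Wb = pca(B, k)
Za = (A - muA) @ Wa.T                     # latent trajectory of partner A
Zb = (B - muB) @ Wb.T                     # latent trajectory of partner B

# Step 2: learn a mapping between the latent spaces (here: least squares).
M, *_ = np.linalg.lstsq(Za, Zb, rcond=None)

# Step 3: at run time, observe a new pose of A and synthesize B's response.
observed_pose = A[0]
zb_pred = ((observed_pose - muA) @ Wa.T) @ M
response_pose = zb_pred @ Wb + muB        # back-project to full joint space
```

In this toy setup the learned map recovers B's response far better than a static mean pose would, which is the essential property the paper's interaction model needs for responsive behavior generation; a real system would of course replace the linear pieces with whatever motion models and mapping the method actually uses.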