To develop robots that interact effectively with humans, it is important to understand how humans teach and learn in different scenarios. This research collects data from pairs of humans performing collaborative object-moving tasks in a virtual environment under two scenarios: first, when both agents have complete information about the goal, and second, when one agent has only partial information about it. The data are analyzed for both scenarios: the probability distribution of each agent's actions is estimated and the distributions are compared to characterize behavior when goal teaching is involved and when it is not.
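The comparison of action distributions can be sketched as below. This is a minimal illustration, not the study's actual analysis pipeline: the discrete action space, the sample action sequences, and the use of KL divergence as the comparison measure are all assumptions made for the example.

```python
from collections import Counter
import math

def action_distribution(actions, action_space):
    """Empirical probability distribution over a discrete action space."""
    counts = Counter(actions)
    total = len(actions)
    return {a: counts.get(a, 0) / total for a in action_space}

def kl_divergence(p, q, eps=1e-9):
    """KL(p || q) with additive smoothing to avoid log(0)."""
    return sum(p[a] * math.log((p[a] + eps) / (q[a] + eps))
               for a in p if p[a] > 0)

# Hypothetical discretized actions (e.g., movement directions in the task)
space = ["left", "right", "up", "down", "stay"]
full_info_actions    = ["left", "left", "up", "up", "stay", "left"]
partial_info_actions = ["left", "right", "down", "up", "stay", "right"]

p = action_distribution(full_info_actions, space)
q = action_distribution(partial_info_actions, space)
print(kl_divergence(p, q))  # larger values indicate more dissimilar behavior
```

A symmetric measure such as Jensen-Shannon divergence could be substituted if neither agent's distribution is a natural reference.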
Modern robotic devices are required to perform tasks in which they teach actions to humans through haptic/physical interaction, e.g., rehabilitative robots. To achieve this, a simulation study with agents under information asymmetry is performed, based on a signaling game that models physical communication between agents. The simulation shows that the agents can exchange information and jointly reach the goal in finite time. An experimental study with human subjects is also performed to demonstrate robot teaching in physical tasks. The results can be extended to rehabilitation robots or semi-autopilots to provide physical assistance more safely and efficiently.
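The signaling-game setup with information asymmetry can be sketched as follows. This is a hypothetical one-dimensional illustration, not the study's actual model: the informed agent (teacher) pushes toward the true goal, while the uninformed agent (learner) maintains a belief over candidate goals and updates it from the direction of the teacher's force. All numerical parameters (the 0.9/0.1 likelihood noise model, step sizes, convergence thresholds) are assumed values chosen for the example.

```python
def simulate_signaling_game(goals, true_goal, steps=50):
    """Minimal 1-D signaling-game sketch: the teacher's force direction
    signals the goal; the learner does a Bayesian belief update and
    moves toward its belief-weighted goal estimate."""
    pos = 0.0
    belief = {g: 1.0 / len(goals) for g in goals}  # uniform prior
    for t in range(steps):
        teacher_force = 1.0 if true_goal > pos else -1.0
        # Goals consistent with the observed push direction become more
        # likely (0.9 vs 0.1 is an assumed observation-noise model).
        for g in goals:
            consistent = (g > pos) == (teacher_force > 0)
            belief[g] *= 0.9 if consistent else 0.1
        z = sum(belief.values())
        belief = {g: b / z for g, b in belief.items()}
        # Learner moves toward its current goal estimate; teacher assists.
        estimate = sum(g * b for g, b in belief.items())
        pos += 0.2 * (estimate - pos) + 0.1 * teacher_force
        if abs(pos - true_goal) < 0.05 and belief[true_goal] > 0.95:
            return t + 1, pos, belief  # converged in finite time
    return steps, pos, belief
```

Under these assumptions the learner both identifies the goal and reaches it in a small, finite number of steps, mirroring the finite-time convergence result reported for the simulation.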