Hunter Zhang is an undergraduate student working at the People and Robots Lab. His research interests focus on the ways AR/VR could be used to enhance human-computer or human-robot interaction.
Schoen, A., D. Sullivan, H. Zhang, D. Rakita, and B. Mutlu. Lively: Enabling Multimodal, Lifelike, and Extensible Real-Time Robot Motion. Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), 2023.
Abstract

Robots designed to interact with people in collaborative or social scenarios must move in ways that are consistent with the robot's task and communication goals. However, combining these goals in a naïve manner can result in mutually exclusive solutions, or in infeasible or problematic states and actions. In this paper, we present Lively, a framework that supports configurable, real-time, task-based and communicative or socially expressive motion for collaborative and social robotics across multiple levels of programmatic accessibility. Lively supports a wide range of control methods (i.e., position, orientation, and joint-space goals) and balances them with complex procedural behaviors that produce natural, lifelike motion effective in collaborative and social contexts. We discuss the design of the three levels of programmatic accessibility of Lively: a graphical user interface for visual design called LivelyStudio, the core library Lively, which gives developers full access to its capabilities, and an extensible architecture for greater customizability and capability.