Abstract
Human demonstrations are important in a range of robotics applications and are created with a variety of input methods. However, the design space for these input methods has not been extensively studied. In this paper, focusing on demonstrations of hand-scale object manipulation tasks to robot arms with two-finger grippers, we identify distinct usage paradigms in robotics that utilize human-to-robot demonstrations, extract abstract features that form a design space for input methods, and characterize existing input methods as well as a novel input method that we introduce, the instrumented tongs. We detail the design specifications for our method and present a user study that compares it against three common input methods: free-hand manipulation, kinesthetic guidance, and teleoperation. Study results show that instrumented tongs provide high-quality demonstrations and a positive experience for the demonstrator while offering good correspondence to the target robot.
DOI: 10.1109/hri.2019.8673310
Bibtex
@inproceedings{Praveena_2019,
  doi       = {10.1109/hri.2019.8673310},
  url       = {https://doi.org/10.1109%2Fhri.2019.8673310},
  year      = 2019,
  month     = {mar},
  publisher = {{IEEE}},
  author    = {Pragathi Praveena and Guru Subramani and Bilge Mutlu and Michael Gleicher},
  title     = {Characterizing Input Methods for Human-to-Robot Demonstrations},
  booktitle = {2019 14th {ACM}/{IEEE} International Conference on Human-Robot Interaction ({HRI})}
}