Abstract
In this paper, we introduce a novel method to support remote telemanipulation tasks in complex environments by providing operators with an enhanced view of the task environment. Our method features a viewpoint adjustment algorithm designed to automatically mitigate occlusions caused by workspace geometry, supports visual exploration to provide operators with situation awareness of the remote environment, and mediates context-specific visual challenges by making viewpoint adjustments based on sparse input from the user. Our method builds on the dynamic camera telemanipulation viewing paradigm, in which a user controls a manipulation robot while a camera-in-hand robot alongside it servos to provide a sufficient view of the remote environment. We discuss the real-time motion optimization formulation used to arbitrate the various objectives in our shared-control-based method, particularly highlighting how our occlusion avoidance and viewpoint adaptation approaches fit within this framework. We present results from an empirical evaluation of our proposed occlusion avoidance approach as well as a user study that compares our telemanipulation shared-control method against alternative telemanipulation approaches. We conclude by discussing the implications of our work for future shared-control research and robotics applications.
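The paper's actual motion optimization formulation is given in the body of the paper; as a rough, hedged illustration of the general idea of arbitrating several viewpoint objectives within a per-timestep optimization, the sketch below combines a standoff-distance term, an occlusion penalty, and a motion-smoothness term in a weighted sum. All objective terms, weights, and the `target`/`occluder` inputs are illustrative assumptions, not the formulation used in the paper.

```python
# Minimal sketch (not the paper's formulation): arbitrating viewpoint
# objectives as a weighted sum inside a per-timestep optimization.
import numpy as np
from scipy.optimize import minimize

def viewpoint_cost(cam_pos, target, occluder, prev_cam_pos,
                   w_view=1.0, w_occ=2.0, w_smooth=0.5, standoff=0.6):
    # Keep the camera at a preferred standoff distance from the task target.
    view_term = (np.linalg.norm(cam_pos - target) - standoff) ** 2

    # Penalize camera positions whose line of sight to the target passes
    # near a (spherical, hypothetical) occluder.
    to_target = target - cam_pos
    t = np.clip(np.dot(occluder - cam_pos, to_target) /
                (np.dot(to_target, to_target) + 1e-9), 0.0, 1.0)
    closest = cam_pos + t * to_target
    occ_term = np.exp(-np.linalg.norm(closest - occluder) ** 2 / 0.05)

    # Encourage smooth camera motion between timesteps.
    smooth_term = np.linalg.norm(cam_pos - prev_cam_pos) ** 2

    return w_view * view_term + w_occ * occ_term + w_smooth * smooth_term

# Example: solve for the next camera position given the current state.
prev = np.array([0.5, -0.5, 0.8])
result = minimize(viewpoint_cost, prev,
                  args=(np.array([0.7, 0.0, 0.4]),   # task target
                        np.array([0.6, -0.2, 0.6]),  # occluding geometry
                        prev))
print("next camera position:", result.x)
```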
DOI: 10.15607/rss.2019.xv.068
BibTeX
@inproceedings{Rakita_2019,
  author    = {Daniel Rakita and Bilge Mutlu and Michael Gleicher},
  title     = {Remote Telemanipulation with Adapting Viewpoints in Visually Complex Environments},
  booktitle = {Robotics: Science and Systems {XV}},
  publisher = {Robotics: Science and Systems Foundation},
  year      = {2019},
  month     = {jun},
  doi       = {10.15607/rss.2019.xv.068},
  url       = {https://doi.org/10.15607%2Frss.2019.xv.068}
}