Bodystorming human-robot interactions

Porfirio, D., E. Fisher, A. Sauppé, A. Albarghouthi, and B. Mutlu. “Bodystorming Human-Robot Interactions”. Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, ACM, 2019, pp. 479-91.

Abstract

Designing and implementing human-robot interactions requires numerous skills, from having a rich understanding of social interactions and the capacity to articulate their subtle requirements, to the ability to then program a social robot with the many facets of such a complex interaction. Although designers are best suited to develop and implement these interactions due to their inherent understanding of the context and its requirements, these skills are a barrier to enabling designers to rapidly explore and prototype ideas: it is impractical for designers to also be experts on social interaction behaviors, and the technical challenges associated with programming a social robot are prohibitive. In this work, we introduce Synthé, which allows designers to act out, or bodystorm, multiple demonstrations of an interaction. These demonstrations are automatically captured and translated into prototypes for the design team using program synthesis. We evaluate Synthé in multiple design sessions involving pairs of designers bodystorming interactions and observing the resulting models on a robot. We build on the findings from these sessions to improve the capabilities of Synthé and demonstrate the use of these capabilities in a second design session.

DOI: 10.1145/3332165.3347957

BibTeX

@inproceedings{Porfirio_2019,
	doi = {10.1145/3332165.3347957},
	url = {https://doi.org/10.1145%2F3332165.3347957},
	year = {2019},
	month = {oct},
	publisher = {{ACM}},
	author = {David Porfirio and Evan Fisher and Allison Saupp{\'{e}} and Aws Albarghouthi and Bilge Mutlu},
	title = {Bodystorming Human-Robot Interactions},
	booktitle = {Proceedings of the 32nd Annual {ACM} Symposium on User Interface Software and Technology},
	pages = {479--491}
}