An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction

Abstract: In multimodal human-robot interaction (HRI), communication can be established through verbal, non-verbal, and/or para-verbal cues. The linguistic literature shows that para-verbal and non-verbal communication are naturally synchronized; however, the mechanism underlying this synchronization remains largely unexplored. This research focuses on the relation between non-verbal and para-verbal communication by mapping prosody cues to corresponding metaphoric arm gestures. Our approach to synthesizing arm gestures uses coupled hidden Markov models (CHMMs), which can be seen as a collection of HMMs characterizing the segmented prosodic feature stream and the segmented rotation feature streams of the two arms' articulations. Experimental results with the Nao robot are reported.
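
The abstract describes coupling a prosody stream with arm-rotation streams via a CHMM. As a rough illustration only, and not the authors' implementation, the sketch below runs a forward pass for a two-chain coupled HMM over the product state space; all names, state counts, and parameter values are hypothetical.

import numpy as np
from scipy.special import logsumexp

# Illustrative sketch: a two-chain coupled HMM (CHMM) forward pass over
# the product state space, in the spirit of coupling a prosody stream
# (chain 0) with an arm-rotation stream (chain 1). All sizes and
# parameters here are invented for demonstration.

N = 3  # hidden states per chain (assumed)
rng = np.random.default_rng(0)

def normalize(a, axis=-1):
    return a / a.sum(axis=axis, keepdims=True)

# Coupled transitions: each chain's next state depends on BOTH chains'
# previous states, which is what distinguishes a CHMM from independent HMMs.
A0 = normalize(rng.random((N, N, N)))  # P(prosody' | prosody, gesture)
A1 = normalize(rng.random((N, N, N)))  # P(gesture' | prosody, gesture)
pi = normalize(rng.random((N, N)), axis=None)  # joint initial distribution

# Joint log-transition tensor: logT[i, j, i', j']
logT = np.log(A0)[:, :, :, None] + np.log(A1)[:, :, None, :]

def forward_loglik(logB0, logB1):
    """Log-likelihood of two synchronized observation streams.

    logB0[t, i]: log emission probability of the prosody observation at
    time t under prosody state i; logB1 is the analogue for gestures.
    """
    T = logB0.shape[0]
    log_alpha = np.log(pi) + logB0[0][:, None] + logB1[0][None, :]
    for t in range(1, T):
        prev = log_alpha.reshape(N * N, 1, 1)  # sum out previous (i, j)
        log_alpha = logsumexp(prev + logT.reshape(N * N, N, N), axis=0)
        log_alpha += logB0[t][:, None] + logB1[t][None, :]
    return logsumexp(log_alpha)

# Toy usage with random emission log-probabilities for 20 frames.
T = 20
logB0 = np.log(normalize(rng.random((T, N))))
logB1 = np.log(normalize(rng.random((T, N))))
print(forward_loglik(logB0, logB1))

A full system along the paper's lines would additionally need coupled Baum-Welch training on the segmented prosodic and rotational features, and a synthesis step mapping decoded gesture states to robot arm commands.
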
Document type: Conference papers

Cited literature: 16 references
Contributor: Amir Aly
Submitted on: Thursday, July 2, 2015 - 3:16:16 PM
Last modification on: Wednesday, May 11, 2022 - 3:20:03 PM
Long-term archiving on: Tuesday, April 25, 2017 - 8:23:27 PM


Files produced by the author(s)
License: Public Domain

Amir Aly, Adriana Tapus. An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction. The 14th IFAC Symposium on Information Control Problems in Manufacturing, May 2012, Bucharest, Romania. ⟨10.3182/20120523-3-RO-2023.00364⟩. ⟨hal-01169980⟩


