An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction

Abstract: In multimodal human-robot interaction (HRI), communication can be established through verbal, non-verbal, and/or para-verbal cues. The linguistic literature shows that para-verbal and non-verbal communication are naturally synchronized; however, the mechanism of this synchronization is still largely unexplored. This research focuses on the relation between non-verbal and para-verbal communication by mapping prosody cues to the corresponding metaphoric arm gestures. Our approach for synthesizing arm gestures uses coupled hidden Markov models (CHMMs), which can be seen as a collection of HMMs, one characterizing the segmented stream of prosodic features and the others characterizing the segmented streams of rotation features of the two arms' articulations. Experimental results with the Nao robot are reported.
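The paper itself is not reproduced on this page, so the following is only a minimal, self-contained sketch of the coupled-HMM idea summarized in the abstract: hidden chains for a prosodic stream and an arm-rotation stream whose transitions are mutually conditioned, with the gesture chain recovered from prosody observations by Viterbi decoding over the joint state space. For brevity it couples a single arm chain to the prosody chain (the abstract describes streams for both arms), and all state counts, parameters, and symbol alphabets are illustrative placeholders rather than the authors' trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): 3 hidden states per chain and
# 4 quantized prosody symbols.
N_PROSODY, N_ARM, N_OBS = 3, 3, 4

# Coupled transitions: each chain's next state is conditioned on the
# previous states of BOTH chains -- the defining property of a CHMM.
# A_prosody[p, a, p'] = P(prosody_t = p' | prosody_{t-1} = p, arm_{t-1} = a)
A_prosody = rng.dirichlet(np.ones(N_PROSODY), size=(N_PROSODY, N_ARM))
# A_arm[p, a, a'] = P(arm_t = a' | prosody_{t-1} = p, arm_{t-1} = a)
A_arm = rng.dirichlet(np.ones(N_ARM), size=(N_PROSODY, N_ARM))

# Emission model for the observed (prosody) chain: B_prosody[p, o]
B_prosody = rng.dirichlet(np.ones(N_OBS), size=N_PROSODY)

# Initial state distributions for each chain
pi_prosody = rng.dirichlet(np.ones(N_PROSODY))
pi_arm = rng.dirichlet(np.ones(N_ARM))


def decode_arm_from_prosody(obs):
    """Viterbi over the joint (prosody, arm) state space, observing only
    quantized prosody symbols; the arm component of the MAP path is the
    synthesized gesture-state sequence."""
    T = len(obs)
    # Joint log-transition tensor, indexed (p, a, p', a')
    logA = np.log(A_prosody[:, :, :, None] * A_arm[:, :, None, :])
    logB = np.log(B_prosody)
    delta = np.log(pi_prosody[:, None] * pi_arm[None, :]) \
        + logB[:, obs[0]][:, None]
    back = np.zeros((T, N_PROSODY, N_ARM, 2), dtype=int)
    for t in range(1, T):
        scores = delta[:, :, None, None] + logA       # (p, a, p', a')
        flat = scores.reshape(-1, N_PROSODY, N_ARM)   # collapse (p, a)
        best = flat.argmax(axis=0)                    # best predecessor
        back[t, :, :, 0], back[t, :, :, 1] = np.unravel_index(
            best, (N_PROSODY, N_ARM))
        delta = flat.max(axis=0) + logB[:, obs[t]][:, None]
    # Backtrack from the best final joint state
    p, a = np.unravel_index(delta.argmax(), delta.shape)
    path = [(p, a)]
    for t in range(T - 1, 0, -1):
        p, a = back[t, p, a]
        path.append((p, a))
    path.reverse()
    return [int(a) for _, a in path]


if __name__ == "__main__":
    prosody_obs = [0, 2, 1, 3, 1]  # toy quantized prosody sequence
    print(decode_arm_from_prosody(prosody_obs))
```

Flattening the coupled chains into a product state space keeps the decoder exact, at the cost of a state space that grows multiplicatively with the number of chains; this is why approximate inference is often preferred for CHMMs with many coupled streams.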
Document type: Conference papers

https://hal-ensta-paris.archives-ouvertes.fr//hal-01169980
Contributor: Amir Aly
Submitted on: Thursday, July 2, 2015 - 3:16:16 PM
Last modification on: Wednesday, July 3, 2019 - 10:48:05 AM
Long-term archiving on: Tuesday, April 25, 2017 - 8:23:27 PM

File: Aly-INCOM2012.pdf (produced by the author(s))

Licence: Public Domain


Citation

Amir Aly, Adriana Tapus. An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction. The 14th IFAC Symposium on Information Control Problems in Manufacturing, May 2012, Bucharest, Romania. ⟨10.3182/20120523-3-RO-2023.00364⟩. ⟨hal-01169980⟩
