Conference Paper, 2012

An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction

Amir Aly

Abstract

In multimodal human-robot interaction (HRI), communication can be established through verbal, non-verbal, and/or para-verbal cues. The linguistic literature shows that para-verbal and non-verbal communication are naturally synchronized; however, the mechanism of this synchronization remains largely unexplored. This research focuses on the relation between non-verbal and para-verbal communication by mapping prosody cues to the corresponding metaphoric arm gestures. Our approach to synthesizing arm gestures uses a coupled hidden Markov model (CHMM), which can be seen as a collection of HMMs characterizing the segmented prosodic feature stream and the segmented rotation feature streams of the two arms' articulations. Experimental results with the Nao robot are reported.
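The coupled-HMM idea from the abstract can be sketched as a forward pass over two chains (one for prosody features, one for arm-rotation features) whose transitions depend on both chains' previous states. Everything below is an illustrative assumption, not the paper's actual model: state counts, transition values, discrete observation symbols, and the factorized coupling are all chosen only to make the structure concrete.

```python
# Minimal sketch of a two-chain coupled HMM (CHMM) forward pass with
# discrete observations. Chain "p" stands in for prosody states, chain
# "g" for gesture (arm-rotation) states; all parameter values are
# illustrative, not taken from the paper.
import itertools

N_P, N_G = 2, 2  # hidden states per chain (assumed)

# Coupled transitions: the next state of EACH chain depends on the
# previous states of BOTH chains (the defining property of a CHMM).
# A_p[(i, j)][i2] = P(p_t = i2 | p_{t-1} = i, g_{t-1} = j); same for A_g.
A_p = {(i, j): [0.7, 0.3] if i == j else [0.4, 0.6]
       for i in range(N_P) for j in range(N_G)}
A_g = {(i, j): [0.6, 0.4] if i == j else [0.3, 0.7]
       for i in range(N_P) for j in range(N_G)}

# Per-chain emissions: B[state][observation symbol].
B_p = [[0.9, 0.1], [0.2, 0.8]]
B_g = [[0.8, 0.2], [0.3, 0.7]]

# Uniform initial distribution over joint states (i, j).
pi = {(i, j): 1.0 / (N_P * N_G) for i in range(N_P) for j in range(N_G)}


def chmm_forward(obs_p, obs_g):
    """Joint likelihood of the two observation streams under the CHMM."""
    states = list(itertools.product(range(N_P), range(N_G)))
    # Initialization with the first observation pair.
    alpha = {(i, j): pi[(i, j)] * B_p[i][obs_p[0]] * B_g[j][obs_g[0]]
             for (i, j) in states}
    # Recursion over time with the factorized coupled transition.
    for t in range(1, len(obs_p)):
        alpha = {
            (i2, j2): sum(alpha[(i, j)] * A_p[(i, j)][i2] * A_g[(i, j)][j2]
                          for (i, j) in states)
                      * B_p[i2][obs_p[t]] * B_g[j2][obs_g[t]]
            for (i2, j2) in states}
    return sum(alpha.values())


# Likelihood of one hypothetical prosody stream paired with one
# hypothetical rotation stream (symbols 0/1 are placeholders).
likelihood = chmm_forward([0, 0, 1], [0, 1, 1])
print(f"{likelihood:.6f}")
```

In the actual paper the observations are continuous prosodic and joint-rotation features and the model is trained on segmented streams; this sketch only illustrates how the two chains are coupled through their joint previous state.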
Main file

Aly-INCOM2012.pdf (1.54 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01169980, version 1 (02-07-2015)

Licence

Public Domain

Cite

Amir Aly, Adriana Tapus. An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction. The 14th IFAC Symposium on Information Control Problems in Manufacturing, May 2012, Bucharest, Romania. ⟨10.3182/20120523-3-RO-2023.00364⟩. ⟨hal-01169980⟩

Collections

ENSTA ENSTA_U2IS