ENSTA Paris - École nationale supérieure de techniques avancées Paris
Conference paper, 2012

An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction

Amir Aly

Abstract

In multimodal human-robot interaction (HRI), communication can be established through verbal, non-verbal, and/or para-verbal cues. The linguistic literature shows that para-verbal and non-verbal communication are naturally synchronized; however, the natural mechanism of this synchronization is still largely unexplored. This research focuses on the relation between non-verbal and para-verbal communication by mapping prosody cues to the corresponding metaphoric arm gestures. Our approach for synthesizing arm gestures uses coupled hidden Markov models (CHMMs), which can be seen as a collection of HMMs characterizing the segmented stream of prosodic features and the segmented streams of rotation features of the two arms' articulations. Experimental results with the Nao robot are reported.
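The abstract describes coupling a prosody-state chain to gesture-state chains so that arm gestures can be decoded from speech prosody. As a toy illustration only (not the authors' implementation), the sketch below decodes a most-likely gesture-state path from a given prosody-state sequence with a Viterbi pass over a coupled transition tensor P(g_t | g_{t-1}, p_t). All names, state counts, and the random transition tensor are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pros, n_gest = 3, 4  # hypothetical numbers of prosody / gesture states

# Coupled transition tensor A[g_prev, p_now, g_now] = P(g_t | g_{t-1}, p_t),
# normalized over the last axis; a trained CHMM would estimate this from data.
A = rng.random((n_gest, n_pros, n_gest))
A /= A.sum(axis=2, keepdims=True)
pi = np.full(n_gest, 1.0 / n_gest)  # uniform initial gesture-state distribution

def decode_gestures(prosody_states, A, pi):
    """Viterbi decode of the gesture chain driven by a prosody-state sequence."""
    T = len(prosody_states)
    n_g = len(pi)
    log_A = np.log(A)
    delta = np.zeros((T, n_g))          # best log-score ending in each state
    psi = np.zeros((T, n_g), dtype=int) # backpointers
    delta[0] = np.log(pi)
    for t in range(1, T):
        # scores[g_prev, g_now] = delta[t-1, g_prev] + log P(g_now | g_prev, p_t)
        scores = delta[t - 1][:, None] + log_A[:, prosody_states[t], :]
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0)
    # Backtrack the best path
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Usage: a segmented prosody-state sequence yields one gesture state per segment.
gestures = decode_gestures([0, 1, 1, 2, 0], A, pi)
print(gestures)
```

In the paper's setting, the prosody states would themselves be inferred from segmented pitch/energy features, and the decoded gesture states would index arm-rotation segments replayed on the robot; this sketch only shows the cross-chain coupling step.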
Main file: Aly-INCOM2012.pdf (1.54 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01169980, version 1 (02-07-2015)

License

Public domain


Cite

Amir Aly, Adriana Tapus. An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction. The 14th IFAC Symposium on Information Control Problems in Manufacturing, May 2012, Bucharest, Romania. ⟨10.3182/20120523-3-RO-2023.00364⟩. ⟨hal-01169980⟩

Collections

ENSTA ENSTA_U2IS