Engaging human-to-robot attention using conversational gestures and lip-synchronization

  • Authors: Felipe Andrés Cid Burgos, Luis Jesús Manso Fernández-Argüelles, Luis Vicente Calderita Estévez, Agustín Sánchez Domínguez, Pedro Miguel Núñez Trujillo
  • Published in: JoPha: Journal of Physical Agents, ISSN-e 1888-0258, Vol. 6, No. 1, 2012 (Issue dedicated to: Advances on physical agents), p. 2
  • Language: English
  • DOI: 10.14198/jopha.2012.6.1.02
  • Abstract
    • Human-Robot Interaction (HRI) is one of the most important subfields of social robotics. In several applications, text-to-speech (TTS) techniques are used by robots to provide feedback to humans. In this respect, a natural synchronization between the synthetic voice and the mouth of the robot could contribute to improve the interaction experience. This paper presents an algorithm for synchronizing Text-To-Speech systems with robotic mouths. The proposed approach estimates the appropriate aperture of the mouth based on the entropy of the synthetic audio stream provided by the TTS system. The paper also describes the cost-efficient robotic head which has been used in the experiments and introduces the use of conversational gestures for engaging Human-Robot Interaction. The system, which has been implemented in C++ and can perform in real-time, is freely available as part of the RoboComp open-source robotics framework. Finally, the paper presents the results of the opinion poll that has been conducted in order to evaluate the interaction experience.
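
The abstract describes driving the mouth aperture from the entropy of the synthetic audio stream. As a rough illustration of that idea, the sketch below computes the Shannon entropy of the amplitude distribution of a short PCM frame and normalizes it to an aperture value in [0, 1]. It is written in C++ because the paper states the system is implemented in C++, but the function names, the histogram binning, and the normalization by the maximum possible entropy are assumptions for illustration and are not taken from the RoboComp implementation.

```cpp
// Illustrative sketch (not the authors' code): entropy-driven mouth
// aperture from 16-bit PCM frames produced by a TTS engine.
#include <cmath>
#include <cstdint>
#include <cstdlib>
#include <iostream>
#include <vector>

// Shannon entropy of the amplitude distribution within one audio frame.
double frameEntropy(const std::vector<int16_t>& frame, int numBins = 64)
{
    if (frame.empty())
        return 0.0;

    // Histogram of absolute sample amplitudes.
    std::vector<int> hist(numBins, 0);
    for (int16_t s : frame)
    {
        int bin = (std::abs(static_cast<int>(s)) * numBins) / 32768;
        if (bin >= numBins) bin = numBins - 1;
        ++hist[bin];
    }

    // H = -sum p_i * log2(p_i), skipping empty bins.
    double entropy = 0.0;
    for (int count : hist)
    {
        if (count == 0) continue;
        const double p = static_cast<double>(count) / frame.size();
        entropy -= p * std::log2(p);
    }
    return entropy;
}

// Map the entropy to an aperture in [0, 1]; normalizing by log2(numBins)
// (the maximum possible entropy) is an assumption of this sketch.
double mouthAperture(const std::vector<int16_t>& frame, int numBins = 64)
{
    return frameEntropy(frame, numBins) / std::log2(static_cast<double>(numBins));
}

int main()
{
    // A frame with varied amplitudes (speech-like) yields a wider
    // aperture than a near-silent frame.
    std::vector<int16_t> loud;
    for (int i = 0; i < 480; ++i)               // ~30 ms at 16 kHz
        loud.push_back(static_cast<int16_t>(10000 * std::sin(0.3 * i)));
    std::vector<int16_t> quiet(480, 5);

    std::cout << "loud frame aperture:  " << mouthAperture(loud)  << '\n';
    std::cout << "quiet frame aperture: " << mouthAperture(quiet) << '\n';
}
```

In practice such a value would be computed per audio frame and smoothed before being sent to the mouth actuator so the motion stays synchronized with the synthetic voice; the exact smoothing and mapping used by the authors are described in the paper itself.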

