Thesis. Year: 2014

Towards an Interactive Human-Robot Relationship: Developing a Customized Robot Behavior to Human Profile.

Vers une relation Homme-Robot Interactive : développement d'un comportement du Robot adapté au Profil de l’Homme

Amir Aly

Abstract

Robots are becoming increasingly present in our lives and society, and many challenges arise when we try to use them in a social context. This thesis focuses on how to generate robot behavior adapted to the human's profile, so as to enhance the human-robot relationship. This research addresses a wide range of complex problems, from analyzing and understanding human emotion and personality to synthesizing a complete, synchronized multimodal behavior that combines gestures, speech, and facial expressions. Our methodologies were examined experimentally with the NAO robot from Aldebaran Robotics and the ALICE robot from Hanson Robotics.

The first part of this thesis focuses on emotion analysis and discusses its evolutionary nature. The fuzzy nature of emotions poses a major obstacle to defining precise membership criteria for each emotion class. Fuzzy logic therefore looks appropriate for modeling these complex data sets, as it imitates human reasoning by using a descriptive, imprecise language to cope with fuzzy data. The variation of emotion expressivity across cultures, and the difficulty of including many emotion categories in a single database, make an online emotion recognition system a critical need. A new online fuzzy-based emotion recognition system using prosodic cues was developed to detect whether an expressed emotion matches one of the previously learned emotion clusters, or constitutes a new cluster (not learned before) that requires a new verbal and/or nonverbal action to be synthesized.

The second part of this thesis focuses on personality traits, which play a major role in human social interaction. Several studies have examined the long-term effect of the extraversion-introversion trait on the multimodal behavior a human generates. This trait can therefore be used to characterize the combined verbal and nonverbal behavior of a human interacting with a robot, allowing the robot to adapt its generated multimodal behavior to the interacting human's personality. Depending on the context of interaction, this adaptation could follow either the similarity attraction principle (individuals are more attracted to others with similar personality traits) or the complementarity attraction principle (individuals are more attracted to others whose personalities complement their own). In this thesis, we examine the effects of multimodal versus unimodal generated behavior on interaction, and we adopt the similarity attraction principle, as it considers the effect of the initial human-robot interaction on the developing relationship (e.g., friendship), which makes it more appropriate for our interaction context. The detection of the human's personality trait as introverted or extraverted is based on a psycholinguistic analysis of the human's speech, from which the characteristics of the generated robot speech and gestures are defined.

The third part of this thesis focuses on gesture synthesis. The generation of appropriate head-arm metaphoric gestures does not follow a specific linguistic analysis; it is mainly based on the prosodic cues of human speech, which correlate strongly with emotion and with the dynamic characteristics of metaphoric gestures. The proposed system uses Coupled Hidden Markov Models (CHMMs), which contain two chains modeling the characteristic curves of the segmented speech and gestures.
When a test speech signal is presented to the trained CHMM, a corresponding set of adapted metaphoric gestures is synthesized. An experimental study (in which the robot adapts the emotional content of its generated multimodal behavior to the context of interaction) is set up to examine the emotional content of the robot's generated metaphoric gestures directly through human feedback. In addition, we examine the effects of the facial expressions generated using the expressive face of the ALICE robot, and of the emotional speech synthesized with the MaryTTS text-to-speech toolkit, on enhancing the robot's expressivity, and we compare the effects on humans of multimodal interaction versus interaction that employs fewer affective cues. More generally, research on understanding the human's profile and generating adapted robot behavior opens the door to other topics that need to be addressed in depth. These include, but are not limited to, developing a computational cognitive architecture that can simulate the functionality of the human brain areas involved in understanding and generating speech and physical actions appropriately to the context of interaction, which constitutes a future research direction for this thesis. Minimal illustrative sketches of these three components (emotion clustering, personality-based adaptation, and gesture decoding) are given below.
A key reason for considering emotion in human-robot interaction as a basis for the robot's generated behavior is the fuzzy nature of emotion. This fuzziness can lead the robot to misrecognize an observed emotion and generate behavior that is inappropriate to the context of interaction. It opens the door to a new fuzzy-based methodology for detecting emotion more precisely. This methodology decides whether the observed emotion requires a new behavior to be synthesized, in case it constitutes a new class not learned before, or whether it can be matched to an existing behavior in the robot's action memory. On the other hand, the long-term effect of personality on human verbal and nonverbal behavior makes it a reliable determining factor for the robot's synthesized multimodal behavior. Consequently, adapting the robot's generated verbal and nonverbal behavior to the human's personality, as introverted or extraverted, could well improve the human's attraction to the robot. The process of generating the robot's synchronized multimodal behavior through speech, gestures, and facial expressions according to the human's profile relies on a computational cognitive model. This model simulates the human cognitive functions that learn the goal and mechanism of the multimodal actions performed by humans in the surrounding environment. Consequently, during an interaction, the robot becomes able to synthesize, by itself, a multimodal behavior based on the human's profile, the context of the interaction, and the experiences stored in its action memory.
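To make the first part concrete, here is a minimal sketch of the cluster-membership decision, assuming fuzzy c-means-style membership degrees, toy prosodic features (mean pitch in Hz, energy, speech rate), invented cluster centers and labels, and an arbitrary acceptance threshold. It illustrates the idea only; it is not the thesis's actual online recognition system.

    import numpy as np

    def fuzzy_memberships(x, centers, m=2.0):
        """Fuzzy c-means-style membership degrees of sample x to each center."""
        d = np.linalg.norm(centers - x, axis=1) + 1e-12   # distances to centers
        inv = d ** (-2.0 / (m - 1.0))                     # FCM membership kernel
        return inv / inv.sum()                            # normalize to sum to 1

    def classify_or_flag_new(x, centers, labels, threshold=0.6):
        """Assign x to a learned emotion cluster, or flag it as a new cluster
        that needs a new verbal/nonverbal behavior to be synthesized."""
        u = fuzzy_memberships(x, centers)
        k = int(u.argmax())
        if u[k] < threshold:          # no learned cluster fits well enough
            return None, u            # -> treat as a new, unlearned emotion class
        return labels[k], u

    # Toy prosodic features per cluster center: (mean F0 in Hz, energy, speech rate).
    # Centers and labels are illustrative values, not learned from data.
    centers = np.array([[220.0, 0.8, 5.5],    # "joy"
                        [140.0, 0.3, 3.0],    # "sadness"
                        [250.0, 0.9, 6.5]])   # "anger"
    labels = ["joy", "sadness", "anger"]

    print(classify_or_flag_new(np.array([225.0, 0.75, 5.4]), centers, labels))  # -> "joy"
    print(classify_or_flag_new(np.array([180.0, 0.50, 4.0]), centers, labels))  # -> None (ambiguous)

In practice an absolute distance test would also be needed alongside the relative memberships, since fuzzy c-means memberships always sum to one even for far outliers.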
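For the second part, the following sketch shows how a crude extraversion estimate from transcript statistics could drive behavior adaptation under the similarity attraction principle. The psycholinguistic cues (verbosity, lexical richness), their weights, and the behavior parameter ranges are all hypothetical stand-ins for the thesis's actual psycholinguistic analysis.

    def extraversion_score(transcript: str) -> float:
        """Crude 0..1 extraversion estimate from simple transcript statistics."""
        words = transcript.lower().split()
        if not words:
            return 0.5
        n = len(words)
        richness = len(set(words)) / n     # introverts tend toward richer vocabulary
        verbosity = min(n / 100.0, 1.0)    # extraverts tend to produce more words per turn
        # Weighted combination; the weights are illustrative, not fitted.
        return max(0.0, min(1.0, 0.7 * verbosity + 0.3 * (1.0 - richness)))

    def adapt_behavior(score: float) -> dict:
        """Similarity attraction: match the robot's expressivity to the user's."""
        return {
            "speech_rate_wpm": 120 + 60 * score,    # faster speech for extraverts
            "volume_gain": 0.5 + 0.5 * score,       # louder speech
            "gesture_amplitude": 0.3 + 0.7 * score  # wider head-arm gestures
        }

    turn = "well yes I really think we should go out and talk to everyone today"
    print(adapt_behavior(extraversion_score(turn)))

Under the complementarity attraction principle, the mapping in adapt_behavior would simply be inverted (an extraverted user would get a calmer, more introverted robot), which is why the choice between the two principles depends on the interaction context.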
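For the third part, the sketch below decodes gesture states from speech prosody with a two-chain coupled HMM by flattening the coupled transitions into a joint-state HMM, running Viterbi on the speech observations alone, and reading off the gesture chain. The state counts, the randomly initialized parameters, and the one-dimensional Gaussian pitch emissions are illustrative assumptions standing in for a trained model.

    import numpy as np

    n_s, n_g = 4, 4                    # hypothetical numbers of speech / gesture states
    rng = np.random.default_rng(0)

    def normalize(a, axis=-1):
        return a / a.sum(axis=axis, keepdims=True)

    # Coupled transitions: each chain's next state depends on BOTH previous states.
    A_s = normalize(rng.random((n_s, n_g, n_s)))   # P(s_t | s_{t-1}, g_{t-1})
    A_g = normalize(rng.random((n_s, n_g, n_g)))   # P(g_t | s_{t-1}, g_{t-1})
    pi = normalize(rng.random(n_s * n_g))          # initial joint-state distribution

    # Speech-chain emissions only (the gesture chain is hidden at synthesis time):
    # a 1-D Gaussian per speech state over one prosodic feature, e.g. pitch (F0).
    mu = np.array([100.0, 150.0, 200.0, 250.0])    # illustrative state means (Hz)
    sigma = 25.0

    def log_emission(f0):
        return -0.5 * ((f0 - mu) / sigma) ** 2     # log N(f0; mu, sigma), up to a constant

    def decode_gestures(f0_track):
        """Viterbi over joint (speech, gesture) states; return the gesture-state path."""
        # Flatten the coupled transitions into one joint transition matrix.
        A = np.einsum('sgi,sgj->sgij', A_s, A_g).reshape(n_s * n_g, n_s * n_g)
        logA = np.log(A)
        # Joint index j = s * n_g + g, so the speech emission for j is mu[j // n_g].
        delta = np.log(pi) + np.repeat(log_emission(f0_track[0]), n_g)
        back = []
        for f0 in f0_track[1:]:
            scores = delta[:, None] + logA
            back.append(scores.argmax(axis=0))     # best predecessor per joint state
            delta = scores.max(axis=0) + np.repeat(log_emission(f0), n_g)
        # Backtrack the best joint path, then keep only the gesture component.
        path = [int(delta.argmax())]
        for ptr in reversed(back):
            path.append(int(ptr[path[-1]]))
        return [j % n_g for j in reversed(path)]

    print(decode_gestures(np.array([110.0, 160.0, 210.0, 240.0])))

Flattening is the standard exact reduction of a CHMM to an HMM over the product state space; its cost grows with the product of the chains' state counts, which is why dedicated CHMM inference algorithms are used for larger models.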
Main file: Thesis_ALY.pdf (5.26 MB)

Dates and versions

tel-01128923, version 1 (10-03-2015)

License

Public domain

Identifiers

  • HAL Id: tel-01128923, version 1

Cite

Amir Aly. Towards an Interactive Human-Robot Relationship: Developing a Customized Robot Behavior to Human Profile. Computer Science [cs]. ENSTA ParisTech, 2014. English. ⟨NNT : ⟩. ⟨tel-01128923⟩
506 views
526 downloads
