We aim to build intelligent machines able to handle the affective and social dimensions of interaction with humans sharing a physical and social space.

Research challenges

• Fusion of the linguistic and paralinguistic channels, multimodal fusion

• Cross-corpus experiments, adaptation learning techniques

• Affective dialog systems, long-term relationships

• Social interaction with robots, ethics and affective robotics

Affective and social dimension detection is applied both to human-machine interaction with robots and to the analysis of audiovisual and audio documents such as call-center data. The main research subjects in this area are the identification of emotion and social cues in human-robot interaction, emotion detection from verbal and nonverbal cues (acoustic, visual and multimodal), dynamic user profiles (emotional and interactional dimensions) in dialog for assistive robotics, and multimodal detection of anxiety applied to therapeutic serious games.

In order to design affective interactive systems, experimental grounding is required to study expressions of emotion and social cues during interaction. Unlike emotions, socio-cultural cues are voluntarily controlled. In human interaction, nonverbal elements such as gesture, facial expression and paralinguistic cues are valuable for a more precise understanding of the communicated message. Voice and speech play a fundamental role in social interactions, yet in recent years they have been relatively neglected compared to other aspects of social exchange such as facial expressions or gestures. There is a tendency within emotion-oriented computing to use very exaggerated and unnatural emotional data portrayed by actors. It seems increasingly clear that this strategy is not effective, because the forms of expression that occur in natural interactions are fundamentally different from those that actors generate on command. Since 2001, the work on speech presented in this theme has been based on genuinely naturalistic material. The team was one of the first to grasp the issue, and is one of a very small number of groups that has consistently taken on the challenge of finding, annotating and analysing databases of real-life emotional data. The team has collected and analysed emotional speech databases from financial consultations, calls for medical help and human-robot interactions. Studies have been conducted on various levels of fear (stress, anxiety, panic), anger (annoyance, anger), sadness (disappointment, sadness, depression) and positive feelings (relief, satisfaction, enjoyment, pride). To exploit these rich data, the team has developed analysis techniques that extract spectral, prosodic and affect-burst markers, together with automatic emotion detection systems based on machine learning techniques such as Support Vector Machines (SVM). Recent comparisons show that these systems are on a par with those developed by other members of the international community.
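As a minimal illustration of this kind of pipeline (not the team's actual system), the sketch below trains an SVM emotion classifier on pre-extracted acoustic feature vectors. The feature values, corpus size and label set are hypothetical placeholders; in practice the features would be spectral and prosodic descriptors extracted from the annotated corpora described above.

```python
# Minimal sketch: SVM-based emotion detection from acoustic features.
# The features below are random placeholders standing in for real
# spectral/prosodic descriptors (e.g., F0, energy, MFCC statistics).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_SAMPLES, N_FEATURES = 200, 40               # hypothetical corpus size
EMOTIONS = ["anger", "fear", "sadness", "satisfaction"]

X = rng.normal(size=(N_SAMPLES, N_FEATURES))  # placeholder feature vectors
y = rng.choice(EMOTIONS, size=N_SAMPLES)      # placeholder annotations

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Standardize the features, then fit an RBF-kernel SVM, a common
# choice for emotion classification on acoustic descriptors.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```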

A social robot sensitive to emotions should not only take momentary emotions into account, but should also maintain a representation of the user's emotional and interactional profile across interactions, in order to produce more relevant behavioural responses. As a first step, we have studied how paralinguistic cues affect human-robot interaction, by linking low-level cues computed from speech to an emotional and interactional profile of the user. Being able to predict which specific behaviour is likely to please the user is an asset: for example, a dominant, highly self-confident user does not need to be encouraged to interact, and such encouragement could even be perceived as irrelevant or boring. The system would thus provide a closed interaction loop, in which the robot reacts to the human's emotional message and triggers an emotional response in the human through appropriately chosen behaviours. In many cases, however, voice is not the only cue available for identifying emotions and social stances. In our next steps, we propose to extract multimodal dimensions using gaze tracking (with a webcam), posture detection (with a Kinect 3D sensor) and a few physiological cues such as EEG recorded with non-invasive sensors, in order to improve the performance of our systems.
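To make the idea of such a profile concrete, here is a hedged sketch; the cue names, smoothing factor and decision thresholds are our own illustration, not the team's actual model. Per-utterance paralinguistic cues are folded into running estimates, and the profile then gates the robot's behaviour choice, e.g. withholding encouragement from a dominant, self-confident user as described above.

```python
# Illustrative sketch of a user profile built from low-level speech cues.
# Dimension names, the smoothing factor and thresholds are hypothetical.
from dataclasses import dataclass

ALPHA = 0.2  # exponential smoothing: weight given to the newest utterance

@dataclass
class UserProfile:
    arousal: float = 0.5      # running estimates in [0, 1]
    valence: float = 0.5
    dominance: float = 0.5
    n_utterances: int = 0

    def update(self, cues: dict) -> None:
        """Fold one utterance's paralinguistic cues into the profile."""
        for dim in ("arousal", "valence", "dominance"):
            old = getattr(self, dim)
            setattr(self, dim, (1 - ALPHA) * old + ALPHA * cues[dim])
        self.n_utterances += 1

    def choose_behaviour(self) -> str:
        """Closed-loop policy: pick a robot behaviour from the profile."""
        # A dominant, self-confident user does not need encouragement;
        # offering it anyway could be seen as irrelevant or boring.
        if self.dominance > 0.7:
            return "neutral_acknowledgement"
        if self.valence < 0.4:
            return "empathic_support"
        return "encouragement"

profile = UserProfile()
profile.update({"arousal": 0.8, "valence": 0.3, "dominance": 0.9})
print(profile.choose_behaviour())  # -> "neutral_acknowledgement"
```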

Research subjects developed in this area are:

• Speaker and emotion identification in human-robot interaction

• Emotion detection for analyzing the quality of client/agent interaction in call-center data

• Engagement in human-robot interaction

• Emotion detection based on acoustic, visual and physiological cues for assistive robotics

• Multimodal detection of anxiety for the design of a serious game with a therapeutic purpose

Applications and projects

The detection of affective and social dimensions can be used for human-machine communication with robots, but also for the analysis of audiovisual documents, with applications in health, security, education, entertainment and serious games.

Robotics is a relevant framework for assistive applications thanks to the learning abilities and skills of robots. Human-robot interaction is a hot topic in robotics: this broad research area, encompassing social interaction between robots and humans, poses many challenges to the community. In the near future, socially assistive robotics aims to address critical areas and gaps in care by automating the supervision, coaching, motivation and companionship aspects of one-to-one interactions with individuals from various large and growing populations, including the elderly, children, disabled people and individuals with social phobias, among many others.

Ethical issues, including the safety, privacy and dependability of robot behaviour, are also more and more widely discussed. Deeper ethical reflection must therefore accompany the scientific and technological development of robots, to ensure the harmony and acceptability of their relationship with human beings. We are involved in the ethics working group for research in robotics of CERNA (Allistene's Commission for the Ethics of Research in Digital Sciences and Technologies).

Ethics, goals and societal impact in affective computing is also a central subject of the AAAC SIG Ethics (L. Devillers; B. Schuller, Imperial College London, UK). The AAAC is a professional, worldwide association for researchers in affective computing, emotions and human-machine interaction; formerly the HUMAINE Association, it was founded in June 2007. The ambition of the SIG Ethics is to collect the main questions on ethics, goals and societal impact from the AAAC community.

The Human-Machine Co-evolution pole of the Institute for Digital Society (ISN) at Paris-Saclay (L. Devillers, Ch. Licoppe (I3 Telecom ParisTech)) promotes the idea of distributing intelligence between users and machines (robotics, connected objects and the quantified self, smart homes, etc.). Seen from this perspective, the user learns to use the machine at the same time as the machine adapts to the user, raising questions of acceptability, interaction design and ethics. This joint evolution of technologies and users calls for collaboration between ICT and social science (SHS) researchers throughout projects, from design to evaluation.

Human-Machine Co-evolution pole, ISN colloquium, 9-10 April 2015

Social robotics projects

• ISN TE2R (2015-16): Traces, explanations and responsibility of the robot - The major challenges of robotics in society: understanding and building the digital society; led by the CERDI laboratory (Université Paris-Sud) and LIMSI-CNRS.

• ISN Engagement in social interaction with robots (2015-16): human-robot Quantified Self - Experimenting with innovative ways to strengthen the stimulating power and attachment of human-robot interaction; led by LIMSI-CNRS and I3-Telecom ParisTech.

Collaboration with Aldebaran-Robotics (R. Gelin), Spirops (A. Buendia) 

Collaboration with the LIUM (Y. Esteve), Koç University (M. Sezgin), Dublin University (N. Campbell), UMONS (S. Dupont)

Collaboration with the CEA (Ch. Leroux)

Project with a therapeutic purpose

Cooperation with A. Pelissolo at the Pitié-Salpêtrière Hospital

Affective computing projects

Collaboration with CPU team at LIMSI-CNRS and STAPS (Paris-Saclay).

Collaboration with UNIGE (K. Scherer), with Erlangen University (A. Batliner), with Paris VIII University (C. Pelachaud), with Belfast University (R. Cowie)

Collaboration with SME Voxler (N. Delorme), with IRCAM (X. Rodet)

Projects on emotions with call centers
