Multimodal Affective Interaction

This axis focuses on the study, detection, simulation, and production of affective and socio-cognitive processes that arise during interactions with intelligent multimodal interfaces. We focus on non-verbal behaviours such as facial expressions, postures, body movements, and eye contact. Within the broader research trend of Affective Computing, we study 1) the detection of affective expressions, 2) the simulation of human reasoning and adaptation to these detected affects, and 3) the expression of emotions by animated virtual agents. This work requires strong collaboration between researchers in cognitive psychology and computer science to extend existing psychological theories of emotion and personality and to turn them into computational models that are implemented and evaluated through experiments.

For instance, we study the impact of stress on applicants' behaviour during job interviews (European project TARDIS, ANR project COMPARSE) and on collaboration within rescue teams (ANR project VICTEAMS), drawing on research in cognitive psychology and computer simulation of models from the human and social sciences. We also study the impact of personality on human behaviour and the possibility of expressing this personality in the behaviour of intelligent agents (for action selection, dialogue strategies, etc.).



Two approaches for training students for job interviews: the TARDIS European project (above) and the MACH project with the MIT Media Lab (below).





