
The Communication Board

The Communication Board and EmotiBoard project

The Communication Board - EmotiBoard is an interdisciplinary project led by Denis Lalanne and Fabien Ringeval at the Department of Informatics, in collaboration with Prof. Juergen Sauer and Andreas Sonderegger from the Department of Psychology. The goal of this project is to develop a methodology for emotionally enriching remote collaborative interactions: members of virtual teams are indeed less productive and less affectively committed than co-located teams, in particular in difficult work situations such as intercultural teamwork. The project involves both real-time emotion recognition from multimodal inputs (speech and electrodermal activity - EDA) and an adapted visual representation serving as emotion feedback, so that each user sees the emotions of his/her remote partner. Various studies were performed with EmotiBoard to investigate the benefits of such emotional feedback for improving the quality of remote collaborative interactions, in particular for emotion awareness (i.e. accuracy in perceiving a remote teammate's emotions) and for social behaviours such as agreement, engagement, dominance, performance, and rapport.
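As a rough illustration, an emotion feedback of this kind can be sketched as a mapping from continuous arousal/valence estimates to a display colour. The function name, the value ranges, and the colour scheme below are illustrative assumptions for the sketch, not the project's actual implementation:

```python
# Minimal sketch: map continuous arousal/valence estimates (assumed to lie
# in [-1, 1]) to an RGB colour that could be shown as emotion feedback.
# Names, ranges, and colour choices are illustrative assumptions only.

def emotion_to_rgb(arousal: float, valence: float) -> tuple:
    """Red grows with arousal, green with positive valence,
    blue with negative valence; channels are in 0..255."""
    clamp = lambda x: max(-1.0, min(1.0, x))
    arousal, valence = clamp(arousal), clamp(valence)
    red = int(255 * (arousal + 1) / 2)      # -1 -> 0, +1 -> 255
    green = int(255 * max(valence, 0.0))    # only positive valence
    blue = int(255 * max(-valence, 0.0))    # only negative valence
    return (red, green, blue)
```

For example, an excited and happy partner (high arousal, positive valence) would map to a warm yellow, while a calm and displeased one would map to blue.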

In a first study, “Computer-supported work in partially distributed and co-located teams: the influence of mood feedback”, we showed the usefulness of EmotiBoard as a mood feedback tool: it helped team members better understand each other's mood and improved other outcome measures of teamwork.

In a second study, “On the Influence of Emotional Feedback on Emotion Awareness and Gaze Behavior”, we showed that an EmotiBoard user's gaze behaviour can be used to predict the emotion (arousal and valence) of his/her remote teammate with more than 75% accuracy.

In a third study, “Introducing the RECOLA Multimodal Corpus of Remote Collaborative and Affective Interactions”, we built a new multimodal database to provide models for an automatic emotion recognition system, which is used to adapt the emotion feedback.

In addition to these papers, several Master's theses were conducted on different research aspects of the EmotiBoard. In the thesis of Hervé Sierro, we investigated the use of data-driven linguistic information for the automatic recognition of emotion (e.g., voiced and unvoiced speech segments, p-centre-based rhythmic units) and showed the superiority of this approach over traditional prosodic features. In the thesis of Tomasz Jacykiewicz, we performed the automatic recognition of laughter events in speech and analysed the impact of different groups of features (e.g., verbal and non-verbal) on the performance. Finally, in the thesis of Samaneh Soleimani, we used automatic emotion recognition from speech and EDA to generate different types of emotion feedback visualisations.
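The combination of speech and EDA can be pictured as a simple late-fusion scheme, in which each modality produces its own arousal/valence estimate and the estimates are merged by weighted averaging. The weights, names, and example values below are illustrative assumptions, not the actual thesis implementation:

```python
# Minimal late-fusion sketch: combine per-modality arousal/valence
# estimates by weighted averaging. Weights and names are illustrative
# assumptions only.

def fuse_estimates(estimates, weights):
    """estimates: modality -> (arousal, valence); weights: modality -> float."""
    total = sum(weights[m] for m in estimates)
    arousal = sum(weights[m] * estimates[m][0] for m in estimates) / total
    valence = sum(weights[m] * estimates[m][1] for m in estimates) / total
    return (arousal, valence)

# Example: speech suggests higher arousal than EDA, and is trusted more.
fused = fuse_estimates(
    {"speech": (0.8, 0.2), "eda": (0.4, 0.0)},
    {"speech": 0.7, "eda": 0.3},
)
# fused == (0.68, 0.14)
```

In practice each modality's weight would reflect how reliable that channel is for the dimension being estimated; EDA, for instance, is commonly considered more informative about arousal than about valence.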


The project started in 2011 and will finish at the end of 2013.


The project is funded by IM2.