We've all seen pictures of emotional expressions: happy faces, sad faces, angry faces, and more. Aldebaran's Emotion Team is working on detecting these emotional expressions to build Pepper and NAO's emotional engine.
The Emotion Team is composed of specialists and engineers in emotion, machine learning, perception, cognitive science and user experience. We are multicultural and love robots!
Can you introduce the members of your team?
Sure! At the heart of our team is Marine CHAMOUX, who is a computer vision engineer working on multimodal fusion, tying together all sources of emotion, from face, to voice, to body language.
Dr. Laurent GEORGE is an engineer with a background in signal processing; fun fact: he did his Ph.D. on brain-computer interfaces!
Karsten KNESE is also an engineer with a specialty in perception and machine learning on robots. He has lived all over the world, including Korea, Munich and Barcelona.
Kyohei OTA and Audrick FAUSTA are our test engineers, ensuring the robustness and reliability of our software. Kyohei provides invaluable support in Japanese, and you may recognize Audrick FAUSTA from his artistic performances and dance.
Alexia BUCLET has a background in cognitive science, and helps us create user experiences and scenarios to capture emotions in the wild.
I'm Dr. Angelica LIM, and I'm the computer scientist leading the team based on my research on artificial intelligence, robots and emotions.
[Photo: Kyohei OTA, Laurent GEORGE, Angelica LIM, Karsten KNESE, Alexia BUCLET, Marine CHAMOUX (missing: Audrick FAUSTA)]
People often ask us, why is it important for a robot to recognize emotion?
There are many good reasons; here are a few!
Our goal is to have a robot that can use and understand our human language of emotions, instead of humans having to adapt to machines!
Can you tell us more about communication through emotions?
Of course! We all know communication is important. With Aldebaran's robots, we use speech and dialog to communicate, which is already a great step. But in fact, every day we communicate our feelings, intentions, and preferences without even saying a word!
Emotions contribute a large part of our non-verbal language.
From an evolutionary perspective, it may be that expressing emotions was a primitive form of social communication before language emerged.
Sadness, for example, occurs when you have a goal, something stops you from reaching it, and you feel you can't do anything about it.* For instance, imagine the loss of a loved one. Expressing sadness signals to others that something bad has happened, you feel powerless, and may need support. (See the emotion wheel below.)
[Geneva Emotion Wheel]
Anger means that your goal was blocked, yet unlike sadness, you feel there is something that can be done to fix it. A rush of energy. A wrong to make right. For example, imagine seeing a video on cruelty or injustice -- anger can push you to action. Expressing anger signals to others that your goals are not being met, and you're going to do something about it.
In the animal kingdom, the screeches of monkeys and birds signal danger, an implicit message: "I saw danger, be careful!" Like anger, fear is accompanied by a rush of energy. Imagine the uncontrollable yelp when you see a car coming fast and close. Expressing fear is a signal that something bad is happening, and you're not sure you can handle it.
Happiness signals that everything is ok, and all systems are go. Imagine the expression of satisfaction on your face after eating a good meal, or after finishing a tough project. Expressing happiness signals that your goals are being met.
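The four emotions above can be read as points along two appraisal dimensions: whether your goal is being met, and how well you feel you can cope. As a purely illustrative sketch (a hypothetical toy function, not Aldebaran's actual emotion engine), that mapping might look like:

```python
def appraise(goal_met: bool, coping: str) -> str:
    """Toy appraisal-theory mapping (illustrative only).

    coping is the perceived ability to act: "high" (I can do
    something), "low" (I feel powerless), or "unsure".
    """
    if goal_met:
        return "happiness"  # goals met: all systems go
    if coping == "high":
        return "anger"      # goal blocked, but something can be done
    if coping == "unsure":
        return "fear"       # something bad, not sure you can handle it
    return "sadness"        # goal blocked, and nothing to be done

# Example: the loss of a loved one -- goal blocked, feeling powerless.
print(appraise(goal_met=False, coping="low"))  # -> sadness
```

A real system, of course, works the other way around: it must infer these appraisals from faces, voices, and body language, which is exactly the hard multimodal problem the team is tackling.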
Expressions of emotion are a rich form of non-verbal communication. The road ahead of us is still long, but the Aldebaran Emotion Team is working today to make Pepper and NAO the first commercial robots in the world to decipher these fascinating emotional signals, so we can communicate with technology the way we do with our human friends, every day.
*According to the appraisal theory of emotion; cf. the work of K. Scherer.