Elise D.

Using emotion as feedback in app development

On September 3rd, researchers from the fields of Social Robotics and Human-Robot Interaction gave lectures at Aldebaran.

The video here is from the talk ‘Emotional Machine Learning’ given by Dr. Angelica Lim, software engineer at Aldebaran. To give you guys a bit of context, Angelica and her team, the expressivity team, are currently working on three topics: the first is making the interaction between the human and the robot easy and clear to understand; the second is using emotion as feedback to improve that interaction; and the third is developing ‘bricks of interaction’, that is, helping the robot make connections between the interactions it observes in order to give them a higher level of meaning.

The lecture, as you will see, covers some of their work with Pepper on emotion detection: what it means, how it is made possible, and what it could represent for developers.

As Angelica mentioned in her talk, the goal for Pepper is to accurately detect the user’s emotions and to react to them in an appropriate way.

The ALMood module described in the presentation was developed with NAOqi 2.3 and currently runs on Pepper. As explained in our documentation, ALMood gathers information from other modules, such as ALGazeAnalysis (for head angles), ALFaceCharacteristics (for face and smile characteristics) and ALVoiceEmotionAnalysis (for voice emotion).

ALMood Method List

Pepper observes different cues from the user and the interaction, such as smile degree, facial expression, head attitude, the words used or contact with its tactile sensors, and combines them to estimate the user’s overall emotion.
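If you want to poke at this fused estimate yourself, here is a minimal sketch using the NAOqi Python SDK. The robot address is a placeholder, and currentPersonState() is taken from the ALMood method list above; check the documentation for the exact structure of the value it returns.

# -*- coding: utf-8 -*-
# Minimal sketch: connect to ALMood and print its current estimate of
# the user's state. ROBOT_IP is a placeholder; currentPersonState() is
# taken from the ALMood method list, see the documentation for the
# exact structure of the value it returns.
from naoqi import ALProxy

ROBOT_IP = "pepper.local"  # replace with your robot's address

mood = ALProxy("ALMood", ROBOT_IP, 9559)

# ALMood aggregates the cues above (smile degree, head attitude,
# voice emotion, tactile contact...) into one estimate.
print(mood.currentPersonState())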

We believe this feedback offers great opportunities to explore in application development. As Angelica said:

“Part of the reason why we are making these mood metrics is so developers can use them in their application”

In addition to all the data and sensors already available when developing robotics applications, you now have emotional data. Since applications for Pepper or NAO are all based on interacting with people, adding emotion as a new metric to your application could allow you to optimize that interaction: the robot adapts to the user’s emotional reactions, strengthening the connection and improving the experience.

Using ALMood with AppsAnalytics gives you emotional feedback you can use to optimize your applications. Did this particular sentence fail to make the person smile at all? Well, now you know it, and you can change it to something that will have a more positive impact!
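As a sketch of that feedback loop, the snippet below plays a few lines of dialogue and records the detected reaction for each one, so you can spot the content that falls flat. The lines are placeholders, and forwarding the results to AppsAnalytics is left as a comment, since that part depends on your own application setup.

# -*- coding: utf-8 -*-
# Sketch: record the emotional reaction to each line of dialogue to
# find out which content works. The lines are placeholders; how you
# forward the results to AppsAnalytics depends on your application.
from collections import Counter
from naoqi import ALProxy

ROBOT_IP = "pepper.local"  # placeholder

tts = ALProxy("ALTextToSpeech", ROBOT_IP, 9559)
mood = ALProxy("ALMood", ROBOT_IP, 9559)

lines = [
    "Did you know I can feel your mood?",
    "I never lose at rock-paper-scissors. I have fast hands.",
]

reactions = {}
for line in lines:
    tts.say(line)
    # Detect the mood of the person right after the stimulus,
    # as in the sample behavior described below.
    reactions[line] = mood.getEmotionalReaction()

# Send `reactions` to your analytics backend here, then review which
# lines got a negative or neutral reaction and rewrite them.
print(Counter(reactions.values()))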

The possibilities are huge, and we hope you are as excited as we are about them!

To get started with ALMood, download the sample_get_mood.crg behavior. This sample shows the main steps: the extractors are started in Active mode; the robot makes a joke or a comment; then the robot detects the resulting mood of the person (positive, negative or neutral) during the 3 seconds that follow.
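Here is how those three steps could look in a standalone Python script. It is only a sketch: the extractor start-up is left as a comment because the exact calls are in the .crg behavior, and the joke and follow-up lines are placeholders.

# -*- coding: utf-8 -*-
# Sketch of the three steps of sample_get_mood, outside Choregraphe.
from naoqi import ALProxy

ROBOT_IP = "pepper.local"  # placeholder

tts = ALProxy("ALTextToSpeech", ROBOT_IP, 9559)
mood = ALProxy("ALMood", ROBOT_IP, 9559)

# 1. Start the extractors in Active mode; see the sample_get_mood.crg
#    behavior for the exact calls used at this step.

# 2. The robot makes a joke or a comment.
tts.say("I would tell you a chemistry joke, but I know I would get no reaction.")

# 3. Detect the resulting mood of the person during the 3 seconds
#    that follow.
reaction = mood.getEmotionalReaction()

if reaction == "positive":
    tts.say("I knew you would like that one!")
elif reaction == "negative":
    tts.say("Tough crowd! I will keep practicing.")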

You can also check out the Python tutorial for ALMood in the documentation. Do not hesitate to share your results with us on the forum; our teams will gladly discuss them with you guys!
