
See and Listen: discover an app for Pepper

Let’s say you just bought Pepper. No, in fact, let’s say you bought it a few weeks ago. You’ve been waiting for it ever since. And today’s the big day: Pepper is arriving at your home. You unpack it, you turn it on, and you expect the magic to happen! But how do you interact with Pepper? Will it see your face? Understand your voice? React to your commands?


NAOqi 2.4.3 for Pepper is available

Upgrade your Pepper and developer tools now! We are very pleased to announce the release of NAOqi 2.4.3 for Pepper. Here are the most important elements to notice:

- This release fixes important audio bugs (synchronization, speech recognition, etc.)
- This release provides a new feature on the tablet to easily perform a factory reset
- This release supports a new cool device: the charging station

Please read below to get more details. You can download Choregraphe and the developer tools in the resources section. This release fixes important bugs in the audio system. You may have noticed poor synchronization between the text-to-speech and the automatic speech recognition, or other dialog bugs that made interaction with Pepper erratic or caused the robot to stop talking. Many of those issues have been fixed, so you will now enjoy a smoother interaction with your robot. We are also pleased to announce that this version supports a new “cool” device, the charging station! Once the chargin


MGF 2016: Robotics and the Game Industry

For the second year, Aldebaran attended the Mobile Games Forum, held on January 20th and 21st during the London Mobile Games Week. For its 13th edition, MGF gathered together various game developers and publishers as well as platform and service providers. Over the course of the two-day event, the trends and best practices of the mobile games industry were discussed and highlighted. Among the most innovative devices and games, we decided to introduce Pepper, our special guest, to the audience visiting our booth during the event.
Pepper wore his Aldebaran hoodie for the occasion


A week in Tokyo

A few days ago, we shared on the blog an overview of the Pepper App Challenge and Innovation Challenge. The team was in Tokyo to attend the finale, and also took it as an opportunity to discover and share what has been going on around Pepper in Japan. From application development to Pepper deployments in shops, we tried to capture in a video a glimpse of Japan’s excitement for Pepper. In the video you will find a recap of the Challenge finale, a tour of the Akihabara Atelier, examples of Pepper applications in stores and an overview of the IREX exhibition. Feel free to give us your feedback!


Pepper App Challenge & Pepper Innovation Challenge: Application development in Japan

On November 28th, at the Belle Salle event hall in Tokyo’s Shibuya district, hundreds of people gathered to attend the finals of the Pepper App Challenge and the Pepper Innovation Challenge: a close-up on a major developer event. After a successful first edition in February, SoftBank Robotics launched both events on September 18th, 2015. Developers had until October 31st to submit their applications to one of the challenges.


Meet Rodolphe Gelin, EVP Chief Scientific Officer at Aldebaran

After 7 years working at Aldebaran, Rodolphe was recently appointed to the position of Chief Scientific Officer, an opportunity to learn more about his vision and Aldebaran’s future. Rodolphe, can you start by telling us a bit about your background? Sure, I graduated from the School of Civil Engineering and hold a DEA (Research Master’s Degree) in Artificial Intelligence from the University of Paris VI. Out of school I joined the French Atomic Energy Commission (Commissariat à l’Energie Atomique: CEA), where I worked on the control of mobile robots for service applications, cleaning and helping the disabled. In 1998, I took charge of the laboratory’s service robotics department, and in 2001 I was appointed head of cognitive service robotics and interaction at CEA LIST (technological research division) in Saclay. Between 2006 and December 2008, I was responsible for the Interactive System program, where I helped develop important partnerships within the robotics industry. As a member of the European CARE project, I handled the European professional service robotics roadmap. On behalf of the ISO (International Organization for Standardiz


NAOqi 2.4.2 available for Pepper

Hello community,
  This month is the month of releases! On Tuesday 20th, we announced the release of NAOqi 2.1.4 for NAO. Today, it’s Pepper’s turn with the release of NAOqi 2.4.2. You can read the release note


NAOqi 2.1.4 for NAO available

October is just beginning and there are more than 4,000 of you registered in our Developer Program. Thanks a lot for that. We’re glad to see so much interest in our tools! We also noticed an increase in activity on the forum. We’re happy to see more and more of you asking questions, but also answering other people’s questions.
  Today, we’re pleased to announce the release of NAOqi 2.1.4 for NAO. All through summer, our teams have been working h

Object recognition NAO

Using NAO Image Recognition in your application

Sebastien Cagnon has been a Behavior Architect at Aldebaran for 2 years, creating complex apps on NAO and Pepper. He is currently Head of Technical Support in Tokyo. Author of the blog About Robot, he regularly writes blog posts about the application creation process, with helpful resources. Today, we share with you guys his latest blog post about packaging the object recognition feature in an application. You can find the original article on Sebastien's blog, and also find him on Twitter or GitHub. NAO and Pepper have this awesome feature to recognize objects from a database of images. You can create the database quite easily with Choregraphe and upload it to your robot. It's rather simple. For more details on the basics of creating and using a Vision Recognition Database,
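Once a database is uploaded, ALVisionRecognition publishes its matches as an event in ALMemory, and your application code decides what to do with them. Here is a minimal off-robot sketch of that filtering step. Note the assumptions: the `[timestamp, [[label, confidence], ...]]` payload layout and the 0.5 threshold are illustrative placeholders, not the documented `PictureDetected` structure, so check the ALVisionRecognition reference before relying on them.

```python
CONF_THRESHOLD = 0.5  # hypothetical cut-off; tune it for your own database


def pick_labels(event_value, threshold=CONF_THRESHOLD):
    """Extract confident labels from a PictureDetected-style payload.

    Assumed (illustrative) layout: [timestamp, [[label, confidence], ...]].
    The real ALVisionRecognition event structure may differ.
    """
    if not event_value or len(event_value) < 2:
        return []
    return [label for label, conf in event_value[1] if conf >= threshold]


# Off-robot smoke test with fake event data:
sample = [[12345, 678], [["duck", 0.82], ["mug", 0.31]]]
print(pick_labels(sample))  # prints ['duck']
```

Keeping this logic in a plain function, separate from the ALMemory subscription, lets you test the behavior of your app on your laptop before packaging it for the robot.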


Using emotion as a feedback in app development

On September 3rd, lectures were given at Aldebaran by researchers from the fields of social robot interaction and human-robot interaction. The video here is from the talk ‘Emotional Machine Learning’ given by Dr. Angelica Lim, software engineer at Aldebaran. To give you guys a bit of context, Angelica and her team, the expressivity team, are currently working on three topics: the first is making the interaction between the human and the robot easy and clear to understand; the second is using emotion as feedback to improve that interaction; and the third is developing ‘bricks of interaction’, that is, helping the robot make connections between the interactions it observes, to give them a higher level of meaning. The lecture, as you will see, covers some of their work with Pepper on the detection of emotion by the robot: what it means, how it is made possible, and what it could represent for developers. As Angelica mentioned in her talk, the goal for Pepper is to detect the user’s emotion accurately and to react to it in an appropriate way. The ALMood module described in the presentation was developed with NAOqi 2.3
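The “emotion as feedback” loop can be sketched off-robot. Suppose the mood module exposes a single valence estimate in [-1, 1]; an app then maps that score to a reaction it can trigger. Everything below is a placeholder: the valence range, the 0.3 neutral band, and the reaction names are illustrative assumptions, not the actual ALMood API.

```python
def choose_reaction(valence):
    """Map a hypothetical valence score in [-1, 1] to an app-level reaction.

    Positive valence -> reinforce the current behavior; negative -> adapt.
    The 0.3 band is an arbitrary neutral zone, not an ALMood constant.
    """
    if valence is None:
        return "neutral"    # no reliable estimate yet
    if valence > 0.3:
        return "celebrate"  # e.g. keep going, animate happily
    if valence < -0.3:
        return "comfort"    # e.g. apologize, change the topic
    return "neutral"


# Off-robot smoke test:
for v in (0.8, -0.6, 0.1, None):
    print(v, "->", choose_reaction(v))
```

The point of the sketch is the design, not the numbers: the robot’s perception (the valence estimate) is treated as feedback that steers the application, exactly the loop Angelica describes in the talk.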
