Object recognition on NAO

Using NAO Image Recognition in your application

Sebastien Cagnon has been a Behavior Architect at Aldebaran for two years, creating complex apps on NAO and Pepper. He is currently Head of Technical Support in Tokyo. Author of the blog About Robot, he regularly writes blog posts about the application creation process, with helpful resources. Today, we share with you guys his latest blog post about packaging the object recognition feature in an application. You can find the original article on Sebastien's blog, and you can also find him on Twitter or Github. NAO and Pepper have this awesome feature to recognize objects from a database of images. You can create the database quite easily with Choregraphe and upload it to your robot. It's rather simple. For more details on the basics of creating and using a Vision Recognition Database,
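Once the database is on the robot, recognition results arrive through ALMemory. Here is a minimal sketch, assuming the NAOqi Python SDK and the documented `PictureDetected` event; the payload layout, robot address and helper name below are illustrative assumptions, not taken from the article:

```python
# Sketch: reading vision recognition results from ALMemory.
# Assumption: PictureDetected values look like
#   [timestamp, [picture_info, ...]]
# where each picture_info starts with a label list such as
#   ["object_name", "tag", ...].

def extract_labels(picture_detected):
    """Pull object labels out of a PictureDetected event value."""
    if not picture_detected or len(picture_detected) < 2:
        return []
    labels = []
    for info in picture_detected[1]:
        label = info[0]  # e.g. ["my_mug", "kitchen"]
        labels.append("/".join(label))
    return labels


if __name__ == "__main__":
    # On a real robot you would poll ALMemory instead (assumed usage):
    #   from naoqi import ALProxy
    #   memory = ALProxy("ALMemory", "<robot-ip>", 9559)
    #   data = memory.getData("PictureDetected")
    #   print(extract_labels(data))
    sample = [[12345, 67890], [[["my_mug", "kitchen"], 42, 0.8, []]]]
    print(extract_labels(sample))
```

Check the ALVisionRecognition reference for your NAOqi version before relying on the exact event structure.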

Human Emotion with ALMood

Using emotion as a feedback in app development

On September 3rd, lectures were given at Aldebaran by researchers from the fields of Social Robot Interactions and Human-Robot Interactions. The video here is from the talk ‘Emotional Machine Learning’ given by Dr. Angelica Lim, software engineer at Aldebaran. To give you guys a bit of context, Angelica and her team, the expressivity team, are currently working on three topics: the first is making the interaction between the human and the robot easy and clear to understand; the second is using emotion as feedback to improve that interaction; and the third is developing ‘bricks of interaction’, which means helping the robot make connections between the interactions he observes, to give them a higher level of meaning. The lecture, as you will see, covers some of their work with Pepper on the detection of emotion by the robot: what it means, how it is made possible, and what it could represent for developers. As Angelica mentioned in her talk, the goal for Pepper is to be able to detect the user’s emotion accurately and to react to it in an appropriate way. The ALMood module described in the presentation was developed with the NAOqi 2.3
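The "emotion as feedback" idea can be sketched very simply: read a mood score, then branch the app's behavior on it. The snippet below is a minimal illustration; the valence range, the thresholds and the ALMood call shown in the comment are assumptions for the sake of the example, so consult the NAOqi 2.3 ALMood reference for the real API:

```python
# Sketch: turning a mood valence score into app feedback.
# Assumption: valence is a float roughly in [-1.0, 1.0], where
# negative means displeased and positive means pleased.

def reaction_for_valence(valence):
    """Map a valence score to a coarse reaction category."""
    if valence > 0.3:
        return "positive"   # e.g. keep going, the user seems pleased
    if valence < -0.3:
        return "negative"   # e.g. apologize or change strategy
    return "neutral"        # e.g. carry on, maybe probe for feedback


if __name__ == "__main__":
    # On the robot, something like this (assumed API shape):
    #   mood = session.service("ALMood")
    #   valence = mood.currentPersonState()["valence"]["value"]
    #   print(reaction_for_valence(valence))
    print(reaction_for_valence(0.6))
```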

How to master Community 3

How to master Community (part 3/3)

As mentioned in our last how-to article, there is one more step to mastering Community’s latest features. In this blog post, we will focus on the forum. This section is growing more and more every day, thanks to you guys. In order for you to make the most of it, here are a few tips about posting, reading and answering on the forum. The three-point method to optimize an answer: you have a question or an issue with Choregraphe or your robot, or you simply want to talk about something with the community? You have already checked on the forum and didn’t find any answer or similar topic? Well, it’s time to create a brand new one then!
1: create a new topic. Before posting, we will help you make sure there is nothing related to the question you are about to ask. Fill in your topic title and check whether your question has already been answered.
2: check for similar discussions. Then, you just have to fill in the differ


Get your hands on Choregraphe and become a roboticist (yeah for real)

Did you know that you can play the roboticist even if you do not own a robot? I guess I already had your curiosity. Now I have your attention. Indeed, our visual programming interface Choregraphe is available to everyone who wants to get their hands on the gorgeous boxes thing, thanks to a three-month free trial. How do I get Choregraphe? (please tell me) To download Choregraphe, simply sign in on Community, go to the Aldebaran Software page and then click on the Choregraphe suite to find the latest version of the software for Linux, Mac and Windows. You will also have access to older versions (only 1.14.5 for the moment, but we will add more soon). You got the Choregraphe you were looking for? It is now time for you to install it. Th

Sandrine Tourcher and NAO

How to make your applications more intuitive?

We worked with Sandrine Tourcher to establish a list of questions one should always have in mind when designing an app for NAO or Pepper, to create a better interaction between a robot and a human. As User eXperience (UX) experts, she and her team are in charge of all the usability improvements of Aldebaran’s products, from our websites and software to your interactions with the robots. It’s important to know that building a good User eXperience is teamwork between interaction designers, visual designers, developers, usability experts and strategy representatives. In this multidisciplinary team, the UX expert has two roles: to evaluate the experience the user has with our products, in order to identify issues and detect levers for improvement, and to be an actor of those changes as a usability expert. She describes their role as the “user’s advocate”, representing users within the design team to make every product and piece of content intuitive, efficient and easy to use. From this perspective she brings some insights, golden rules and advice that might also be helpful to developers working on robots at home, like you. 1 - What’s the intended audience for what I’m building?


What's new in Choregraphe 2.1

A few days ago we released a new version of NAOqi that runs on both NAO and Pepper. With it came a new version of Choregraphe, through which you can access all the new functionalities of our robots: dialog, facial recognition… We sat down with Victor Paléologue to talk about what’s new in this release. Please introduce yourself and your work at Aldebaran. I’m working with the Desktop Team. We’re in charge of all of Aldebaran’s graphical apps, including Choregraphe, a powerful tool to develop apps for the robot. We also work on Monitor, and some other tools for internal use. My job is to take NAOqi and to create graphical interfaces to control it, with small modules and a little bit of refinement of what’s already in NAOqi. Sometimes I’ll work on NAOqi, for example to make sure we have events on what’s happening with the robot, or the mechanics of launching behaviors. In the end, everything that has to do with the graphical language has to do with what’s happening in NAOqi, and in the robot. So we can thank you for the


How to make your NAO apps magical

Julien Gorrias is emotion director at Aldebaran, which means he’s in charge of making us all fall in love with NAO. He spends a lot of time making sure that NAO and Romeo will make an emotional connection with people. From the way a robot looks to the way he acts, we have to think about that connection if we want him to be more than a machine for the people who will welcome a robot into their home. As essential as it is, it can all seem a bit abstract, especially when you’re working on an app and just trying to make it work. But if we’re to build an emotional robot, “it just works” is not enough. We talked with Julien to find out more about emotional design and how it applies to robots, and came up with these four questions you should ponder when building an app. Finding answers to these questions should help you make your app a little bit more than just useful. Is my app doing something that could only be done with a robot? A humanoid robot is not a laptop. It’s not a smartphone either. He may be of less use if you’re trying to accomplish some specific productive tasks. Don’t try to make him do something that another pl


Programing NAO with Choregraphe

We were at Makerland in Warsaw last week to introduce NAO and Choregraphe to the Polish maker community. Céline and Jessica from Aldebaran ran a workshop for the attendees, and we were lucky enough to have Stephen Chin, Java Evangelist, film this session for us. So if you've never programmed NAO, or simply need a refresher on the fundamentals of Choregraphe, this video is the perfect tutorial for you:


Vision recognition tutorial

NAO has a pair of big beautiful LED eyes, but he also has a couple of cameras to watch the world around him. You can use those so your NAO will learn to recognize shapes and objects. But how exactly can you do that? Don't worry, these few slides will tell you all about it: Vision recognition tutorial from Nicolas RIGAUD. Here's the link to the Vision_Demo project you'll need: vision_demo.crg
