The Nao Robot

During my work at TU/e, I had the chance to collaborate with the Orion School. From this collaboration we designed a couple of simple exercises for Orion's therapist to use in her daily work with the children.

As the robot is an attractive gadget for children, they had no trouble approaching it and answering its questions. The children even wanted to shake hands with the robot at first sight.

The humanoid robot we used, a Nao manufactured by Aldebaran, is a 60 cm robot that the children find "charming". Its features include a speech recognition system (it can understand words), a voice synthesizer (it can talk), two cameras (it can see), touch sensors (it can feel touch), lights on its face, head, and chest, and motors (it can move, walk, and even grab small objects with its hands).

We used the robot as an element to make the therapy more attractive. As the children fell in love with the robot from the very first moment, they were willing to collaborate in all the proposed exercises.

The exercises

Over the four months of our collaboration, we designed two exercises, called Conversations and Emotions. The aim of both exercises was to teach the children something while keeping them comfortable and collaborative.

The setup was as simple as possible: a tablet and the robot. We used the tablet as an interface to the robot, allowing the children to explore the robot's capabilities.


Conversations

This exercise was designed to promote conversation with the autistic children. In this case, the therapist used the tablet, writing sentences live for the robot to say, or selecting the speech from a list of predefined (and customizable) sentences.

The robot was programmed to "breathe" while it was waiting, avoiding the off-putting effect of a silent, motionless robot in waiting mode. Different movements (like the amazing air guitar) were also offered through the tablet, to make the interaction more fun and relaxed.

While the robot was speaking, it also moved according to different emotions or attitudes. So the robot could say "Hello" in a happy or a sad way. With just these few elements, movements and sentences, we got quite interesting results.
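The idea of pairing a sentence with an attitude can be sketched in a few lines. This is only an illustrative model of the concept, not the actual therapy software: the animation names and the function are invented for this example, and a real implementation would drive the robot's motors instead of returning strings.

```python
# Illustrative sketch: a sentence is paired with an attitude, and the
# robot plays the matching animation while speaking. All names here
# are hypothetical.

ANIMATIONS = {
    "happy": "raise_arms",   # assumed animation name
    "sad": "lower_head",     # assumed animation name
    "neutral": "breathe",    # the idle "breathing" loop used while waiting
}

def say_with_attitude(sentence, attitude="neutral"):
    """Return the (animation, sentence) pair the robot would perform."""
    animation = ANIMATIONS.get(attitude, ANIMATIONS["neutral"])
    return animation, sentence

print(say_with_attitude("Hello", "happy"))  # ('raise_arms', 'Hello')
```

An unknown attitude falls back to the neutral "breathing" animation, so the robot never freezes into the silent, motionless pose described above.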

The main screen of the "conversations" exercise


Emotions

At Orion, the therapist was teaching the autistic children to identify human emotions, as these children usually have difficulty reading other people's body language and identifying whether they are happy or sad.

So we created an exercise with the robot to help them identify emotions. Since the robot does not convey as much information as a human being when it moves (autistic children often get very confused by the huge amount of information our faces transmit), it was a good candidate for this exercise.

The robot performed emotions using only body language; no sound or lights were used. In total, 12 emotions were programmed into the robot, divided into four groups: happy, sad, angry, and fearful emotions.

The exercise is as follows: the robot performs an emotion, and the child has to select the performed emotion from several choices. The robot encourages or congratulates the child after wrong and right answers.

As every child is different, the number of times the child has to identify an emotion and the number of choices were configurable through a menu.
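The round-building logic described above can be sketched as follows. This is a simplified model under stated assumptions: the source only says there were 12 emotions in four groups, so the individual emotion names below are invented for illustration, and the real software would trigger the robot's performance rather than just return strings.

```python
import random

# Illustrative sketch of one round of the emotions exercise:
# pick a target emotion, then build a configurable number of
# choices that include the target. Emotion names are hypothetical.

EMOTIONS = {
    "happy": ["joy", "excitement", "contentment"],
    "sad": ["sadness", "disappointment", "grief"],
    "angry": ["anger", "frustration", "rage"],
    "fear": ["fear", "worry", "fright"],
}
ALL_EMOTIONS = [e for group in EMOTIONS.values() for e in group]  # 12 in total

def build_round(n_choices, rng=random):
    """Pick a target emotion and n_choices options including it."""
    target = rng.choice(ALL_EMOTIONS)
    distractors = rng.sample(
        [e for e in ALL_EMOTIONS if e != target], n_choices - 1
    )
    choices = distractors + [target]
    rng.shuffle(choices)  # so the target is not always in the same slot
    return target, choices
```

Making `n_choices` a parameter mirrors the configurable menu: the therapist could show fewer choices to a child who finds the task hard, and more to one who finds it easy.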

Main screen of the emotions exercise

Alternatively, for testing and validation purposes, the tablet could work in an off-line mode, showing videos of one of the instructors performing the emotion instead of using the robot.


Although we took notes on everything that happened during the exercises to measure the experimental results, I will never forget the human outcomes we saw in the children.

All of them were excited and happy on the days of the experiments, and a few of them started asking for the robot two or three days before the therapy. In total, 5 children between 8 and 13 years old participated in 5 days of experiments. Most of them paid more attention to the exercises and were more willing to collaborate and receive feedback from the robot.

I want to make special mention of something that happened in parallel with the conversations exercise: the children used the robot as an avatar to express their feelings. They wrote sentences about how they felt on the tablet, and then made the robot say them. Some of these sentences were questions addressed to themselves: the children used the robot to ask themselves how they felt, so that they could express their feelings through the robot.
