Aaliyah, a Robot with Emotions
In the Aaliyah project, a robot was developed that communicates with its environment through gestures and emotions.
The robot is able to express a range of emotions. So that the emotions can be programmed freely, they were implemented as facial actions based on Paul Ekman's Facial Action Coding System. The head can rotate around all three rotation axes and move back and forth.

The robot observes its environment with two cameras and follows objects with its eyes and its head. To this end, the direction of the eyes can be set, and the upper and lower eyelids can be moved independently. The lips have six degrees of freedom, and the mouth has one additional degree of freedom. Each eyebrow has three degrees of freedom. In addition, the nose, the cheek muscles, and the ears can be moved separately.
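To illustrate how emotions can be programmed as facial actions, the following sketch maps emotions to Facial Action Coding System action units (AUs) and blends them into normalized actuator intensities. This is a hypothetical example, not the project's actual code: the emotion-to-AU table follows Ekman's conventions, but the intensity values and the blending scheme are assumptions.

```python
# Hypothetical mapping from emotions to FACS action units (AU number -> intensity in [0, 1]).
# AU numbers follow Ekman's coding system; the intensities here are illustrative.
EMOTION_TO_AUS = {
    "happiness": {6: 0.8, 12: 1.0},                   # cheek raiser, lip corner puller
    "surprise":  {1: 1.0, 2: 1.0, 5: 0.7, 26: 0.6},   # brow raisers, upper lid raiser, jaw drop
    "sadness":   {1: 0.6, 4: 0.5, 15: 0.8},           # inner brow raiser, brow lowerer, lip corner depressor
}

def blend_emotions(weights):
    """Blend weighted emotions into one AU activation map.

    weights: {emotion: weight in [0, 1]} -> {AU: intensity clamped to [0, 1]}
    """
    aus = {}
    for emotion, weight in weights.items():
        for au, intensity in EMOTION_TO_AUS[emotion].items():
            aus[au] = aus.get(au, 0.0) + weight * intensity
    # Clamp so that mixed expressions never exceed an actuator's range.
    return {au: min(1.0, value) for au, value in aus.items()}
```

A mixed expression such as `blend_emotions({"happiness": 0.5, "surprise": 0.5})` then yields partial activations of both emotions' action units, which a lower layer would translate into servo positions for the lips, eyelids, and eyebrows.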