ERCIM News 84


January 2011
Special theme:
Intelligent and Cognitive Systems

AIsoy 1: A Robot that Perceives, Feels and Makes Decisions

by D. García, C. Pallardó, D. Ríos Insua, R. Moreno and A. Redchuk

We have recently developed AIsoy 1, a robot capable of inferring the current state of its environment, and what its user is doing, through voice, vision and a system of sensors. In response, it modifies its emotional state and makes decisions aimed at attaining its objectives. With its numerous functionalities, AIsoy 1 is a social emotional bot with enormous potential as an edutainment tool, a cognitive personal assistant and a therapeutic aid.

Affective computing and affective decision-making are two areas of growing interest within intelligent and cognitive systems. One key application is the development of bots capable of interacting intelligently and emotionally with their users and with other bots, mainly for education and entertainment purposes.

Against this background, we have developed AIsoy 1, a social emotional robot that perceives its surrounding environment and the actions performed by its users. In response, it modifies its emotional state and makes decisions. AIsoy 1 emerged from a project by AIsoy Robotics, in cooperation with researchers from Rey Juan Carlos University and with support from the Spanish CDTI.

AIsoy 1 robot.

At the core of AIsoy 1 is a decision analytic model that guides the robot's decision making. It is based on a built-in multiobjective utility function whose objectives are, in order of importance: the bot's own safety, its energy level, how nicely its user is treating it, its own entertainment and its need to be properly updated; an ordering reminiscent of Maslow's hierarchy of needs. The bot also maintains several learning and forecasting models that predict how the user will react and how the environment will evolve, given the bot's potential action and the recent history of user actions and environmental states. These forecasts are combined with the utility function to approximate the expected utility of each available action, and an action is then chosen at random with probability proportional to its expected utility. This makes the bot's choices less predictable under similar circumstances. The bot's actions include Talk, Sing, Tell a joke, Do nothing and Ask to be charged, among others. This decision making mechanism fires synchronously unless an exception occurs, in which case the bot falls back on built-in reactive rules.
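The scheme above can be sketched as follows. The actual built-in utility function and forecasting models are not published, so the objective weights and the `forecast` interface here are illustrative assumptions; only the structure (weighted multiobjective utility, expected-utility estimation, probability-proportional random choice) follows the text.

```python
import random

# Hypothetical weights reflecting the bot's objective hierarchy (safety first);
# the real utility function is not published, so these values are assumptions.
WEIGHTS = {"safety": 0.35, "energy": 0.25, "treatment": 0.20,
           "entertainment": 0.12, "updates": 0.08}

def utility(outcome):
    """Weighted additive multiobjective utility over outcome scores in [0, 1]."""
    return sum(WEIGHTS[obj] * outcome[obj] for obj in WEIGHTS)

def choose_action(actions, forecast):
    """Choose an action at random with probability proportional to expected utility.

    `forecast(action)` stands in for the bot's learning/forecasting models and
    returns a list of (probability, outcome) pairs for that action.
    """
    expected = []
    for a in actions:
        eu = sum(p * utility(o) for p, o in forecast(a))
        expected.append(eu)
    total = sum(expected)
    r = random.uniform(0.0, total)
    acc = 0.0
    for a, eu in zip(actions, expected):
        acc += eu
        if r <= acc:
            return a
    return actions[-1]
```

Sampling proportionally to expected utility, rather than always taking the maximiser, is what gives the bot its deliberate unpredictability.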

Emotions play a central role in AIsoy 1's artificial life. We have opted for an emotional model that uses basic emotions (such as happy, sad and angry) and mixes them to produce more complex ones. The evolution of emotions takes into account the bot's values, expectations and built-in standards, as well as its previous emotions. Emotions influence forecasts and utility evaluations and, thus, the decisions made. They are displayed through the bot's facial expression (it can move its neck, eyebrows and eyelids, and illuminate its mouth, which includes seventy mini-LEDs capable of displaying numbers, letters and signs) and the colour of its chest, as well as through the pitch, speed and volume of its voice.
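A minimal sketch of the mixing idea follows. The actual emotion dynamics of AIsoy 1 are not published, so the decay rate, the blend rule and the `mischievous` label are illustrative assumptions; the sketch only shows how decaying basic-emotion intensities can combine into a more complex displayed state.

```python
# Basic emotions tracked as intensities in [0, 1]; the choice of three is
# taken from the article's examples (happy, sad, angry).
BASIC = ("happy", "sad", "angry")

def update_emotions(state, stimulus, decay=0.8):
    """Previous emotions decay each step; the current stimulus adds in."""
    return {e: min(1.0, decay * state.get(e, 0.0) + stimulus.get(e, 0.0))
            for e in BASIC}

def dominant_expression(state):
    """Map the blended state to an expression label for the face/chest display."""
    if state["happy"] > 0.5 and state["angry"] > 0.5:
        return "mischievous"  # illustrative complex blend of two basics
    return max(BASIC, key=lambda e: state[e])
```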

The voice interface is a very powerful component of AIsoy 1. It includes an automatic speech recognition (ASR) system, based on a BNF grammar, which helps identify the conversation topic. A dialogue manager then deduces an appropriate response, and, finally, a text-to-speech (TTS) engine synthesises the required sentences. This scheme lets AIsoy 1 maintain a reasonably intelligent conversation with a user, and also facilitates user recognition through voice. Currently, AIsoy 1 only speaks and understands Spanish, but English and Catalan versions will soon be available. AIsoy 1 also incorporates a powerful visual recognition system supporting face detection, user recognition, object tracking and character recognition, thus permitting it to read in a sufficiently stable environment.
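The three-stage voice loop can be sketched as below. The stubs for recognition and synthesis, and the keyword-to-response table, are illustrative assumptions standing in for the real grammar-based ASR, dialogue manager and Spanish TTS; only the pipeline structure follows the text.

```python
# Hypothetical topic -> response table standing in for the dialogue manager's
# knowledge; the real system deduces responses from a BNF-grammar ASR.
RESPONSES = {"hola": "¡Hola! ¿Cómo estás?", "canta": "¡Voy a cantar!"}

def recognise(audio):
    """Stand-in for the grammar-based ASR: maps input to a topic keyword."""
    return audio.strip().lower()

def dialogue_manager(topic):
    """Select a response for the recognised topic, with a fallback."""
    return RESPONSES.get(topic, "No te he entendido.")

def synthesise(text):
    """Stand-in for the TTS stage: returns the sentence to be spoken."""
    return text

def voice_loop(audio):
    """ASR -> dialogue manager -> TTS, as described in the article."""
    return synthesise(dialogue_manager(recognise(audio)))
```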

AIsoy 1 is based on an ARM Cortex-A8 microprocessor running AIROS, our own Linux-based operating system. It includes sensors for temperature, inclination, touch, light and force, along with an integrated camera and audio system. It can communicate with other AIsoy 1 bots through a private radio protocol.

Its built-in functionalities make AIsoy 1 a revolutionary edutainment bot: it may serve as a cognitive personal assistant, be used with children for educational, recreational and therapeutic purposes, and keep elderly people company. It also facilitates communication through social networks such as Facebook and Twitter.

AIsoy 1 is available from the AIsoy web page, and several videos of AIsoy 1 in action can be seen on YouTube.

Links:
http://www.aisoy.es/
http://www.estamoscreandovida.com/
http://www.youtube.com/watch?v=No7MqxUONRs

Please contact:
David Rios Insua
Royal Academy of Sciences, Madrid, Spain
Tel: +34 609718937
