
Artificial emotional intelligence

Emotions are a critical part of human interaction, even when the interaction is with a machine. How can a computer improve a person's posture and performance, and what draws humans to machines that mimic emotional interaction?

Israel Benjamin, from issue 121, September 2008

Friendly imitation

Alien robot

In a back room in the laboratories of the Massachusetts Institute of Technology (MIT), a man sits in front of a strange computer monitor: the monitor is mounted on an arm that can turn it left and right or up and down, move it away from the person or closer to him, and raise or lower it. The arm and the monitor mounted on it are the robot RoCo, reminiscent in shape and movement of the desk lamp from the animation company Pixar. With the help of a built-in camera, the robot can detect and imitate the user's posture (leaning back or forward, the angle of the back and head). It can also be programmed to shake its "head", that is, the computer monitor, in refusal when the user's posture is incorrect and may harm their health.

Do we really need a computer monitor that moves by itself? Isn't it hard enough to read text on a screen that stays still? It turns out that RoCo can encourage people to adopt healthier sitting postures by exploiting a person's tendency to mimic the posture of whoever is sitting in front of them. By gradually changing its own "posture" (leaning forward or backward, upright or slumped toward the table), the robot leads the person sitting in front of it to a more correct position, as the sketch below illustrates.
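As a rough illustration of this "leading" behaviour (and not RoCo's actual control software), the loop below repeatedly nudges the monitor a small step toward a healthier target posture and relies on the user mirroring it. The function names, angles and timing are assumptions made for the sketch.

```python
# Hypothetical sketch of posture "leading"; not RoCo's real control code.
# estimate_user_lean_deg and set_monitor_lean_deg stand in for the
# camera-based posture estimate and the arm's motor commands.

import time

TARGET_LEAN_DEG = 0.0   # the upright posture we want the user to drift toward
STEP_DEG = 2.0          # small step per adjustment, so the change is barely noticed
INTERVAL_S = 30.0       # seconds between adjustments


def lead_user_posture(estimate_user_lean_deg, set_monitor_lean_deg):
    """Nudge the monitor, step by step, toward the target posture.

    The person in front of the screen tends to mirror the monitor's
    "posture", so many small steps lead them to a healthier position
    without any abrupt movement.
    """
    lean = estimate_user_lean_deg()      # start from the user's current posture
    set_monitor_lean_deg(lean)
    while abs(lean - TARGET_LEAN_DEG) > STEP_DEG:
        time.sleep(INTERVAL_S)
        lean += STEP_DEG if lean < TARGET_LEAN_DEG else -STEP_DEG
        set_monitor_lean_deg(lean)
```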

In another demonstration of RoCo's effect on its users, one group of subjects was given an easy-to-solve problem and another group was given an unsolvable one. Each participant was then presented, on RoCo's screen, with a further series of problems. Some of the participants worked while RoCo's monitor was raised, and for the others it was lowered: if RoCo were a person, we would say that in the first condition it was sitting upright in its chair, and in the second condition it was slumped over the table. It was found that participants who received the easy first task, and therefore succeeded at it, invested and persisted more in the subsequent tasks when they worked in front of an "upright" robot, while those who initially received the unsolvable task invested and persisted more in front of the "slumped" robot.

This finding corresponds to what is already known: the best posture is the one that suits the person's mood at the time. But people do not always adjust their posture to their emotional state; they also tend to imitate the posture of others around them. These facts explain the findings of the new study, and imply that the participants accepted RoCo as human, at least to the extent needed to make them adapt to its posture¹. RoCo represents a new current in robotics and artificial intelligence: adapting the behavior of software and robots to the emotional state of their human users, and programming robots to imitate emotional states.

The article "On the Interface and on the Face" ("Galileo" 118) dealt with technological developments that include recognizing human facial expressions, as well as creating facial expressions for virtual characters displayed on the computer monitor or by activating motors that move parts of robots' faces. The present article describes how such developments could open new channels of communication and help integrate computers into human society.

Something in common, something missing
In 1871, Charles Darwin published "The Descent of Man" and caused a storm that resonates to this day by claiming that man and apes are descended from a common ancestor. A year later, Darwin continued the same argument in "The Expression of the Emotions in Man and Animals", writing: "The young and the old of distant races, both in man and in animals, express the same mental states by the same movements" (translation by the author of this column). Until the middle of the 20th century, most anthropologists believed, contrary to Darwin's opinion, that facial expressions are acquired, and therefore culture-dependent; but today the research findings show that Darwin was largely right, especially for the main expressions, which include anger, sadness, fear, surprise, disgust and joy (see: Miriam Dishon-Berkowitz, "Spontaneous facial expressions of victory and loss", "Galileo" 103).

The question "Do animals have feelings?" provokes fierce debates² among psychologists, philosophers, neurologists and ethologists (researchers of animal behavior), but there is no doubt that humans have an intuitive tendency to attribute feelings to animals. We tend to attribute emotions even to abstract geometric shapes, if they are presented in a film with appropriate "behavior" (if one dot seems to be chasing another, and the pursued dot shakes and changes color, it is hard to avoid the impression that it feels fear).

The conclusion from this also corresponds to Darwin's claims: emotions are a critical part of human social interaction, and we cling to every clue in order to attribute emotions to the people, animals and even inanimate objects in our environment. Although we all know that cars are emotionless machines, it is not uncommon to hear someone say "the car is nervous this morning". Attributing emotions to moving points, or to cars, relies on only faint clues. In humans and in some animals, communication mechanisms evolved that serve specifically to transmit and receive emotions, in particular facial expressions and vocal characteristics. These mechanisms are active all the time, hence the attribution of emotions to inanimate objects as well. Furthermore, interacting with objects that do not respond to our emotions and do not express emotions themselves is a limited, insufficient and often ineffective interaction.

Notes
1. A simpler explanation can be offered: the subjects brought their heads to the appropriate height for looking at the monitor, and therefore adopted different postures in response to the robot's different positions; however, it seems that the researchers rejected this explanation.

2. When a dog bares its teeth and growls, it is clear to any person that the dog is threatening to attack, and scientists would agree that this communication mechanism has an evolutionary advantage; but is this merely programmed behavior, not linked to any mental state? That is what the controversy revolves around, and it of course also has moral implications. From here on, the word "emotions", and similar words, will be written without quotation marks, even in situations where it is clear that there is no real emotion - for example, in a robot whose entire software amounts to the ability to move its mechanical face in a way that imitates human facial expressions.

Thirsty to connect
The non-verbal channels of communication play an important role in our relationships with the people around us, and even with our pets. Even in telephone conversations we can use at least one non-verbal channel - the intonation of speech - but in our communication with computers and robots the lack of these channels is evident. This shortcoming hinders the integration of computers and robots into human society, makes it difficult for unskilled users to use computerized services, widens the "digital divide" (the social and economic gap between those who have access to the computer network and those who do not), and contributes to technophobia (fear of machines and of their use) - some people are not willing to "communicate" with an answering machine because of its "inhumanity".

People do indeed thirst for connection - even with machines that have not been programmed for any recognition or expression of emotions. Many owners of robotic vacuum cleaners give them names and even dress them up, so that the robot looks more like part of the family. This may also stem from a reluctance to introduce an alien machine into the intimate family space. American soldiers in Iraq developed strong feelings for the robots they use to dispose of bombs and mines, became particularly attached to specific robots, expressed grief when the robots were damaged and invested effort in repairing them. They even took the robots fishing with them, the robot arms holding the rod. A colonel in the United States Army watched a demonstration of a minefield-clearing robot in which the robot lost leg after leg to the mine explosions and continued to drag itself forward. He demanded that the demonstration (which the robot's manufacturers considered particularly successful) be stopped, on the grounds that it was "inhumane".

If all this happens when the robot is not programmed for expression, what will happen when we add such programming? First, these facts show that it will not be difficult to make humans perceive a robot as expressing emotions. Anyone who has watched Pixar's new movie "WALL-E" must have noticed that from the first moment we treat the little robot as human, even before the script reveals his intelligence and emotions. Second, research suggests that people place more trust in robots that use non-verbal communication channels, and accept them as more human. For example, in a simulation of the "Prisoner's Dilemma" (a game that rewards players for cooperating but punishes them if they put their trust in a non-cooperating player), researchers measured the activity of brain areas linked to trying to anticipate the actions of another person.

These areas of the subjects' brains were found to be more active the more channels through which the opponent was perceived (the variations were: an opponent whose behavior was observed only through "yes" or "no" decisions presented on a computer monitor; robotic arms pressing computer keys; a robot with a body and a face; and a human opponent). Another study invited volunteers to talk about healthy living (exercise, flossing, avoiding fatty food) with a robot sitting in front of them, with a video of the same robot on a computer monitor, or with a static image of the robot in which only the lips moved. The volunteers rated the physically present robot as more reliable, sociable, kind, responsive and respectful of others. As expected, second place went to the robot viewed on the computer screen. Furthermore, the volunteers who met the physical robot expressed a greater willingness to implement the robot's health recommendations.

According to Dr. Amelia Ortiz Nicolás of the University of the Basque Country, the successful transmission of a message depends 7% on the words included in the message, 38% on the tone of voice, and 55% on facial expressions. One of the projects she led dealt with computer-aided teaching. The same material was presented to students in three forms: as text; explained by a 3D computerized character; and explained by the same character with the addition of emotions, that is, with the tone of voice and facial expressions changing according to the section of material being studied. Students who studied by the second method answered correctly 10% more questions than students who studied by the first method, and switching to the third method improved performance still further.

The transition from virtual characters created in 3D animation on a computer screen to physical characters present in the real world greatly increases the effect. Robotic heads that combine convincing facial movement with natural-language conversation, such as those from Hanson Robotics, create a particularly convincing impression. This impression is also helped by non-verbal messages, such as accompanying the robots' answers with facial expressions, pausing in the appropriate places and varying the tone of speech.

Sensors that help teachers distinguish between alert students and students who are too tired, bored or frustrated to learn effectively. Illustration: Clipart | CDBank
Emotion engineering

The developments mentioned so far address the emotional response of humans to the behavior of the robot. To integrate into society, the robot also needs to respond correctly to the behavior of humans. For example, if a person reacts with fear to the approach of a robot, the robot should slow down or move away and change its behavior (movement, "facial" expression) so that it appears less threatening; the sketch below illustrates such a rule. Over time, the robot will learn the preferences of the people with whom it comes into contact, link their preferences and reactions to the appropriate contexts, try to predict what they will need, and adjust its actions to help as much as possible and disturb as little as possible. At the very least, this requires correctly identifying people's emotional states. Already in 2005, it was found that navigation systems installed in cars encourage more careful driving if the way they address drivers matches the drivers' mood.
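A minimal sketch of the kind of rule this paragraph describes, assuming a hypothetical emotion detector that returns a label such as "fear" or "interest"; a real system would learn such mappings per person and per context rather than hard-coding them.

```python
# Hypothetical sketch: adjust a robot's approach behaviour to a detected emotion.
# The Robot fields and the emotion labels are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Robot:
    speed: float = 0.5           # fraction of maximum approach speed
    min_distance_m: float = 0.8  # how close the robot allows itself to get
    expression: str = "neutral"  # label driving the "facial" expression


def adapt_to_person(robot: Robot, detected_emotion: str) -> Robot:
    """Map the person's emotional state to less (or more) assertive behaviour."""
    if detected_emotion == "fear":
        robot.speed *= 0.5                                      # slow down
        robot.min_distance_m = max(robot.min_distance_m, 1.5)   # keep more distance
        robot.expression = "friendly"                           # less threatening display
    elif detected_emotion == "interest":
        robot.speed = min(1.0, robot.speed * 1.2)               # approach a little faster
        robot.expression = "attentive"
    else:
        robot.expression = "neutral"
    return robot
```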

Among the many organizations striving to promote this goal is the Feelix Growing consortium, which consists of several universities and commercial companies from six European countries. The project is headed by Dr. Lola Cañamero of the University of Hertfordshire in England. To achieve the goal, four tasks were defined: identifying the challenges and needs arising from the social integration of autonomous robots; exploring the combination of emotions, expressions and interactions required to develop such robots; realizing the required capabilities in at least two robotic systems and drawing conclusions regarding the required technologies; and defining an action plan for creating standards for the design and development of such systems in the future. The project brings together about 25 experts from fields including developmental psychology, comparative psychology, neurology, ethology and robotics. On the project website you can see, among other things, interaction between humans and robotic faces, and a small robot that learns to follow people and tries - like a young puppy - to understand when it should stay close to its "mother" and when to walk at a greater distance behind her.

Emotional interaction is especially important when working with children. One step in this direction is helping human teachers understand the emotional state of students in the classroom. At the University of Massachusetts, software was developed to teach algebra and geometry to high-school students, but when all the students in the class are engrossed in the computer stations in front of them, the teachers have no way of knowing whether the students feel comfortable, or whether they are bored, frustrated or anxious. To this end, the university developed sensor systems that are installed in the computer mouse, in the student's chair and on the wrist. These sensors measure skin conductance to gauge how tense the student feels, and cameras at the computer station decipher the student's facial expression. Together, these help teachers distinguish between alert students and students who are too tired, bored or frustrated to learn effectively (a rough sketch of such a combination appears below). In the future, such systems may become simpler to install and operate, and may not only be connected to the teacher's station but also affect the speed and content of the computerized teaching, and perhaps also the expressed emotions of the virtual character that appears on the computer screen.
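A minimal sketch, under assumed thresholds, of how the sensor channels mentioned here (skin conductance plus a facial-expression label) might be combined into the coarse state shown to the teacher; the numbers and labels are illustrative assumptions, not the actual University of Massachusetts system.

```python
# Hypothetical sketch of combining sensor channels into a coarse student state.
# Thresholds, units and expression labels are assumptions for illustration only.

def classify_student_state(skin_conductance_us: float, facial_expression: str) -> str:
    """Return a rough engagement label for the teacher's display.

    skin_conductance_us  - skin conductance in microsiemens (an arousal proxy)
    facial_expression    - label from the camera, e.g. "smile", "frown", "tired"
    """
    high_arousal = skin_conductance_us > 8.0   # assumed threshold
    low_arousal = skin_conductance_us < 2.0

    if low_arousal and facial_expression in ("tired", "neutral"):
        return "too tired or bored to learn effectively"
    if high_arousal and facial_expression in ("frown", "confused"):
        return "frustrated or anxious"
    if facial_expression in ("smile", "concentrating"):
        return "alert and engaged"
    return "unclear - worth checking in person"
```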

On the robot's body appears a "heart" that beats at a variable rate, thus giving us a clue about its "feelings". Illustration: GettyImages/Imagebank
A beating heart and a smelling robot

To reach a deeper understanding of what humans expect from social robots, a special combination of a puppet and a robot, called the "heart robot", was developed at the Bristol Robotics Laboratory (shared by several British academic institutions). The robot is so named because a "heart" appears on its body, beating at a variable rate and thus giving us another clue about its "feelings" (a minimal sketch of such a heartbeat display appears after this paragraph). Part of the robot is operated by software that allows it to appear fearful or happy, calm or tense, and so on, while other parts are operated by a puppeteer who provides the social interaction that software still cannot create today. The site dedicated to this robot also has a collection of links to other social and emotional robots. Another robot with a heartbeat display is Pomi, a South Korean robot that, unlike most of the other robots mentioned here, will be released commercially in the coming months. Pomi can hold simple conversations, respond to instructions, and is endowed with the senses of hearing, sight and touch. It can also smell, in both senses of the word: it senses odors, and it can emit two odors of its own to express its feelings (the press release linked at the end of the column does not specify what these odors are). Time will tell whether robots that maintain such social interaction will indeed be popular.
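As a toy illustration of such a heartbeat display (not the heart robot's or Pomi's actual software), the snippet below maps an arousal level to a beats-per-minute rate for the visible "heart"; the rate range is an assumption.

```python
# Hypothetical sketch: map an arousal level (0 = calm, 1 = very tense)
# to a visible heartbeat rate. The 50-140 bpm range is an illustrative assumption.

def heart_rate_bpm(arousal: float) -> float:
    arousal = min(1.0, max(0.0, arousal))     # clamp to [0, 1]
    return 50.0 + arousal * (140.0 - 50.0)    # calm -> slow beat, tense -> fast beat


def beat_interval_seconds(arousal: float) -> float:
    """Interval between beats, e.g. for timing an LED or animation pulse."""
    return 60.0 / heart_rate_bpm(arousal)
```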

The developments described here are, of course, very far from giving machines real emotions, and far from resolving the fundamental and moral questions about the feasibility of machines with mental and emotional states. It is possible that these questions will prove even harder than the questions about the feasibility of "real" artificial intelligence. Still, these developments do more than serve important practical purposes: they also show that to function within human social contexts, robots need emotional artificial intelligence just as much as they need classical artificial intelligence. In this, at least, there is a great deal of similarity between robots and humans.

Israel Binyamini works at ClickSoftware developing advanced optimization methods

10 comments

  1. proteasome,

    I share your opinion, but it is clear that it will take decades (or maybe centuries) before we reach such an achievement.

    You may be interested to hear that about a month ago researchers managed to create a robot with a biological brain consisting of neurons. Every time the robot has to decide whether to turn right or left, electrodes send an electrical signal to the neurons. The signal is processed, and the electrodes pick up the neurons' response and translate it into movement. The research is currently focused on testing whether such a robot can learn familiar paths "by itself", because the neurons grow and change during the robot's life.

  2. proteasome
    Your words are humorous and mine are not?
    Interesting.
    Don't just check the price. You should also stock up on some.

    I do not agree with your claim, but since it is not based on anything, I continue to see it as an attempt at prophecy.

  3. My answer to you, commenter 3, is as follows:
    a) These are not "my" tissues or those of any other person, but tissues that will be designed and engineered according to certain principles that exist in nature. Today, various proteins are produced in a way almost identical to the way they are produced in the human body - for example, insulin, which is produced by inserting a human gene into E. coli bacteria. The day is not far off when an entirely artificial unicellular life form will be built from a genome assembled by human hands. I believe that from there it will be possible to continue toward whole tissues, and so on.

    b) Allow me to direct the question back to you. This is an article about emotional intelligence in robots. Suppose we manage to achieve such a situation: is it possible that the robots will rebel against us? Should we stop researching the field under these conditions? I think the answer is no. I believe and hope that the engineers who created such a form of life will know how to deal with the phenomenon you mentioned.

    c) Your question is tendentious. What are "improper needs"? Does an ordinary robot serve improper needs?
    d) Within the framework of the ethics accepted by the scientific community, I do not propose to enslave people, but to produce robots in a different way than is accepted today. The psychological difficulty lies in the fact that these same materials are also what we, as living beings, are made of (so what?).

    Michael - I do not claim to be a prophet. Last time I checked, a sense of humor was free.

  4. Proteasome:
    Your words do not disappoint anyone, because we all (except perhaps you) know that you are not a prophet.
    Did you say something? So you said!

  5. To respondent 2:
     a) Would you be willing for your tissues and organs to be engineered and used for other people's needs?
     b) Is it possible that the cellular memory of a "biological robot" will rebel against its engineers when it becomes clear to it that it is being used to desecrate the sanctity of its origin?
     c) To put it bluntly: would you be willing to be the toilet paper for other people's improper needs (waste needs)?
     d) Where is the limit that these questions mark?
     e) Where is the internal potential that does not exploit another entity, and thus neutralizes the feedback (a negative boomerang)?

  6. I am sorry to disappoint Asimov fans of all kinds, but in my opinion the future lies not in mechanical robots made of metal, but in robots made of living, organic tissue. When we reach a sufficient understanding of biological systems and biochemical pathways, and of ways to imitate them artificially, we will be able to engineer biological "robots" with brains and internal organs for our needs.
    It might even be possible to combine an electronic brain (a processor) with a biological body, and vice versa.
    For this to happen, research in biology, and in genetics in particular, needs to progress for hundreds of years.
