Robots to decipher our emotions?


If it weren’t for its grayish skin and the transparent plexiglass in place of hair, Ameca could be your neighbor. When switched on, the robot created by the British company Engineered Arts has the same haggard look as a human waking up. The same widening of the eyes at a surprise. The same slight droop at the corner of the lips at a misunderstanding. In short, the expressions of humanity rendered in silicone.

→ READ. When our emotions govern us: our special report

To reproduce these attitudes, the company scans real humans. “We observe exactly how the skin deforms, we recreate this movement with mechanical elements, and we add a very sophisticated silicone skin,” describes Will Jackson, head of Engineered Arts. Manufacturing, halfway between artistic sculpture and high technology, takes time. But the result is there: the robot is considered one of the most humanoid today. A surprise for Will Jackson, who points out that his creation is plainly gray, with all its components visible.

Humans attached to their machines

Without going as far as these ultra-realistic creatures, research centers and digital giants dissect our emotions to make them understandable by machines, and to program the responses of the voice assistants that abound in our daily lives. “The idea is not to recreate the human being but to have something that ‘understands’ us in order to best meet our needs,” explains Salima Hassas, specialist in artificial intelligence and autonomous agents. “The machine does not feel anything; it is a computer program.” And this even though we tend to attribute intentions and feelings to objects, from the car to the robot vacuum cleaner. “The human is irrepressibly social, and the engineer cannot foresee this attachment,” says Véronique Aubergé, researcher in social robotics in Grenoble.

→ COLUMN. Robots will be endowed with a feigned humanity

In the living room, if Google’s connected speaker replies, pained, that “that breaks my code” when you tell it it’s a pain, it’s just a programmed response. Voice assistants don’t always get it right. The same speaker suggests a walk in nature when you say you are “very angry,” even if your tone betrays no anger at all. All these exchanges go through the words, not overall behavior. How do you recognize annoyance in a click of the tongue, or happiness in a smile?
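The paragraph above can be caricatured in a few lines of code. This is a purely hypothetical sketch (the keywords and replies are invented, not Google’s actual logic): an assistant that picks a canned reply by matching words alone, completely blind to the speaker’s tone.

```python
# Hypothetical sketch: a word-matching assistant. It sees only the words,
# never the intonation -- the limitation the article describes.
REPLIES = {
    "angry": "Would a walk in nature help?",
    "pain": "That breaks my code.",
}

def respond(utterance: str) -> str:
    """Return a canned reply for the first keyword found, if any."""
    lowered = utterance.lower()
    for keyword, reply in REPLIES.items():
        if keyword in lowered:
            return reply
    return "I'm not sure how you feel."

# The same words always get the same reply, whatever the actual tone:
print(respond("I'm VERY ANGRY"))   # a calm voice would get this reply too
```

Shouted or whispered, sarcastic or sincere, the input text is identical, so the response is too.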

Databases of emotions played by actors

To train algorithms to recognize expressed emotions, researchers use huge databases of videos of actors performing fear, anger, joy… “But these are emotions that are acted, not felt in real life,” says Catherine Pelachaud, research director and expert in human-machine interaction. “Yet between acted emotions and felt emotions, the bodily expressions are not the same; the muscles involved are not the same.”
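The pipeline described above, learning from labeled examples and then matching new inputs to the nearest learned pattern, can be sketched in miniature. Everything here is invented for illustration: the three facial measurements, the numbers, and the nearest-centroid approach stand in for the far richer features and models real systems use.

```python
# Toy sketch of emotion recognition trained on acted samples.
# Features and values are hypothetical: each tuple is (brow raise,
# mouth-corner pull, eye widening) measured on one acted video.
from math import dist

acted_samples = {
    "joy":      [(0.1, 0.9, 0.3), (0.2, 0.8, 0.4)],
    "surprise": [(0.8, 0.2, 0.9), (0.9, 0.3, 0.8)],
    "anger":    [(0.7, 0.1, 0.2), (0.6, 0.0, 0.3)],
}

def centroid(points):
    """Average each feature across one label's acted training samples."""
    return tuple(sum(vals) / len(vals) for vals in zip(*points))

centroids = {label: centroid(pts) for label, pts in acted_samples.items()}

def classify(features):
    """Label the input with the emotion whose acted centroid is closest."""
    return min(centroids, key=lambda label: dist(features, centroids[label]))

print(classify((0.15, 0.85, 0.35)))  # near the acted "joy" samples
```

The flaw Catherine Pelachaud points to is visible in the structure itself: the centroids are built entirely from acted expressions, so a genuinely felt emotion, produced by different muscles, may land far from every centroid and still be forced into the nearest acted category.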

→ LARGE FORMAT. Robots and emotions: when machines turn melancholic

Another approach is to digitally model faces with different expressions. You then simply ask humans what emotion they see in them, and obtain a sort of “digital mask” corresponding to each feeling. “By presenting the same expression to different people, we can take cultural differences in the perception of emotions into account,” adds Catherine Pelachaud. Because not everyone expresses their feelings in the same way, or with the same intensity. “Expression depends on the individual and the situation,” notes Salima Hassas. “The more we know about our own emotions, the more accurately algorithms will be able to recognize them.”

A lack of context

While there are “basic” emotions that we feel strongly, such as anger, joy or fear, they occur rarely. “Most everyday emotions are subtler: disappointment, stress, optimism, confidence and others,” observes Véronique Aubergé. Emotions that humans themselves sometimes struggle to read in their interlocutors, caught between sarcasm and seriousness! And unlike machines, humans benefit from context.

The machine doesn’t guess that your day went badly or that you just had the scare of your life. It merely listens to your intonation and, if it is equipped with a camera, observes your face. “The algorithm knows how to recognize the emotion, but not what caused it,” summarizes Nicolas Spatola, specialist in social and cognitive psychology. Especially since sometimes the machine itself is what causes the feeling. “We still take too little account of how the very presence of a robot or a digital avatar modifies the behavior of the human user,” the researcher continues. “Interacting with it can arouse fear, curiosity, joy…”

The “uncanny valley” of inflexible robots

For decades, roboticists have debated the concept of the “uncanny valley”: the closer you try to make a creature resemble a human being, the more grotesque or frightening it appears. “In virtual settings, we know how to make avatars that no longer trigger these reactions, but when it comes to a physical robot, the results are still perfectible,” concedes Catherine Pelachaud.

→ INTERVIEW. Georges Vigarello: “Emotions take on an unprecedented place in history”

“Simply picking up an object from the ground can be done in countless ways, and the movement chosen reflects the person’s state,” explains Philippe Souères, specialist in the movements of anthropomorphic systems. “But where the human body has some 300 joints, the most advanced humanoids in our laboratory have about 40. They do not have a human’s possibilities of expression.” Hence the unsettling feeling of facing an inflexible robot, in its movements as in its reactions.
