Can we really trust emotionally intelligent AI?

Some companies are now beginning to design and build "emotion AI", but can we trust tech with our emotions?

In today’s technological landscape, artificial intelligence (AI) is becoming increasingly widespread. Some companies are even beginning to introduce emotionally intelligent AI, but it is debatable whether the technology can be trusted.

What is emotionally intelligent AI?

Emotionally intelligent AI is a subset of AI that comprehends, measures, simulates, and reacts to human emotions. Some also refer to the technology as affective computing, artificial emotional intelligence, or simply emotion AI.

It is no secret that AI continues to drive our decision-making in both a professional and personal capacity. However, some engineers are starting to develop emotionally intelligent AI in order to automate emotional tasks.

Tech giants such as Microsoft, IBM, and Amazon are now beginning to capitalise on this emerging trend. By developing emotion recognition algorithms for facial analysis, these companies are hoping to decipher and predict how people feel.

Despite its promises, new research commissioned by the Association for Psychological Science suggests that inferring human emotion from facial movements lacks scientific justification. In fact, the review found that people express emotions in a large variety of ways across various cultures and situations.

How accurate is emotional AI?

It is common knowledge that an emotional state often correlates with a person’s facial movements. However, this common view fails to recognise that people communicate anger, disgust, fear, happiness, and sadness in substantially different ways.

As the review observes, there is an urgent need for research that examines how people actually move their faces to express emotions and social information in a variety of contexts. It is also necessary to study the mechanisms by which people perceive emotion in one another.

With this in mind, it is contentious to claim that facial recognition technology is able to determine emotional states in humans. It is thus evident that companies offering this supposed “emotional AI” are misleading their customers.

So, how can companies offer “emotional AI”?

At present, many consumers of emotion AI believe that AI companies have adequately answered their questions about emotional expressions. Nevertheless, the research suggests that this is just not the case.

Tech companies such as Amazon and Microsoft are now spending millions of dollars on research to build devices that detect emotion. However, it is more accurate to say that this technology merely detects facial movements, rather than emotions.

For example, Amazon is now exploring virtual human technology to educate children, train physicians and military personnel, and even infer psychological disorders. Currently, however, the “science of emotion” does not support these initiatives.

Indeed, the research insists that emotional expressions are more variable and context-dependent than originally assumed. As a result, the scientific evidence offers very little actionable guidance to consumers.

Are tech companies asking the wrong question?

Above all, the review highlights that there is little scientific evidence surrounding how and why certain facial movements express instances of emotion. This is particularly true at a level of detail sufficient for the conclusions to support important, real-world applications.

More generally, however, the review indicates that tech companies may be asking a question that is fundamentally wrong. In effect, efforts to “read” internal states from an analysis of facial movements alone are incomplete or lack validity.

It is vital to consider various aspects of context in order to investigate the expression and perception of emotions. At present, it is not possible to reach valid conclusions about how people feel based on facial recognition technology alone.

Do people value emotionally intelligent technology? Richard Orme, CTO of Photobox Group, offered his insights.