Can AI Understand Your Emotions?

A hyperbole-free assessment of the state of Emotional AI

Jonathan Cook
Good Audience

--

This article is the first in a series assessing the condition of sentiment analysis, also increasingly referred to as Emotional AI. These terms refer to the attempt to create machine learning systems that can detect and understand human emotion. This series of articles is just the beginning of what I hope will become a larger response to the overheated promotion of sentiment analysis.

Also see article 2 in this series — Is Emotional AI Ready For Prime Time? — as well as article 3, What Emotional AI Fails To Grasp About Emotion, article 4, It Isn’t Emotional AI. It’s Psychopathic AI, article 5, Should AI Rid Humanity of Emotion, article 6, The Mythology of Emotional AI, and finally, AI’s Missing Companion.

Recently, a number of astonishing claims have been made about the abilities of Emotional AI, often by companies selling their own Emotional AI services. A frank, critical assessment of these claims has been lacking, but the need has never been more clear. Over the last year, the world has seen how dramatically damaging digital technology can be when Silicon Valley’s optimistic sales pitches are allowed to overrule responsible skepticism. Facebook’s “Move Fast And Break Things” philosophy ended up breaking American democracy and international relations. The time for wide-eyed celebration of all things digital has ended.

My aim is not to undermine the work of those who are developing systems of sentiment analysis, but to temper the drive forward with caution, so that Emotional AI, when it is sufficiently developed, can be deployed in a manner that is socially beneficial, rather than destructively disruptive.

Between the extremism of the luddites and of those who worship at the altar of digital technology as a new divinity lies a middle path of rational scrutiny. That's what I'm aiming for in these articles.

In this first article, I’ll start out with an acknowledgement of the value that sentiment analysis brings to business and consumers alike. Then, I’ll shift to a quick review of the hyperbole about Emotional AI.

The Merit of Sentiment Analysis

In a recent article for the Harvard Business Review, Sophie Kleber, Executive Creative Director at Huge, summarized the ways in which artificial intelligence is “getting more emotional”. She identified:

  1. Systems that use machine learning to provide feedback on people’s emotional states, so that individuals or organizations can change in response
  2. Systems that use measurements of people’s emotional states to adjust the operation of a product or service
  3. Systems that imitate emotionally-informed human-to-human interactions

The technical accomplishments of the teams working on sentiment analysis should not be discounted. Though human infants spontaneously learn to recognize and respond to other people’s emotional states within a matter of months, it has proven extremely difficult, even after years of effort, to teach a machine to interpret facial expressions with any reliability. The intellectual achievements of sentiment analysis teams have been remarkable.

The very fact that tech companies are recognizing the importance of emotion can be interpreted as a sign of cultural improvement. Too often, businesses have thought of human beings as rational decision makers, little more than information processing systems.

Emotional AI could be a gateway for companies to enter into a new world of intimacy. A quick digital scan for physical signs of emotion could be a positive first step into a deeper, mutually satisfying interaction between corporation and consumer. Sentiment analysis might be used to give companies clues about ways to begin more emotionally authentic human-to-human communications, or provide material to start workplace conversations about how to design professional settings with more empathy.

The development of artificial systems for the analysis of human emotion could be used to provoke a new flourishing of artistic creativity in what have been cold and dreary channels of commerce. Emotional AI could also prevent the kind of unintentional, but systematic, suffering that now typically occurs in the places where commercial institutions meet human individuals.

The hitch is that these beneficial impacts are just one possible outcome of the development of Emotional AI. The good intentions of the engineers of sentiment analysis systems could lead us into some very dark places.

The Hype of Emotional AI

Whether the future impact of sentiment analysis will be positive or negative, one thing about its present condition is quite clear: It’s rife with outrageous exaggeration.

The hyperbolic state of Emotional AI is best represented by the opening quotation in Kleber’s article: A prediction from the strategic consulting agency Gartner claiming that “By 2022, your personal device will know more about your emotional state than your own family.” This particular claim has been repeated so often by people writing about sentiment analysis that it has become accepted by many as fact.

Few have bothered to ask critical questions about Gartner’s claim, however. As authoritative as Gartner’s prediction sounds, the truth is that the agency actually has no idea what the state of Emotional AI technology will be four years from now. Gartner is making a wild guess and asserting it as fact.

Gartner claims that its analysts produce research that is “independent, objective, accurate and rigorous,” but there is no research method in existence that can reliably predict the technological capability for emotional knowledge four years into the future, relative to human empathy. That’s because:

  • Nobody has been to the future
  • There is no objective, comprehensive, and reliable method to measure how much any piece of technology can “know” about human emotion
  • We lack a coherent, widely-agreed upon philosophical framework for what it means for a machine to know about any subjective state of mind
  • We don’t have a consistent conception of what emotion is when experienced by human beings, even among close family members

One Gartner-quoting article in PC Magazine opens with the clickbait headline, Your Phone Might Soon Be Able To Tell If You’re In Love. Of course, the history of human love is defined by the simultaneously comedic and tragic experience of falling in love without realizing that it’s happening. A facial scan with a smartphone can’t resolve a mystery that deep.

That doesn’t stop predictions of emotionally savvy microprocessors from flowing forth, unfortunately. The PC Mag article swoons, “By pulling emotional information from personal devices and bringing that data to the cloud, we’ll be able to achieve what brands have tried but never had the tech to accomplish: 100-percent, unadulterated customer sentiment insight.”

Claims like this are 100-percent, unadulterated something, that’s for sure. We’ve heard this kind of snake oil pitch before. Remember the promises of neuromarketing ten years ago, which told us that soon, we’d gain direct knowledge of consumers’ deepest desires by inserting their heads into medical scanners? The reality has been much less impressive.

Insight about emotion is never 100 percent pure. Translating subjective feelings into digital data, to be subjected to analytical algorithms in the aggregate, is a definitive example of adulteration.

It’s possible that sentiment analysis firms will soon be gathering huge numbers of data points about the physical manifestations of human emotion. Having a huge amount of data, however, isn’t the same thing as knowing something. Given the frequency with which these concepts are being ineptly conflated, we ought to take predictions of the imminent boom in Emotional AI with a grain of salt.

How Does Emotional AI Make You Feel?

As I’ll explain in upcoming articles, present-day Emotional AI consistently falls far short of its big promises. It’s a fair bet, however, that the push to develop adequate systems of machine learning capable of processing information about human emotion will continue. There’s just too much money to be made mucking about in our feelings for the dream of effective sentiment analysis to be ignored.

One thing to keep in mind is that the very effort to convert emotion into digital form will provoke strong emotions in unpredictable ways. Feelings aren’t cold, unresponsive objects that can be measured without consequences. Diving into the world of Emotional AI will be like a descent into the quantum realm, where the very act of being measured changes the reality that’s being measured. Clumsy deployments of sentiment analysis can end up stirring up emotional trouble at a scale that far exceeds the emotional insight they provide.

Wise business leaders will look before they leap into this new digital fad. Perhaps, in the end, a measured application of Emotional AI will pay off. In the meantime, caveat emptor.

Over the next week, this series of articles will be exploring both the vulnerabilities of sentiment analysis and the untapped opportunities to make the technology more effective in the future.

Up next, we’ll give the digital heart an EKG. Can Emotional AI actually do what it claims to do? Then, it’s time to contemplate what we mean by the word ‘emotion’, and what happens when we reduce that concept to something that can be automatically scanned by a digital device.
