Devlin writes: "Artificial intelligence systems that companies claim can 'read' facial expressions are based on outdated science and risk being unreliable and discriminatory, one of the world's leading experts on the psychology of emotion has warned."

A series of companies have been deploying facial recognition technology that claims to detect emotion. (photo: izusek/Getty Images/iStockphoto)


AI Systems Claiming to 'Read' Emotions Pose Discrimination Risks

By Hannah Devlin, Guardian UK

17 February 2020


Expert says technology being deployed is based on outdated science and is therefore unreliable

Artificial intelligence (AI) systems that companies claim can “read” facial expressions are based on outdated science and risk being unreliable and discriminatory, one of the world’s leading experts on the psychology of emotion has warned.

Lisa Feldman Barrett, professor of psychology at Northeastern University, said that such technologies appear to disregard a growing body of evidence undermining the notion that the basic facial expressions are universal across cultures. As a result, such technologies – some of which are already being deployed in real-world settings – run the risk of being unreliable or discriminatory, she said.

“I don’t know how companies can continue to justify what they’re doing when it’s really clear what the evidence is,” she said. “There are some companies that just continue to claim things that can’t possibly be true.”

Her warning comes as such systems are being rolled out for a growing number of applications. In October, Unilever claimed that it had saved 100,000 hours of human recruitment time last year by deploying such software to analyse video interviews.

The AI system, developed by the company HireVue, scans candidates’ facial expressions, body language and word choice, and cross-references them with traits that are considered to be correlated with job success.

Amazon claims its own facial recognition system, Rekognition, can detect seven basic emotions – happiness, sadness, anger, surprise, disgust, calmness and confusion. The EU is reported to be trialling software which purportedly can detect deception through an analysis of micro-expressions in an attempt to bolster border security.
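To make the claim concrete, the sketch below shows what querying such a system can look like in practice: it sends a photo to Amazon Rekognition’s face-analysis endpoint through the AWS SDK for Python (boto3) and prints the emotion labels and confidence scores that come back. The image file name and region are placeholders, valid AWS credentials are assumed to be configured, and the labels it prints are precisely the kind of output Feldman Barrett argues should not be treated as a readout of how someone actually feels.

```python
# A minimal sketch of querying Amazon Rekognition for its emotion labels,
# using the AWS SDK for Python (boto3). The image path and region are
# placeholders; AWS credentials are assumed to be configured locally.
import boto3


def detect_emotions(image_path: str, region: str = "us-east-1") -> None:
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        image_bytes = f.read()

    # Attributes=['ALL'] asks Rekognition to return the full set of face
    # attributes, including a per-face list of emotion labels with
    # confidence scores.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        # Each detected face carries a list of emotion labels,
        # e.g. {'Type': 'CALM', 'Confidence': 97.3}.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type']}: {emotion['Confidence']:.1f}%")


if __name__ == "__main__":
    detect_emotions("candidate_photo.jpg")  # hypothetical file name
```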

“Based on the published scientific evidence, our judgment is that [these technologies] shouldn’t be rolled out and used to make consequential decisions about people’s lives,” said Feldman Barrett.

Speaking ahead of a talk at the American Association for the Advancement of Science’s annual meeting in Seattle, Feldman Barrett said the idea of universal facial expressions for happiness, sadness, fear, anger, surprise and disgust had gained traction in the 1960s after an American psychologist, Paul Ekman, conducted research in Papua New Guinea showing that members of an isolated tribe gave similar answers to Americans when asked to match photographs of people displaying facial expressions with different scenarios, such as “Bobby’s dog has died”.

However, a growing body of evidence has shown that beyond these basic stereotypes there is a huge range in how people express emotion, both across and within cultures.

In western cultures, for instance, people have been found to scowl only about 30% of the time when they’re angry, she said, meaning they move their faces in other ways about 70% of the time.

“There is low reliability,” Feldman Barrett said. “And people often scowl when they’re not angry. That’s what we’d call low specificity. People scowl when they’re concentrating really hard, when you tell a bad joke, when they have gas.”
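A back-of-the-envelope calculation shows why low reliability and low specificity compound. Only the 30% figure comes from Feldman Barrett’s remarks; the rate at which people scowl when they are not angry and the base rate of anger used below are illustrative assumptions. Even so, under these numbers a detected scowl would correspond to genuine anger only about one time in seven.

```python
# A toy illustration of why low reliability and low specificity matter.
# Only the 30% figure (people scowl in roughly 30% of the moments they are
# angry) comes from the article; the other two numbers are assumptions made
# purely for the sake of the arithmetic.
p_scowl_given_angry = 0.30      # reliability figure cited in the article
p_scowl_given_not_angry = 0.10  # assumed false-alarm rate (concentration, bad jokes, gas)
p_angry = 0.05                  # assumed base rate of anger

# Bayes' rule: probability someone is actually angry given a detected scowl.
p_scowl = (p_scowl_given_angry * p_angry
           + p_scowl_given_not_angry * (1 - p_angry))
p_angry_given_scowl = p_scowl_given_angry * p_angry / p_scowl

print(f"P(angry | scowl) = {p_angry_given_scowl:.2f}")  # ~0.14 under these assumptions
```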

The expression that is supposed to be universal for fear is the stereotype for a threat or anger face in Malaysia, she said. There are also wide variations within cultures in how people express emotion, while context, such as body language and who a person is talking to, is critical.

“AI is largely being trained on the assumption that everyone expresses emotion in the same way,” she said. “There’s very powerful technology being used to answer very simplistic questions.”
