Apple buys emotion-reading AI company Emotient

Ever rolled your eyes over Siri’s sassy answers?

Apple just picked up a startup whose technology might someday let Siri know that you’re one grumpy user.

The company is called Emotient, and its technology uses artificial intelligence to read people’s emotions by analyzing their facial expressions.

Emotient describes itself as “the leader in emotion detection and sentiment analysis based on facial expression” that serves to measure customer reaction to ads, content, products, customer service or sales interactions.

Apple confirmed the purchase, but we don’t actually know how it plans to integrate the emotion-reading AI technology into its services, given that the company put out its standard, no-details, post-acquisition statement. An Apple spokeswoman added that the company “buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans.”

Fittingly enough, Emotient adopted radio silence last week by stripping its website of details about the services it had been selling, according to the Wall Street Journal.

According to The Stack, Emotient puzzles out emotions by analyzing facial expressions and then converting the analysis to quantitative data as consumers react to marketing and media.

That provides a statistically comparable alternative to a standard focus group, where consumer reactions are measured by feelings and impressions – i.e., pretty fuzzy, unquantifiable data.

Emotient’s software reportedly translates analyzed emotions into a quantifiable KPI (key performance indicator) that can be broken down, measured and compared.

According to The Stack, Emotient uses AI software to break down micro-emotions shown on each face in a video frame.

Then, it quantifies the micro-emotions into three indicators: is the subject paying attention to the advertising, are they emotionally engaged, and are they showing a positive or negative emotion?
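The reports don’t describe Emotient’s actual implementation, but the rollup from per-frame emotion scores into those three indicators can be sketched. The frame data, score names and thresholds below are hypothetical, purely for illustration:

```python
# Illustrative sketch only (not Emotient's code): aggregate hypothetical
# per-frame emotion scores into the three indicators described above.

# Hypothetical detector output for one viewer, one score set per video frame.
frames = [
    {"face_detected": True,  "joy": 0.7, "anger": 0.1, "engagement": 0.8},
    {"face_detected": True,  "joy": 0.2, "anger": 0.6, "engagement": 0.5},
    {"face_detected": False, "joy": 0.0, "anger": 0.0, "engagement": 0.0},
]

def quantify(frames):
    # Attention: share of frames in which a face was visible at all.
    watched = [f for f in frames if f["face_detected"]]
    attention = len(watched) / len(frames)

    # Engagement: mean engagement score over the frames being watched.
    engagement = sum(f["engagement"] for f in watched) / len(watched)

    # Sentiment: net positive-minus-negative emotion, reduced to a sign.
    valence = sum(f["joy"] - f["anger"] for f in watched) / len(watched)

    return {
        "attention": attention,
        "engagement": engagement,
        "sentiment": "positive" if valence > 0 else "negative",
    }

print(quantify(frames))
```

The point of such a reduction is the one the article makes: each viewer’s reaction becomes a small set of numbers that can be broken down, measured and compared across an audience, unlike free-form focus-group impressions.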

The technology has made some bristle over privacy concerns.

The WSJ talked to Paul Ekman, an adviser to Emotient and a psychologist who pioneered the study of reading faces to determine emotions. In the 1970s, Ekman catalogued more than 5,000 muscle movements used to read emotions, be it the slightest wrinkling of the nose or lifting of an eyebrow.

Several startups have been using the catalog – called the Facial Action Coding System – in efforts to read emotions with AI.

Last January, Ekman told the WSJ that the power of software to read emotions had him worried about infringements on personal privacy, given that it could lead to people’s emotions being misinterpreted or revealed without their consent.

Ekman last week told the paper that he still has these concerns and that he’s urged Emotient to warn people if it’s scanning their faces in public places, but that the company hasn’t agreed to do so.

An Emotient spokeswoman reportedly said that the company doesn’t reveal information about individuals, only aggregate data.

Emotient also received a patent in September for anonymizing individual faces before they’re processed for emotions and reactions, thereby protecting consumer privacy.

The patent covers “sophisticated pixelating or blurring of an individual’s features while retaining the expressive information before transmitting, so identity is never seen or captured, let alone transmitted.”
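The patent text isn’t public code, but the basic idea of pixelating a face region so that identity is obscured while coarse expressive structure survives can be sketched with simple block averaging. The grayscale values and block size below are made up for illustration:

```python
# Illustrative sketch only (not the patented method): replace each
# block x block tile of a grayscale image with its average value, so fine
# identifying detail is destroyed while broad structure remains.

def pixelate(image, block=2):
    """Pixelate a grayscale image given as a list of rows of int values."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Gather the tile's pixels, clipping at the image edges.
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            # Overwrite every pixel in the tile with the average.
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

# A tiny 4x4 "face" region, purely for demonstration.
face = [
    [10, 20, 30, 40],
    [20, 30, 40, 50],
    [90, 80, 70, 60],
    [80, 70, 60, 50],
]
print(pixelate(face))
```

As the patent language suggests, the blurring would be applied before transmission, so a raw, identifiable image never leaves the capture device.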

Dr. Javier Movellan, a co-founder of Emotient and one of the patent’s authors, said that the company isn’t interested in identifying people; rather, it just wants to identify what they’re feeling:

We do not want to recognize who is watching. All we care about is what they are watching and how they feel about it. Our goal is to provide avenues for people to express their feelings about content in an anonymous, effortless, and scalable manner, because there is enormous value to how people respond to what they're watching.

This isn’t the first time Apple’s shown interest in reading people’s feelings.

In February 2014, Apple was seeking a patent on something it called “Inferring user mood based on user and group characteristic data” that its application said would figure out how people are feeling and “… deliver content that is selected, at least in part, based on the inferred mood.”

Microsoft’s also on the mood-sniffing path.

In November, Microsoft released to developers a public beta of Project Oxford, a set of cloud-based algorithms that identify facial expressions in images and recognize emotions.
