Microsoft’s training facial recognition apps to recognize our emotions


This just in from the “All the better to target ads at you” department:

Microsoft last month released to developers a public beta of Project Oxford, a set of cloud-based algorithms that go beyond facial recognition to identify facial expressions in images and recognize emotions.

Wait, does this news make your face twist in anger, disgust, contempt, fear or surprise?

Great! Microsoft’s onto those emotions!

Eventually, there’ll very likely be an app that knows you’re growling at your phone: just send in an image!

According to the Project Oxford site, Microsoft’s so-called Emotion API takes an image as input and returns the confidence level across a set of emotions for each face in the image, as well as a bounding box for the face, using the company’s Face API.

Here’s what it can detect:

  • Anger
  • Contempt
  • Disgust
  • Fear
  • Happiness
  • Neutral
  • Sadness
  • Surprise
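Based on the description above, the Emotion API returns, for each face it detects, a bounding box plus a confidence score for each of those eight emotions. Here's a minimal Python sketch of what handling such a response might look like; the JSON below is an illustrative example shaped like that description, not real API output:

```python
import json

# Illustrative response: one entry per detected face, with a bounding
# box ("faceRectangle") and a per-emotion confidence score ("scores").
sample_response = json.loads("""
[
  {
    "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
    "scores": {
      "anger": 0.0001, "contempt": 0.0002, "disgust": 0.0001,
      "fear": 0.0001, "happiness": 0.9983, "neutral": 0.0009,
      "sadness": 0.0002, "surprise": 0.0001
    }
  }
]
""")

def dominant_emotion(face):
    """Return the emotion with the highest confidence score for one face."""
    scores = face["scores"]
    return max(scores, key=scores.get)

for face in sample_response:
    box = face["faceRectangle"]
    print(dominant_emotion(face), "at", box["left"], box["top"])
```

Note how the scores behave like the demo results described below: a grinning face would score close to 1 on "happiness" with only trace readings elsewhere.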

These facial expressions are universal.

You might not speak French, for example, but you can be disgusted in any language: Microsoft says that these emotions are “cross-culturally and universally communicated” with particular facial expressions.

In its emotion recognition demo, you can see that Microsoft's managed to train its artificial intelligence algorithms to pick up on happiness quite well: many of the photos of grinning people were rated with a "happiness" level of 1 and a minuscule, if any, reading on the other emotions, with the occasional tiny possibility of "neutral" or another emotion thrown in.

Chris Bishop, head of Microsoft Research Cambridge, in November demonstrated the technology at Microsoft’s Future Decoded conference on the future of business and technology.

Microsoft said that its artificial intelligence can be trained for facial and emotion recognition on sets of images:

The system can learn to recognize certain traits from a training set of pictures it receives, and then it can apply that information to identify facial features in new pictures it sees.

In fact, earlier that week, before it released the Emotion API, Microsoft also released MyMoustache, which uses the technology to recognize and rate facial hair, in honor of Movember, the moustache-growing fundraising effort for men's health.

OK, moustache rating. That’s fun.

But why emotions, you ask? Why in the world would you want apps that recognize your emotions?

Well, when it comes to recognizing emotions, you’re talking marketing gold.

Ryan Galgon, a senior program manager within Microsoft’s Technology and Research group, suggests that developers might want to use the tools to create systems that marketers can use to “gauge people’s reaction to a store display, movie or food.”

Another option he suggested: developers might find an emotion recognition API valuable for creating a consumer tool, such as a messaging app, that offers up different options based on what emotion it recognizes in a photo.

I must confess, my imagination is running a little wild here. I can’t help but wonder, what might facial recognition kingpins like Facebook get up to with this technology?

What new things could be done by surveillance agencies when it comes to reading emotions in photos?

Your thoughts are welcome in the comments section below.

Image of set of emojis courtesy of Shutterstock.com