Welcome to the Microsoft Emotion API, which allows you to build more personalized apps with Microsoft’s cutting-edge cloud-based emotion recognition algorithm.
The Emotion API beta takes an image as an input and returns the confidence across a set of emotions for each face in the image, as well as the bounding box for the face, from the Face API. The emotions detected are happiness, sadness, surprise, anger, fear, contempt, disgust, and neutral. These emotions are communicated cross-culturally and universally via the same basic facial expressions, which are what the Emotion API identifies.
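A minimal sketch of one way to call the recognize endpoint over REST is shown below; the region in the endpoint URL, the subscription key, and the image URL are placeholders you would replace with your own values.

```python
# Sketch: send an image URL to the Emotion API and print each face's results.
import requests

SUBSCRIPTION_KEY = "<your-subscription-key>"  # placeholder: key from your Azure subscription
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"  # region is an assumption

headers = {
    "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    "Content-Type": "application/json",
}
body = {"url": "https://example.com/photo-with-faces.jpg"}  # placeholder image URL

response = requests.post(ENDPOINT, headers=headers, json=body)
response.raise_for_status()

# The response is a JSON array with one entry per detected face, each containing
# a face bounding box and a set of per-emotion confidence scores.
for face in response.json():
    print(face["faceRectangle"], face["scores"])
```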
Interpreting Results:
When interpreting results from the Emotion API, the detected emotion should be taken to be the one with the highest score, because the scores are normalized to sum to one. Users may choose to set a higher confidence threshold within their application, depending on their needs.
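The sketch below illustrates this interpretation for a single face: pick the emotion with the highest normalized score, and optionally require it to exceed a confidence threshold of your choosing (the 0.5 default and the example score values are illustrative assumptions, not part of the API).

```python
# Interpret one face's emotion scores: return the top-scoring emotion,
# or None if its score falls below the chosen confidence threshold.
def interpret_scores(scores, threshold=0.5):
    top_emotion, top_score = max(scores.items(), key=lambda item: item[1])
    return top_emotion if top_score >= threshold else None

# Example with normalized scores that sum to one (values are made up):
scores = {"happiness": 0.92, "neutral": 0.05, "surprise": 0.02, "sadness": 0.01}
print(interpret_scores(scores))        # -> "happiness"
print(interpret_scores(scores, 0.95))  # -> None (top score is below this threshold)
```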
For more details about emotion detection, please refer to the API Reference:
Please note that the Emotion API for Video was deprecated on October 30, 2017. For a sample of how to analyze streaming video with the Emotion API, please see How to Analyze Videos in Real Time.