articles/cognitive-services/Emotion/Home.md
@@ -11,25 +11,23 @@ ms.date: 02/06/2017
 ms.author: anroth
 ---
 
-# Emotion API
+# What is Emotion API?
 
 > [!IMPORTANT]
-> Video API Preview will end on October 30th, 2017. Try the new [Video Indexer API Preview](https://azure.microsoft.com/services/cognitive-services/video-indexer/) to easily extract insights from
-videos and to enhance content discovery experiences, such as search results, by detecting spoken words, faces, characters, and emotions. [Learn more](https://docs.microsoft.com/azure/cognitive-services/video-indexer/video-indexer-overview).
+> Emotion API was deprecated on October 30, 2017. The functionality is now part of [Face API](https://docs.microsoft.com/en-us/azure/cognitive-services/face/).
 
-Welcome to the Microsoft Emotion API, which allows you to build more personalized apps with Microsoft’s cutting edge cloud-based emotion recognition algorithm.
+Welcome to the Microsoft Emotion API, which allows you to build more personalized apps with Microsoft’s cloud-based emotion recognition algorithm.
 
 ### Emotion Recognition
 
-The Emotion API beta takes an image as an input, and returns the confidence across a set of emotions for each face in the image, as well as bounding box for the face, from the Face API. The emotions detected are happiness, sadness, surprise, anger, fear, contempt, disgust or neutral. These emotions are communicated cross-culturally and universally via the same basic facial expressions, where are identified by Emotion API.
+The Emotion API beta takes an image as an input and returns the confidence across a set of emotions for each face in the image, as well as a bounding box for the face from the Face API. The emotions detected are happiness, sadness, surprise, anger, fear, contempt, disgust, or neutral. These emotions are communicated cross-culturally and universally via the same basic facial expressions, which are identified by Emotion API.
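The request and response shape described in this paragraph can be sketched as follows. This is a minimal illustration only, not runnable against the live service (Emotion API is deprecated): the endpoint region, the placeholder key, and the example values in the comments are assumptions based on the historical v1.0 REST contract.

```python
import requests

# Historical Emotion API v1.0 endpoint (deprecated); region and key are placeholders.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
SUBSCRIPTION_KEY = "<your-subscription-key>"

def recognize_emotions(image_url):
    """Submit an image URL and return per-face emotion scores."""
    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    # Each entry pairs a face bounding box with normalized emotion scores, e.g.:
    # [{"faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
    #   "scores": {"anger": 0.0, "contempt": 0.0, "happiness": 0.98, ...}}]
    return response.json()
```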
 
 **Interpreting Results:**
 
 In interpreting results from the Emotion API, the emotion detected should be interpreted as the emotion with the highest score, as scores are normalized to sum to one. Users may choose to set a higher confidence threshold within their application, depending on their needs.
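A minimal sketch of the rule just described: because the scores are normalized to sum to one, the detected emotion is simply the highest-scoring entry, and the threshold is an application-specific choice (the 0.5 default below is illustrative, not from the docs).

```python
def detect_emotion(scores, threshold=0.5):
    """Return the top-scoring emotion, or None if it falls below the threshold."""
    # scores is one face's "scores" object, e.g. {"happiness": 0.98, "neutral": 0.01, ...}
    emotion, confidence = max(scores.items(), key=lambda item: item[1])
    return emotion if confidence >= threshold else None

# Example: detect_emotion({"happiness": 0.98, "anger": 0.01, "neutral": 0.01}) -> "happiness"
```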
 
-For more details about emotion detection, please refer to the API Reference:
+For more information about emotion detection, see the API Reference:
 * Basic: If a user has already called the Face API, they can submit the face rectangle as an input and use the basic tier. [API Reference](https://westus.dev.cognitive.microsoft.com/docs/services/5639d931ca73072154c1ce89/operations/56f23eb019845524ec61c4d7)
 * Standard: If a user does not submit a face rectangle, they should use standard mode. [API Reference](https://westus.dev.cognitive.microsoft.com/docs/services/5639d931ca73072154c1ce89/operations/563b31ea778daf121cc3a5fa)
 
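For the Basic tier above, the face rectangle from a prior Face API call was passed alongside the image. A sketch under the assumption that the historical `faceRectangles` query parameter took the form `left,top,width,height`, with multiple faces separated by semicolons; the endpoint, key, and rectangle values are placeholders.

```python
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"

def recognize_with_rectangle(image_url, rect):
    """Reuse a Face API bounding box (basic tier) instead of re-detecting the face."""
    # Assumed historical format: "left,top,width,height"; multiple faces joined by ";".
    face_rectangles = "{left},{top},{width},{height}".format(**rect)
    response = requests.post(
        ENDPOINT,
        params={"faceRectangles": face_rectangles},
        headers={"Ocp-Apim-Subscription-Key": "<your-subscription-key>",
                 "Content-Type": "application/json"},
        json={"url": image_url},
    )
    response.raise_for_status()
    return response.json()
```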
-*Please note, Emotion API for Video was deprecated on October 30, 2017. For a sample on how to interpret streaming video with Emotion API, please see [How to Analyze Videos in Real Time](https://docs.microsoft.com/azure/cognitive-services/emotion/emotion-api-how-to-topics/howtoanalyzevideo_emotion).*
-
+For a sample on how to interpret streaming video with Emotion API, see [How to Analyze Videos in Real Time](https://docs.microsoft.com/azure/cognitive-services/emotion/emotion-api-how-to-topics/howtoanalyzevideo_emotion).