articles/azure-video-indexer/emotions-detection.md
Lines changed: 3 additions & 8 deletions
@@ -6,15 +6,15 @@ author: juliako
ms.author: juliako
manager: femila
ms.service: azure-video-indexer
-ms.date: 06/15/2022
+ms.date: 04/17/2023
ms.topic: article
---

# Emotions detection

-Emotion detection is an Azure Video Indexer AI feature that automatically detects emotions in a video's transcript lines. Each sentence can be detected as "Anger", "Fear", "Joy", "Neutral", or "Sad". The model works on text only (labeling emotions in video transcripts). This model doesn't infer the emotional state of people and may not perform well where the input is ambiguous or unclear, such as sarcastic remarks. Thus, the model shouldn't be used for purposes such as assessing employee performance or the emotional state of a person.
+Emotions detection is an Azure Video Indexer AI feature that automatically detects emotions in a video's transcript lines. Each sentence can be detected as "Anger", "Fear", "Joy", or "Sad", or as none of the above if no emotion was detected.

-The model doesn't have context of the input data, which can affect its accuracy. To increase accuracy, it's recommended that the input data be in a clear and unambiguous format.
+The model works on text only (labeling emotions in video transcripts). This model doesn't infer the emotional state of people and may not perform well where the input is ambiguous or unclear, such as sarcastic remarks. Thus, the model shouldn't be used for purposes such as assessing employee performance or the emotional state of a person.

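Once a video is indexed, the detected emotions appear in its insights. The sketch below shows how such results might be read; the JSON shape, field names, and time formats here are illustrative assumptions for this example, not the service's documented schema:

```python
import json

# Illustrative insights fragment -- the exact schema of your Video Indexer
# output may differ; treat the field names below as assumptions.
insights = json.loads("""
{
  "emotions": [
    {"type": "Joy", "instances": [{"start": "0:00:01", "end": "0:00:04"}]},
    {"type": "Sad", "instances": [{"start": "0:01:10", "end": "0:01:12"}]}
  ]
}
""")

# Collect (emotion, start, end) tuples for each detected instance.
detected = [
    (emotion["type"], inst["start"], inst["end"])
    for emotion in insights["emotions"]
    for inst in emotion["instances"]
]

for emotion_type, start, end in detected:
    print(f"{emotion_type}: {start} - {end}")
```

Each emotion entry groups its occurrences as time-ranged instances, so a single emotion detected in several sentences appears once with multiple instances.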
## Prerequisites
@@ -68,11 +68,6 @@ During the emotions detection procedure, the transcript of the video is processe
|Emotions detection |Each sentence is sent to the emotions detection model. The model produces a confidence level for each emotion. If the confidence level exceeds a specific threshold and there is no ambiguity between positive and negative emotions, the emotion is detected. Otherwise, the sentence is labeled as neutral.|
|Confidence level |The estimated confidence level of each detected emotion is calculated as a value in the range of 0 to 1. The confidence score represents the certainty in the accuracy of the result. For example, an 82% certainty is represented as a 0.82 score. |
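The labeling rule in the table above can be sketched as follows. This is a minimal illustration, assuming a hypothetical threshold value and score-dictionary shape; it is not Azure Video Indexer's actual implementation:

```python
# Hypothetical confidence threshold; the service's real threshold is not documented here.
THRESHOLD = 0.5

NEGATIVE = {"Anger", "Fear", "Sad"}
POSITIVE = {"Joy"}

def label_sentence(scores: dict[str, float]) -> str:
    """Pick an emotion label from per-emotion confidence scores.

    An emotion is detected only if its confidence exceeds the threshold
    and there is no ambiguity between positive and negative emotions;
    otherwise the sentence is labeled "Neutral".
    """
    above = {emotion: s for emotion, s in scores.items() if s > THRESHOLD}
    if not above:
        return "Neutral"
    # Ambiguous: both a positive and a negative emotion passed the threshold.
    if above.keys() & POSITIVE and above.keys() & NEGATIVE:
        return "Neutral"
    return max(above, key=above.get)

print(label_sentence({"Anger": 0.1, "Fear": 0.2, "Joy": 0.82, "Sad": 0.05}))  # Joy
print(label_sentence({"Anger": 0.6, "Fear": 0.1, "Joy": 0.7, "Sad": 0.2}))    # Neutral (ambiguous)
```

The second call returns "Neutral" even though two emotions exceed the threshold, because one is positive and one is negative, which is the ambiguity case described in the table.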
-## Example use cases
-
-* Personalization of keywords to match customer interests, for example websites about England posting promotions about English movies or festivals.
-* Deep-searching archives for insights on specific keywords to create feature stories about companies, personas, or technologies, for example by a news agency.
-
## Considerations and limitations when choosing a use case
Below are some considerations to keep in mind when using emotions detection: