articles/azure-video-indexer/ams-deprecation-faq.yml (7 additions, 7 deletions)
@@ -20,42 +20,42 @@ sections:

  - question: Is Azure Video Indexer also being deprecated?
    answer: |
-     No. While Azure Video Indexer currently relies on Azure Media Services for video encoding, Azure Video Indexer is not being deprecated.
+     No. While Azure Video Indexer currently relies on Azure Media Services for video encoding, Azure Video Indexer isn't being deprecated.

  - question: Will I be able to use Video Indexer after Media Services is deprecated?
    answer: |
      Yes, you continue to be able to use Azure Video Indexer.

-   - question: What will I need to do before the deprecation?
+   - question: What do I need to do before the deprecation?
    answer: |
      We try to keep the impact of the Azure Media Services deprecation to a minimum for Azure Video Indexer customers. We'll continue to inform you and post updates to our documentation when changes or actions are required from you.

  - question: Why do I still need to create a Media Services account even when it’s being deprecated?
    answer: |
-     While today you need to create an Azure Media Services account as part of Azure Video Indexer we're working on removing this dependency before Azure Media Services is fully deprecated.
+     While today you need to create an Azure Media Services account as part of Azure Video Indexer, we're working on removing this dependency before Azure Media Services is fully deprecated.

  - question: Will I see a change in features or behavior of Azure Video Indexer?
    answer: |
      Azure Video Indexer uses Azure Media Services for encoding and streaming. We'll replace these capabilities with “like for like” alternatives as much as possible. We'll continue to provide streaming capability for videos hosted through Video Indexer.

  - question: Why should I move to Video Indexer ARM accounts?
    answer: |
-     ARM accounts provide enhancements for management like Monitoring, Identity Access Management, Automation and other AI Analysis capabilities. When transitioning from Media Services Analysis you get more capabilities at a lower price.
+     ARM accounts provide enhancements for management like Monitoring, Identity Access Management, Automation, and other AI Analysis capabilities. When transitioning from Media Services Analysis, you get more capabilities at a lower price.

  - question: Can I use Video Indexer for encoding or streaming?
    answer: |
      No, while we use these functions as part of our product, they aren't provided as standalone functionality to replace Media Services. You can find partner solutions in the Azure Marketplace.

  - question: Will my billing change after the deprecation?
    answer: |
-     Video Indexer will replace the encoding and streaming functionality provided by Media Services. What was previously billed by Media Services will now be billed by Video Indexer directly. You will not pay more for the replacement encoding and packaging service offered by Video Indexer.
+     Video Indexer will replace the encoding and streaming functionality provided by Media Services. What was previously billed by Media Services will now be billed by Video Indexer directly. You won't pay more for the replacement encoding and packaging service offered by Video Indexer.

  - question: Can I use Video Indexer for live Analysis?
    answer: |
-     No, while Media Services provided live transcription on live events, Video Indexer does not provide live streaming or Analysis capability.
+     No, while Media Services provided live transcription on live events, Video Indexer doesn't provide live streaming or Analysis capability.

  - question: What will happen to my Azure Media Services account?
    answer: |
-     Video Indexer will gradually migrate away from Azure Media Services and continue to operate without Media Services. After the migration you can follow Media Services guidance how to migrate/terminate your Media Services account.
+     Video Indexer will gradually migrate away from Azure Media Services and continue to operate without Media Services. After the migration, you can follow the Media Services guidance on how to migrate or terminate your Media Services account.

  - question: What can I do if I have more questions?
articles/azure-video-indexer/clapperboard-metadata.md (16 additions, 16 deletions)
@@ -1,19 +1,19 @@
---
- title: Enable and view a clapperboard with extracted metadata
- description: Learn about how to enable and view a clapperboard with extracted metadata.
+ title: Enable and view a clapper board with extracted metadata
+ description: Learn about how to enable and view a clapper board with extracted metadata.
ms.topic: article
ms.date: 09/20/2022
ms.author: inhenkel
author: IngridAtMicrosoft
---

- # Enable and view a clapperboard with extracted metadata (preview)
+ # Enable and view a clapper board with extracted metadata (preview)

- A clapperboard insight is used to detect clapperboard instances and information written on each. For example, *head* or *tail* (the board is upside-down), *production*, *roll*, *scene*, *take*, *date*, etc. The [clapperboard](https://en.wikipedia.org/wiki/Clapperboard)'s extracted metadata is most useful to customers involved in the movie post-production process.
+ A clapper board insight is used to detect clapper board instances and information written on each. For example, *head* or *tail* (the board is upside-down), *production*, *roll*, *scene*, *take*, *date*, etc. The [clapper board](https://en.wikipedia.org/wiki/Clapperboard)'s extracted metadata is most useful to customers involved in the movie post-production process.

- When the movie is being edited, a clapperboard is removed from the scene; however, the information that was written on the clapperboard is important. Azure AI Video Indexer extracts the data from clapperboards, preserves, and presents the metadata.
+ When the movie is being edited, a clapper board is removed from the scene; however, the information that was written on the clapper board is important. Azure AI Video Indexer extracts the data from clapper boards, preserves it, and presents the metadata.

- This article shows how to enable the post-production insight and view clapperboard instances with extracted metadata.
+ This article shows how to enable the post-production insight and view clapper board instances with extracted metadata.

## View the insight
@@ -29,16 +29,16 @@ After the file has been uploaded and indexed, if you want to view the timeline o

> [!div class="mx-imgBorder"]
> :::image type="content" source="./media/slate-detection-process/post-production-checkmark.png" alt-text="This image shows the post-production checkmark needed to view clapperboards.":::

- ### Clapperboards
+ ### Clapper boards

- Clapperboards contain fields with titles (for example, *production*, *roll*, *scene*, *take*) and values (content) associated with each title.
+ Clapper boards contain fields with titles (for example, *production*, *roll*, *scene*, *take*) and values (content) associated with each title.

- For example, take this clapperboard:
+ For example, take this clapper board:

> [!div class="mx-imgBorder"]
> :::image type="content" source="./media/slate-detection-process/clapperboard.png" alt-text="This image shows a clapperboard.":::

- In the following example the board contains the following fields:
+ In the following example, the board contains the following fields:

|title|content|
|---|---|
@@ -51,14 +51,14 @@ In the following example the board contains the following fields:

#### View the insight

- To see the instances on the website, select **Insights** and scroll to **Clapperboards**. You can hover over each clapperboard, or unfold **Show/Hide clapperboard info** and see the metadata:
+ To see the instances on the website, select **Insights** and scroll to **Clapper boards**. You can hover over each clapper board, or unfold **Show/Hide clapper board info** and see the metadata:

> [!div class="mx-imgBorder"]
> :::image type="content" source="./media/slate-detection-process/clapperboard-metadata.png" alt-text="This image shows the clapperboard metadata.":::

#### View the timeline

- If you checked the **Post-production** insight, You can also find the clapperboard instance and its timeline (includes time, fields' values) on the **Timeline** tab.
+ If you checked the **Post-production** insight, you can also find the clapper board instance and its timeline (includes time and fields' values) on the **Timeline** tab.

#### View JSON
@@ -74,21 +74,21 @@ The following table describes fields found in json:

|Name|Description|
|---|---|
- |`id`|The clapperboard ID.|
+ |`id`|The clapper board ID.|
|`thumbnailId`|The ID of the thumbnail.|
|`isHeadSlate`|The value stands for head or tail (the board is upside-down) of the clapper board: `true` or `false`.|
|`fields`|The fields found in the clapper board; also each field's name and value.|
|`instances`|A list of time ranges where this element appeared.|
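To make the table above concrete, here's a minimal, hypothetical sketch of a single clapper board entry built only from the field names listed above. The nested keys under `fields` and `instances`, and every value, are invented for illustration rather than taken from the exact schema.

```python
# Hypothetical clapper board entry -- top-level field names come from the table
# above; the nested structure and all values are assumptions for illustration.
clapperboard = {
    "id": 1,
    "thumbnailId": "cc59c2a4-0d5c-4539-a3a7-aa0be53b9d30",
    "isHeadSlate": True,  # assumed: True for head, False for tail (board upside-down)
    "fields": [
        # each field carries a title (name) and its written content (value)
        {"name": "Production", "value": "Series A"},
        {"name": "Roll", "value": "14"},
        {"name": "Scene", "value": "23"},
        {"name": "Take", "value": "2"},
    ],
    "instances": [
        # a time range where this clapper board appeared in the video
        {"start": "0:00:12.4", "end": "0:00:14.1"},
    ],
}

# Print each detected field as "title: content".
for field in clapperboard["fields"]:
    print(f'{field["name"]}: {field["value"]}')
```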
- ## Clapperboard limitations
+ ## Clapper board limitations

The values may not always be correctly identified by the detection algorithm. Here are some limitations:

- The titles of the fields appearing on the clapper board are optimized to identify the most popular fields appearing on top of clapper boards.
- Handwritten text or digital digits may not be correctly identified by the fields detection algorithm.
- The algorithm is optimized to identify fields' categories that appear horizontally.
- - The clapperboard may not be detected if the frame is blurred or that the text written on it can't be identified by the human eye.
- - Empty fields’ values may lead to to wrong fields categories.
+ - The clapper board may not be detected if the frame is blurred or if the text written on it can't be identified by the human eye.
+ - Empty fields’ values may lead to wrong field categories.

<!-- If a part of a clapper board is hidden a value with the highest confidence is shown. -->
articles/azure-video-indexer/customize-language-model-overview.md (6 additions, 6 deletions)
@@ -11,30 +11,30 @@ author: IngridAtMicrosoft

Azure AI Video Indexer supports automatic speech recognition through integration with the Microsoft [Custom Speech Service](https://azure.microsoft.com/services/cognitive-services/custom-speech-service/). You can customize the Language model by uploading adaptation text, namely text from the domain whose vocabulary you'd like the engine to adapt to. Once you train your model, new words appearing in the adaptation text will be recognized, assuming default pronunciation, and the Language model will learn new probable sequences of words. See the list of languages supported by Azure AI Video Indexer in [supported languages](language-support.md).

- Let's take a word that is highly specific, like *"Kubernetes"* (in the context of Azure Kubernetes service), as an example. Since the word is new to Azure AI Video Indexer, it is recognized as *"communities"*. You need to train the model to recognize it as *"Kubernetes"*. In other cases, the words exist, but the Language model is not expecting them to appear in a certain context. For example, *"container service"*is not a 2-word sequence that a non-specialized Language model would recognize as a specific set of words.
+ Let's take a word that is highly specific, like *"Kubernetes"* (in the context of Azure Kubernetes service), as an example. Since the word is new to Azure AI Video Indexer, it's recognized as *"communities"*. You need to train the model to recognize it as *"Kubernetes"*. In other cases, the words exist, but the Language model isn't expecting them to appear in a certain context. For example, *"container service"* isn't a two-word sequence that a nonspecialized Language model would recognize as a specific set of words.

- There are 2 ways to customize a language model:
+ There are two ways to customize a language model:

- -**Option 1**: Edit the transcript that was generated by Azure AI Video Indexer. By editing and correcting the transcript, you are training a language model to provide improved results in the future.
+ -**Option 1**: Edit the transcript that was generated by Azure AI Video Indexer. By editing and correcting the transcript, you're training a language model to provide improved results in the future.
-**Option 2**: Upload text file(s) to train the language model. The upload file can either contain a list of words as you would like them to appear in the Video Indexer transcript or the relevant words included naturally in sentences and paragraphs. As better results are achieved with the latter approach, it's recommended for the upload file to contain full sentences or paragraphs related to your content.

> [!Important]
> Do not include in the upload file the words or sentences as currently incorrectly transcribed (for example, *"communities"*) as this will negate the intended impact.
> Only include the words as you would like them to appear (for example, *"Kubernetes"*).

- You can use the Azure AI Video Indexer APIs or the website to create and edit custom Language models, as described in topics in the [Next steps](#next-steps) section of this topic.
+ You can use the Azure AI Video Indexer APIs or the website to create and edit custom Language models, as described in articles in the [Next steps](#next-steps) section of this article.

## Best practices for custom Language models

Azure AI Video Indexer learns based on probabilities of word combinations, so to learn best:

* Give enough real examples of sentences as they would be spoken.
* Put only one sentence per line, not more. Otherwise the system will learn probabilities across sentences.
- * It is okay to put one word as a sentence to boost the word against others, but the system learns best from full sentences.
+ * It's okay to put one word as a sentence to boost the word against others, but the system learns best from full sentences.
* When introducing new words or acronyms, if possible, give as many examples of usage in a full sentence to give as much context as possible to the system.
* Try to put several adaptation options, and see how they work for you.
* Avoid repetition of the exact same sentence multiple times. It may create bias against the rest of the input.
- * Avoid including uncommon symbols (~, # @ % &) as they will get discarded. The sentences in which they appear will also get discarded.
+ * Avoid including uncommon symbols (~, # @ % &) as they'll get discarded. The sentences in which they appear will also get discarded.
* Avoid putting too large inputs, such as hundreds of thousands of sentences, because doing so will dilute the effect of boosting.
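To make the best practices above concrete, here's a minimal sketch of an adaptation file: full sentences, one per line, reusing the *"Kubernetes"* and *"container service"* examples from this article. The file name and the sentences themselves are invented for illustration.

```python
# Hypothetical adaptation text: full sentences, one per line, written to a .txt
# file that can later be uploaded to a custom Language model.
adaptation_sentences = [
    "We deploy the application to Azure Kubernetes Service.",
    "Kubernetes schedules the containers across the nodes in the cluster.",
    "The container service scales out when traffic increases.",
    "Use the container service dashboard to monitor the Kubernetes cluster.",
]

with open("adaptation-sentences.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(adaptation_sentences))
```

Because each line is a complete sentence, the model can learn both the new words and the word sequences around them, which is what the best practices above recommend.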
articles/azure-video-indexer/customize-language-model-with-api.md (7 additions, 7 deletions)
@@ -14,7 +14,7 @@ Azure AI Video Indexer lets you create custom Language models to customize speec

For a detailed overview and best practices for custom Language models, see [Customize a Language model with Azure AI Video Indexer](customize-language-model-overview.md).

- You can use the Azure AI Video Indexer APIs to create and edit custom Language models in your account, as described in this topic. You can also use the website, as described in [Customize Language model using the Azure AI Video Indexer website](customize-language-model-with-api.md).
+ You can use the Azure AI Video Indexer APIs to create and edit custom Language models in your account, as described in this article. You can also use the website, as described in [Customize Language model using the Azure AI Video Indexer website](customize-language-model-with-api.md).

## Create a Language model
@@ -25,8 +25,8 @@ The [create a language model](https://api-portal.videoindexer.ai/api-details#api

To upload files to be added to the Language model, you must upload files in the body using FormData in addition to providing values for the required parameters above. There are two ways to do this task:

- * Key will be the file name and value will be the txt file.
- * Key will be the file name and value will be a URL to txt file.
+ * Key is the file name and value is the txt file.
+ * Key is the file name and value is a URL to txt file.
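As a rough sketch of the FormData upload described in the list above, the following Python call sends a .txt file whose form key is the file name. The endpoint path, query parameters, and placeholder values are assumptions based on the Create Language Model operation; check the API portal page linked above for the exact request format.

```python
import requests

# Placeholder values -- substitute your own account details and a valid access token.
location = "trial"
account_id = "<account-id>"
access_token = "<access-token>"

# Assumed endpoint shape for the Create Language Model operation; verify the exact
# path and query parameters on the API portal page linked above.
url = f"https://api.videoindexer.ai/{location}/Accounts/{account_id}/Customization/Language"
params = {
    "name": "my-language-model",
    "language": "en-US",
    "accessToken": access_token,
}

# FormData: the key is the file name, the value is the .txt file itself.
with open("adaptation-sentences.txt", "rb") as txt_file:
    files = {"adaptation-sentences.txt": txt_file}
    response = requests.post(url, params=params, files=files)

# Alternatively, the value can be a URL that points to the .txt file instead of
# the file content, as described in the second bullet above.
print(response.status_code)
print(response.json())
```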
### Response
@@ -100,7 +100,7 @@ The returned `id` is a unique ID used to distinguish between language models, wh

## Delete a Language model

- The [delete a language model](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Delete-Language-Model) API deletes a custom Language model from the specified account. Any video that was using the deleted Language model will keep the same index until you reindex the video. If you reindex the video, you can assign a new Language model to the video. Otherwise, Azure AI Video Indexer will use its default model to reindex the video.
+ The [delete a language model](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Delete-Language-Model) API deletes a custom Language model from the specified account. Any video that was using the deleted Language model keeps the same index until you reindex the video. If you reindex the video, you can assign a new Language model to the video. Otherwise, Azure AI Video Indexer uses its default model to reindex the video.

### Response
@@ -115,8 +115,8 @@ The [update a Language model](https://api-portal.videoindexer.ai/api-details#api

To upload files to be added to the Language model, you must upload files in the body using FormData in addition to providing values for the required parameters above. There are two ways to do this task:

- * Key will be the file name and value will be the txt file.
- * Key will be the file name and value will be a URL to txt file.
+ * Key is the file name and value is the txt file.
+ * Key is the file name and value is a URL to txt file.

### Response
@@ -286,7 +286,7 @@ The [download a file](https://api-portal.videoindexer.ai/api-details#api=Operati

### Response

- The response will be the download of a text file with the contents of the file in the JSON format.
+ The response is the download of a text file with the contents of the file in JSON format.