Commit 921deac

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into remove-horizon
2 parents: 8c18832 + bb10a7f


1,113 files changed (+14,789 / -8,847 lines)


.openpublishing.publish.config.json

Lines changed: 2 additions & 1 deletion
@@ -1257,6 +1257,7 @@
 "articles/stream-analytics/.openpublishing.redirection.stream-analytics.json",
 "articles/synapse-analytics/.openpublishing.redirection.synapse-analytics.json",
 "articles/virtual-machine-scale-sets/.openpublishing.redirection.virtual-machine-scale-sets.json",
-"articles/virtual-machines/.openpublishing.redirection.virtual-machines.json"
+"articles/virtual-machines/.openpublishing.redirection.virtual-machines.json",
+"articles/operator-nexus/.openpublishing.redirection.operator-nexus.json"
 ]
 }
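A quick way to confirm the new operator-nexus redirection file is actually registered in the publish config is a jq search over the whole document. This is only a sketch; it assumes nothing about the config's key names, it just walks every string in the file:

```bash
# Sketch: list every string in the publish config that mentions operator-nexus,
# to confirm the new redirection file is registered. No key names are assumed;
# ".." walks the entire JSON document.
jq -r '.. | strings | select(test("operator-nexus"))' .openpublishing.publish.config.json
```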

.openpublishing.redirection.azure-monitor.json

Lines changed: 5 additions & 0 deletions
@@ -6463,6 +6463,11 @@
 "source_path_from_root": "/articles/azure-monitor/logs/resource-expression.md",
 "redirect_url": "/azure/azure-monitor/logs/cross-workspace-query",
 "redirect_document_id": false
+},
+{
+"source_path_from_root": "/articles/azure-monitor/vm/vminsights-configure-workspace.md",
+"redirect_url": "/azure/azure-monitor/vm/vminsights-enable-overview",
+"redirect_document_id": false
 }
 ]
 }

.openpublishing.redirection.json

Lines changed: 23 additions & 2 deletions
@@ -2770,6 +2770,11 @@
 "redirect_url": "/azure/container-registry/tutorial-customer-managed-keys",
 "redirect_document_id": false
 },
+{
+"source_path_from_root": "/articles/container-registry/container-registry-enable-conditional-access-policy.md",
+"redirect_url": "/azure/container-registry/container-registry-configure-conditional-access.md",
+"redirect_document_id": false
+},
 {
 "source_path": "articles/site-recovery/vmware-physical-secondary-disaster-recovery.md",
 "redirect_url": "/azure/site-recovery/vmware-physical-secondary-architecture",

@@ -3101,8 +3106,8 @@
 "redirect_document_id": false
 },
 {
-"source_path": "active-directory/authentication/concept-mfa-plan.md",
-"redirect_url": "active-directory/authentication/howto-mfa-getstarted",
+"source_path": "articles/active-directory/authentication/concept-mfa-plan.md",
+"redirect_url": "/entra/identity/authentication/howto-mfa-getstarted",
 "redirect_document_id": false
 },
 {

@@ -9622,6 +9627,11 @@
 "redirect_url": "/azure/azure-functions/functions-reference-python?pivots=python-mode-decorators#triggers-and-inputs",
 "redirect_document_id": false
 },
+{
+"source_path_from_root": "/articles/azure-functions/update-java-versions.md",
+"redirect_url": "/azure/azure-functions/update-language-versions",
+"redirect_document_id": false
+},
 {
 "source_path_from_root": "/articles/azure-government/documentation-government-k8.md",
 "redirect_url": "/azure/azure-government",

@@ -15392,6 +15402,11 @@
 "redirect_url": "/azure/event-grid/scripts/event-grid-cli-subscribe-custom-topic",
 "redirect_document_id": false
 },
+{
+"source_path_from_root": "/articles/ai-studio/how-to/deploy-models.md",
+"redirect_URL": "/azure/ai-studio/concepts/deployments-overview",
+"redirect_document_id": true
+},
 {
 "source_path_from_root": "/articles/notebooks/use-machine-learning-services-jupyter-notebooks.md",
 "redirect_url": "/azure/machine-learning/samples-notebooks",

@@ -23447,6 +23462,11 @@
 "redirect_url": "/azure/lighthouse/concepts/architecture",
 "redirect_document_id": true
 },
+{
+"source_path_from_root": "/articles/lighthouse/how-to/partner-earned-credit.md",
+"redirect_url": "/azure/cost-management-billing/manage/link-partner-id",
+"redirect_document_id": false
+},
 {
 "source_path_from_root": "/articles/service-fabric-mesh/index.yml",
 "redirect_url": "/previous-versions/azure/service-fabric-mesh/service-fabric-mesh-overview",

@@ -25779,5 +25799,6 @@
 "redirect_url": "https://azure.microsoft.com/updates/preview-ai-toolchain-operator-addon-for-aks/",
 "redirect_document_id": false
 }
+
 ]
 }
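Each entry in these redirection files maps a retired source path to its new target URL. To spot-check a batch of redirects like the ones added above, a jq query can print source-to-target pairs. This is a sketch only: it assumes the top-level array is named `redirections`, which is the usual layout for these files but isn't visible in the hunks shown here.

```bash
# Sketch: print "source -> target" for every redirect that touches azure-functions.
# Assumes the top-level array is called "redirections"; adjust if the file differs.
jq -r '.redirections[]
       | select((.source_path_from_root // .source_path) | test("azure-functions"))
       | "\(.source_path_from_root // .source_path) -> \(.redirect_url)"' \
  .openpublishing.redirection.json
```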

articles/active-directory-b2c/configure-tokens.md

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@ You can configure the token lifetime, including:
 
 - **Access and ID token lifetimes (minutes)** - The lifetime of the OAuth 2.0 bearer token and ID tokens. The default is 60 minutes (1 hour). The minimum (inclusive) is 5 minutes. The maximum (inclusive) is 1,440 minutes (24 hours).
 - **Refresh token lifetime (days)** - The maximum time period before which a refresh token can be used to acquire a new access token, if your application had been granted the `offline_access` scope. The default is 14 days. The minimum (inclusive) is one day. The maximum (inclusive) 90 days.
-- **Refresh token sliding window lifetime** - The refresh token sliding window type. `Bounded` indicates that the refresh token can be extended as specify in the **Lifetime length (days)**. `No expiry` indicates that the refresh token sliding window lifetime never expires.
+- **Refresh token sliding window lifetime** - The refresh token sliding window type. `Bounded` indicates that the refresh token can be extended as specified in the **Lifetime length (days)**. `No expiry` indicates that the refresh token sliding window lifetime never expires.
 - **Lifetime length (days)** - After this time period elapses the user is forced to reauthenticate, irrespective of the validity period of the most recent refresh token acquired by the application. The value must be greater than or equal to the **Refresh token lifetime** value.
 
 The following diagram shows the refresh token sliding window lifetime behavior.

articles/ai-services/computer-vision/concept-object-detection-40.md

Lines changed: 3 additions & 0 deletions
@@ -24,6 +24,9 @@ Try out the capabilities of object detection quickly and easily in your browser
 > [!div class="nextstepaction"]
 > [Try Vision Studio](https://portal.vision.cognitive.azure.com/)
 
+> [!TIP]
+> You can use the Object detection feature through the [Azure OpenAI](/azure/ai-services/openai/overview) service. The **GPT-4 Turbo with Vision** model lets you chat with an AI assistant that can analyze the images you share, and the Vision Enhancement option uses Image Analysis to give the AI assistance more details (readable text and object locations) about the image. For more information, see the [GPT-4 Turbo with Vision quickstart](/azure/ai-services/openai/gpt-v-quickstart).
+
 ## Object detection example
 
 The following JSON response illustrates what the Analysis 4.0 API returns when detecting objects in the example image.
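For orientation, the object detection results this article describes come from the Image Analysis 4.0 `imageanalysis:analyze` REST operation with the `objects` feature. The call below is only a hedged sketch: `<endpoint>`, `<key>`, and `<api-version>` are placeholders, not values from this commit, and the exact parameter values should be checked against the Image Analysis 4.0 reference.

```bash
# Hypothetical sketch of an Image Analysis 4.0 object detection request.
# <endpoint>, <key>, and <api-version> are placeholders to fill in yourself.
curl -X POST "https://<endpoint>/computervision/imageanalysis:analyze?features=objects&api-version=<api-version>" \
  -H "Ocp-Apim-Subscription-Key: <key>" \
  -H "Content-Type: application/json" \
  -d '{"url":"https://learn.microsoft.com/azure/ai-services/computer-vision/images/windows-kitchen.jpg"}'
```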

articles/ai-services/computer-vision/concept-ocr.md

Lines changed: 3 additions & 0 deletions
@@ -24,6 +24,9 @@ OCR traditionally started as a machine-learning-based technique for extracting t
 
 The new Computer Vision Image Analysis 4.0 REST API offers the ability to extract printed or handwritten text from images in a unified performance-enhanced synchronous API that makes it easy to get all image insights including OCR results in a single API operation. The Read OCR engine is built on top of multiple deep learning models supported by universal script-based models for [global language support](./language-support.md).
 
+> [!TIP]
+> You can use the OCR feature through the [Azure OpenAI](/azure/ai-services/openai/overview) service. The **GPT-4 Turbo with Vision** model lets you chat with an AI assistant that can analyze the images you share, and the Vision Enhancement option uses Image Analysis to give the AI assistance more details (readable text and object locations) about the image. For more information, see the [GPT-4 Turbo with Vision quickstart](/azure/ai-services/openai/gpt-v-quickstart).
+
 ## Text extraction example
 
 The following JSON response illustrates what the Image Analysis 4.0 API returns when extracting text from the given image.
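As with object detection, the unified synchronous call mentioned above maps to the same `imageanalysis:analyze` operation, this time with the `read` feature selected. Again a hedged sketch with placeholder endpoint, key, and API version; this variant uploads local image bytes instead of passing a URL:

```bash
# Hypothetical sketch of an Image Analysis 4.0 OCR (Read) request with a local file.
# <endpoint>, <key>, and <api-version> are placeholders; my-image.png is a made-up file name.
curl -X POST "https://<endpoint>/computervision/imageanalysis:analyze?features=read&api-version=<api-version>" \
  -H "Ocp-Apim-Subscription-Key: <key>" \
  -H "Content-Type: application/octet-stream" \
  --data-binary "@my-image.png"
```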

articles/ai-services/computer-vision/how-to/background-removal.md

Lines changed: 9 additions & 4 deletions
@@ -38,6 +38,7 @@ To authenticate against the Image Analysis service, you need an Azure AI Vision
 
 The SDK example assumes that you defined the environment variables `VISION_KEY` and `VISION_ENDPOINT` with your key and endpoint.
 
+<!--
 #### [C#](#tab/csharp)
 
 Start by creating a [VisionServiceOptions](/dotnet/api/azure.ai.vision.common.visionserviceoptions) object using one of the constructors. For example:

@@ -67,15 +68,16 @@ Where we used this helper function to read the value of an environment variable:
 [!code-cpp[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/cpp/image-analysis/how-to/how-to.cpp?name=get_env_var)]
 
 #### [REST API](#tab/rest)
+-->
 
 Authentication is done by adding the HTTP request header **Ocp-Apim-Subscription-Key** and setting it to your vision key. The call is made to the URL `https://<endpoint>/computervision/imageanalysis:segment?api-version=2023-02-01-preview`, where `<endpoint>` is your unique Azure AI Vision endpoint URL. See [Select a mode ](./background-removal.md#select-a-mode) section for another query string you add to this URL.
 
----
 
 ## Select the image to analyze
 
 The code in this guide uses remote images referenced by URL. You may want to try different images on your own to see the full capability of the Image Analysis features.
 
+<!--
 #### [C#](#tab/csharp)
 
 Create a new **VisionSource** object from the URL of the image you want to analyze, using the static constructor [VisionSource.FromUrl](/dotnet/api/azure.ai.vision.common.visionsource.fromurl).

@@ -117,15 +119,17 @@ Create a new **VisionSource** object from the URL of the image you want to analy
 > You can also analyze a local image by passing in the full-path image file name (see [VisionSource::FromFile](/cpp/cognitive-services/vision/input-visionsource#fromfile)), or by copying the image into the SDK's input buffer (see [VisionSource::FromImageSourceBuffer](/cpp/cognitive-services/vision/input-visionsource#fromimagesourcebuffer)). For more details, see [Call the Analyze API](./call-analyze-image-40.md?pivots=programming-language-cpp#select-the-image-to-analyze).
 
 #### [REST API](#tab/rest)
+-->
 
 When analyzing a remote image, you specify the image's URL by formatting the request body like this: `{"url":"https://learn.microsoft.com/azure/ai-services/computer-vision/images/windows-kitchen.jpg"}`. The **Content-Type** should be `application/json`.
 
 To analyze a local image, you'd put the binary image data in the HTTP request body. The **Content-Type** should be `application/octet-stream` or `multipart/form-data`.
 
----
+
 
 ## Select a mode
 
+<!--
 ### [C#](#tab/csharp)
 
 Create a new [ImageAnalysisOptions](/dotnet/api/azure.ai.vision.imageanalysis.imageanalysisoptions) object and set the property [SegmentationMode](/dotnet/api/azure.ai.vision.imageanalysis.imageanalysisoptions.segmentationmode#azure-ai-vision-imageanalysis-imageanalysisoptions-segmentationmode). This property must be set if you want to do segmentation. See [ImageSegmentationMode](/dotnet/api/azure.ai.vision.imageanalysis.imagesegmentationmode) for supported values.

@@ -151,6 +155,7 @@ Create a new [ImageAnalysisOptions](/cpp/cognitive-services/vision/imageanalysis
 [!code-cpp[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/cpp/image-analysis/segmentation/segmentation.cpp?name=segmentation_mode)]
 
 ### [REST](#tab/rest)
+-->
 
 Set the query string *mode** to one of these two values. This query string is mandatory if you want to do image segmentation.
 

@@ -161,12 +166,12 @@ Set the query string *mode** to one of these two values. This query string is ma
 
 A populated URL for backgroundRemoval would look like this: `https://<endpoint>/computervision/imageanalysis:segment?api-version=2023-02-01-preview&mode=backgroundRemoval`
 
----
 
 ## Get results from the service
 
 This section shows you how to make the API call and parse the results.
 
+<!--
 #### [C#](#tab/csharp)
 
 The following code calls the Image Analysis API and saves the resulting segmented image to a file named **output.png**. It also displays some metadata about the segmented image.

@@ -196,10 +201,10 @@ The following code calls the Image Analysis API and saves the resulting segmente
 [!code-cpp[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/cpp/image-analysis/segmentation/segmentation.cpp?name=segment)]
 
 #### [REST](#tab/rest)
+-->
 
 The service returns a `200` HTTP response on success with `Content-Type: image/png`, and the body contains the returned PNG image in the form of a binary stream.
 
----
 
 As an example, assume background removal is run on the following image:
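Pulling the REST pieces of this article together (the **Ocp-Apim-Subscription-Key** header, the `imageanalysis:segment` URL with `api-version=2023-02-01-preview`, the `mode=backgroundRemoval` query string, the JSON request body, and the binary PNG response), a complete call might look like the sketch below. The `<endpoint>` and `<key>` placeholders are assumptions to fill in, not values from this commit.

```bash
# Hedged sketch of the background removal REST call described in this article.
# <endpoint> and <key> are placeholders for your Azure AI Vision resource.
curl -X POST "https://<endpoint>/computervision/imageanalysis:segment?api-version=2023-02-01-preview&mode=backgroundRemoval" \
  -H "Ocp-Apim-Subscription-Key: <key>" \
  -H "Content-Type: application/json" \
  -d '{"url":"https://learn.microsoft.com/azure/ai-services/computer-vision/images/windows-kitchen.jpg"}' \
  --output output.png   # success returns 200 with Content-Type: image/png
```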

articles/ai-services/computer-vision/how-to/model-customization.md

Lines changed: 1 addition & 1 deletion
@@ -473,4 +473,4 @@ The API call returns an **ImageAnalysisResult** JSON object, which contains all
 In this guide, you created and trained a custom image classification model using Image Analysis. Next, learn more about the Analyze Image 4.0 API, so you can call your custom model from an application using REST or library SDKs.
 
 * See the [Model customization concepts](../concept-model-customization.md) guide for a broad overview of this feature and a list of frequently asked questions.
-* [Call the Analyze Image API](./call-analyze-image-40.md). Note the sections [Set model name when using a custom model](./call-analyze-image-40.md#set-model-name-when-using-a-custom-model) and [Get results using custom model](./call-analyze-image-40.md#get-results-using-custom-model).
+* [Call the Analyze Image API](./call-analyze-image-40.md). <!--Note the sections [Set model name when using a custom model](./call-analyze-image-40.md#set-model-name-when-using-a-custom-model) and [Get results using custom model](./call-analyze-image-40.md#get-results-using-custom-model).-->

articles/ai-services/computer-vision/how-to/shelf-model-customization.md

Lines changed: 1 addition & 2 deletions
@@ -46,15 +46,14 @@ When your custom model is trained and ready (you've completed the steps in the [
 The API call will look like this:
 
 ```bash
-curl.exe -H "Ocp-Apim-Subscription-Key: <subscriptionKey>" -H "Content-Type: application/json" "https://<endpoint>/computervision/productrecognition/ms-pretrained-product-detection/models/<your_model_name>/runs/<your_run_name>?api-version=2023-04-01-preview" -d "{
+curl.exe -H "Ocp-Apim-Subscription-Key: <subscriptionKey>" -H "Content-Type: application/json" "https://<endpoint>/computervision/productrecognition/ms-pretrained-product-detection/runs/<your_run_name>?api-version=2023-04-01-preview" -d "{
 'url':'<your_url_string>'
 }"
 ```
 
 1. Make the following changes in the command where needed:
 1. Replace the `<subscriptionKey>` with your Vision resource key.
 1. Replace the `<endpoint>` with your Vision resource endpoint. For example: `https://YourResourceName.cognitiveservices.azure.com`.
-1. Replace the `<your_model_name>` with your unique custom model name. This will be the name of the customized model you have trained with your own data. For example, `.../models/mymodel1/runs/...`
 2. Replace the `<your_run_name>` with your unique test run name for the task queue. It is an async API task queue name for you to be able retrieve the API response later. For example, `.../runs/test1?api-version...`
 1. Replace the `<your_url_string>` contents with the blob URL of the image
 1. Open a command prompt window.
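The net effect of this change is that the model name is no longer part of the run URL; only the run name segment remains. With the remaining placeholders filled in with purely hypothetical values, the updated call would look something like this sketch:

```bash
# Hypothetical values for illustration only: the resource name, run name, and blob
# URL are made up; substitute your own. Note the path no longer contains
# /models/<your_model_name>/ after this change.
curl.exe -H "Ocp-Apim-Subscription-Key: <subscriptionKey>" -H "Content-Type: application/json" "https://my-vision-resource.cognitiveservices.azure.com/computervision/productrecognition/ms-pretrained-product-detection/runs/test1?api-version=2023-04-01-preview" -d "{
'url':'https://mystorageaccount.blob.core.windows.net/images/shelf.jpg'
}"
```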

articles/ai-services/computer-vision/includes/custom-vision-ia-compare.md

Lines changed: 4 additions & 3 deletions
@@ -19,6 +19,7 @@ ms.author: pafarley
 |Web Portal |[Customvision.ai](https://www.customvision.ai/) |[Vision Studio](https://portal.vision.cognitive.azure.com/gallery/featured) |
 |Libraries |REST, SDK |REST, Python Sample |
 |Minimum training data needed |15 images per category |2-5 images per category |
-|Training data storage |Uploaded to service |In customer’s blob storage account |
-|Model hosting |Cloud and Edge |Cloud hosting only, Edge container hosting to come |
-|AI quotient (AIQ) | <table ><col ><col ></colgroup><thead><tr><th>context</th><th>IC</br>(top-1 accuracy, 22 datasets)</th><th>OD</br>(mAP@50, 59 datasets)</th></tr></thead><tbody><tr><td>0 shot</td><td>N/A</td><td>N/A</td></tr><tr><td>2 shot</td><td>51.47</td><td>33.3</td></tr><tr><td>3 shot</td><td>56.73</td><td>37.0</td></tr><tr><td>5 shot</td><td>63.01</td><td>43.4</td></tr><tr><td>10 shot</td><td>68.95</td><td>54.0</td></tr><tr><td>full</td><td>85.25</td><td>76.6</td></tr></tbody></table> | <table ><col ><col ></colgroup><thead><tr><th>context</th><th>IC</br>(top-1 accuracy, 22 datasets)</th><th>OD</br>(mAP@50, 59 datasets)</th></tr></thead><tbody><tr><td>0 shot</td><td>57.8</td><td>27.1</td></tr><tr><td>2 shot</td><td>73.02</td><td>49.2</td></tr><tr><td>3 shot</td><td>75.51</td><td>61.1</td></tr><tr><td>5 shot</td><td>79.14</td><td>68.2</td></tr><tr><td>10 shot</td><td>81.31</td><td>75.0</td></tr><tr><td>full</td><td>90.98</td><td>85.4</td></tr></tbody></table> |
+|Training data storage |Uploaded to service |Customer’s blob storage account |
+|Model hosting |Cloud and edge |Cloud hosting only, edge container hosting to come |
+|AI quality | <table ><col ><col ></colgroup><thead><tr><th>context</th><th>IC</br>(top-1 accuracy, 22 datasets)</th><th>OD</br>(mAP@50, 59 datasets)</th></tr></thead><tbody><tr><td>0 shot</td><td>N/A</td><td>N/A</td></tr><tr><td>2 shot</td><td>51.47</td><td>33.3</td></tr><tr><td>3 shot</td><td>56.73</td><td>37.0</td></tr><tr><td>5 shot</td><td>63.01</td><td>43.4</td></tr><tr><td>10 shot</td><td>68.95</td><td>54.0</td></tr><tr><td>full</td><td>85.25</td><td>76.6</td></tr></tbody></table> | <table ><col ><col ></colgroup><thead><tr><th>context</th><th>IC</br>(top-1 accuracy, 22 datasets)</th><th>OD</br>(mAP@50, 59 datasets)</th></tr></thead><tbody><tr><td>0 shot</td><td>57.8</td><td>27.1</td></tr><tr><td>2 shot</td><td>73.02</td><td>49.2</td></tr><tr><td>3 shot</td><td>75.51</td><td>61.1</td></tr><tr><td>5 shot</td><td>79.14</td><td>68.2</td></tr><tr><td>10 shot</td><td>81.31</td><td>75.0</td></tr><tr><td>full</td><td>90.98</td><td>85.4</td></tr></tbody></table> |
+| Pricing | [Custom Vision pricing](/pricing/details/cognitive-services/custom-vision-service/) | [Image Analysis pricing](/pricing/details/cognitive-services/computer-vision/) |
