
Commit 00da560

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into vnet-peer-update-2
2 parents 310077a + 63c2c02

771 files changed, with 8,814 additions and 7,461 deletions.

Large commits have some content hidden by default, so only a subset of the 771 changed files is shown below.


.openpublishing.redirection.azure-monitor.json

Lines changed: 6 additions & 4 deletions
@@ -6738,9 +6738,11 @@
       "source_path_from_root": "/articles/azure-monitor/essentials/pipeline-overview.md",
       "redirect_url": "/azure/azure-monitor/essentials/data-collection-rule-overview",
       "redirect_document_id": false
-    }
-
-
-
+    },
+    {
+      "source_path_from_root": "/articles/azure-monitor/agents/azure-monitor-agent-migration-tools.md",
+      "redirect_url": "/azure/azure-monitor/agents/azure-monitor-agent-migration",
+      "redirect_document_id": false
+    }
   ]
 }

.openpublishing.redirection.json

Lines changed: 26 additions & 1 deletion
@@ -3043,6 +3043,11 @@
       "redirect_url": "/sql/sql-server/stretch-database/stretch-database-databases-and-tables-stretch-database-advisor",
       "redirect_document_id": false
     },
+    {
+      "source_path_from_root": "/articles/dms/tutorial-azure-postgresql-to-azure-postgresql-online-portal.md",
+      "redirect_url": "/azure/postgresql/migrate/migration-service/tutorial-migration-service-single-to-flexible",
+      "redirect_document_id": false
+    },
     {
       "source_path_from_root": "/articles/vs-azure-tools-access-private-azure-clouds-with-visual-studio.md",
       "redirect_url": "/visualstudio/azure/vs-azure-tools-access-private-azure-clouds-with-visual-studio",
@@ -4089,7 +4094,27 @@
       "redirect_document_id": false
     },
     {
-      "source_path_from_root": "/articles/data-factory/continuous-integration-delivery-automate-github-actions.md",
+      "source_path_from_root": "/articles/openshift/tutorial-create-cluster.md",
+      "redirect_url": "/azure/openshift/create-cluster",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/openshift/tutorial-connect-cluster.md",
+      "redirect_url": "/azure/openshift/connect-cluster",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/openshift/tutorial-delete-cluster.md",
+      "redirect_url": "/azure/openshift/delete-cluster",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/openshift/quickstart-portal.md",
+      "redirect_url": "/azure/openshift/create-cluster",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/data-factory/continuous-integration-delivery-automate-github-actions.md",
       "redirect_url": "/azure",
       "redirect_document_id": false
     }
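Every redirection entry above follows the same three-key shape (`source_path_from_root`, `redirect_url`, `redirect_document_id`). As a quick pre-PR sanity check — a hedged sketch, not part of the docs build tooling — the script below flags missing keys and duplicate source paths. It assumes the entries sit under a top-level `redirections` array, which the closing `]` and `}` context lines in the first file suggest.

```python
import json
from collections import Counter

REQUIRED_KEYS = {"source_path_from_root", "redirect_url", "redirect_document_id"}

def check_redirections(path: str) -> list[str]:
    """Return human-readable problems found in a redirection file."""
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)["redirections"]  # assumed top-level key

    problems = []
    for index, entry in enumerate(entries):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            problems.append(f"entry {index}: missing keys {sorted(missing)}")

    # Each source path should be redirected only once.
    counts = Counter(entry.get("source_path_from_root") for entry in entries)
    problems.extend(
        f"duplicate source_path_from_root: {source}"
        for source, count in counts.items()
        if source and count > 1
    )
    return problems

if __name__ == "__main__":
    for problem in check_redirections(".openpublishing.redirection.json"):
        print(problem)
```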

articles/ai-services/computer-vision/how-to/shelf-analyze.md

Lines changed: 37 additions & 49 deletions
@@ -1,7 +1,7 @@
 ---
 title: Analyze a shelf image using pretrained models
 titleSuffix: Azure AI services
-description: Use the Product Understanding API to analyze a shelf image and receive rich product data.
+description: Use the Product Recognition API to analyze a shelf image and receive rich product data.
 author: PatrickFarley
 manager: nitinme
 ms.service: azure-ai-vision
@@ -13,7 +13,7 @@ ms.custom: build-2023, build-2023-dataai

 # Shelf Product Recognition (preview): Analyze shelf images using pretrained model

-The fastest way to start using Product Recognition is to use the built-in pretrained AI models. With the Product Understanding API, you can upload a shelf image and get the locations of products and gaps.
+The fastest way to start using Product Recognition is to use the built-in pretrained AI models. With the Product Recognition API, you can upload a shelf image and get the locations of products and gaps.

 :::image type="content" source="../media/shelf/shelf-analysis-pretrained.png" alt-text="Photo of a retail shelf with products and gaps highlighted with rectangles.":::

@@ -51,62 +51,50 @@ To analyze a shelf image, do the following steps:

 ## Examine the response

-A successful response is returned in JSON. The product understanding API results are returned in a `ProductUnderstandingResultApiModel` JSON field:
+A successful response is returned in JSON. The product recognition API results are returned in a `ProductRecognitionResultApiModel` JSON field:

 ```json
-{
-  "imageMetadata": {
-    "width": 2000,
-    "height": 1500
-  },
-  "products": [
-    {
-      "id": "string",
-      "boundingBox": {
-        "x": 1234,
-        "y": 1234,
-        "w": 12,
-        "h": 12
-      },
-      "classifications": [
-        {
-          "confidence": 0.9,
-          "label": "string"
-        }
-      ]
-    }
+"ProductRecognitionResultApiModel": {
+  "description": "Results from the product understanding operation.",
+  "required": [
+    "gaps",
+    "imageMetadata",
+    "products"
   ],
-  "gaps": [
-    {
-      "id": "string",
-      "boundingBox": {
-        "x": 1234,
-        "y": 1234,
-        "w": 123,
-        "h": 123
-      },
-      "classifications": [
-        {
-          "confidence": 0.8,
-          "label": "string"
-        }
-      ]
+  "type": "object",
+  "properties": {
+    "imageMetadata": {
+      "$ref": "#/definitions/ImageMetadataApiModel"
+    },
+    "products": {
+      "description": "Products detected in the image.",
+      "type": "array",
+      "items": {
+        "$ref": "#/definitions/DetectedObject"
+      }
+    },
+    "gaps": {
+      "description": "Gaps detected in the image.",
+      "type": "array",
+      "items": {
+        "$ref": "#/definitions/DetectedObject"
+      }
     }
-  ]
+  }
 }
 ```

 See the following sections for definitions of each JSON field.

-### Product Understanding Result API model
+### Product Recognition Result API model

-Results from the product understanding operation.
+Results from the product recognition operation.

 | Name | Type | Description | Required |
 | ---- | ---- | ----------- | -------- |
 | `imageMetadata` | [ImageMetadataApiModel](#image-metadata-api-model) | The image metadata information such as height, width and format. | Yes |
-| `products` |[DetectedObjectApiModel](#detected-object-api-model) | Products detected in the image. | Yes |
-| `gaps` | [DetectedObjectApiModel](#detected-object-api-model) | Gaps detected in the image. | Yes |
+| `products` |[DetectedObject](#detected-object-api-model) | Products detected in the image. | Yes |
+| `gaps` | [DetectedObject](#detected-object-api-model) | Gaps detected in the image. | Yes |

 ### Image Metadata API model

@@ -124,8 +112,8 @@ Describes a detected object in an image.
 | Name | Type | Description | Required |
 | ---- | ---- | ----------- | -------- |
 | `id` | string | ID of the detected object. | No |
-| `boundingBox` | [BoundingBoxApiModel](#bounding-box-api-model) | A bounding box for an area inside an image. | Yes |
-| `classifications` | [ImageClassificationApiModel](#image-classification-api-model) | Classification confidences of the detected object. | Yes |
+| `boundingBox` | [BoundingBox](#bounding-box-api-model) | A bounding box for an area inside an image. | Yes |
+| `tags` | [TagsApiModel](#image-tags-api-model) | Classification confidences of the detected object. | Yes |

 ### Bounding Box API model

@@ -138,18 +126,18 @@ A bounding box for an area inside an image.
 | `w` | integer | Width measured from the top-left point of the area, in pixels. | Yes |
 | `h` | integer | Height measured from the top-left point of the area, in pixels. | Yes |

-### Image Classification API model
+### Image Tags API model

 Describes the image classification confidence of a label.

 | Name | Type | Description | Required |
 | ---- | ---- | ----------- | -------- |
 | `confidence` | float | Confidence of the classification prediction. | Yes |
-| `label` | string | Label of the classification prediction. | Yes |
+| `name` | string | Label of the classification prediction. | Yes |

 ## Next steps

-In this guide, you learned how to make a basic analysis call using the pretrained Product Understanding REST API. Next, learn how to use a custom Product Recognition model to better meet your business needs.
+In this guide, you learned how to make a basic analysis call using the pretrained Product Recognition REST API. Next, learn how to use a custom Product Recognition model to better meet your business needs.

 > [!div class="nextstepaction"]
 > [Train a custom model for Product Recognition](../how-to/shelf-model-customization.md)
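The renamed fields in this diff (`DetectedObject` items carrying a `boundingBox` and a `tags` list whose items expose `name` and `confidence`) are straightforward to consume once a response is in hand. The sketch below is illustrative only: the sample payload is invented, not a real service response, so verify the exact envelope against the preview REST reference for your API version.

```python
# Minimal consumption sketch using the field names from the tables above.
# The payload is a made-up example, not captured API output.

sample_result = {
    "imageMetadata": {"width": 2000, "height": 1500},
    "products": [
        {
            "id": "01",
            "boundingBox": {"x": 120, "y": 340, "w": 85, "h": 210},
            "tags": [{"name": "product", "confidence": 0.92}],
        }
    ],
    "gaps": [
        {
            "id": "02",
            "boundingBox": {"x": 410, "y": 340, "w": 70, "h": 210},
            "tags": [{"name": "gap", "confidence": 0.81}],
        }
    ],
}

def summarize(result: dict) -> None:
    """Print each detected object with its highest-confidence tag and pixel box."""
    for kind in ("products", "gaps"):
        for obj in result.get(kind, []):
            box = obj["boundingBox"]
            best = max(obj.get("tags", []), key=lambda t: t["confidence"], default=None)
            label = f'{best["name"]} ({best["confidence"]:.2f})' if best else "unlabeled"
            print(f'{kind[:-1]} {obj.get("id", "?")}: {label} at '
                  f'x={box["x"]}, y={box["y"]}, w={box["w"]}, h={box["h"]}')

summarize(sample_result)
```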

articles/ai-services/computer-vision/overview-image-analysis.md

Lines changed: 0 additions & 5 deletions
@@ -142,11 +142,6 @@ To use the Image Analysis APIs, you must create your Azure AI Vision resource in
 | Japan East || | || |


-
-### Query rates
-
-tbd
-
 ## Data privacy and security

 As with all of the Azure AI services, developers using the Azure AI Vision service should be aware of Microsoft's policies on customer data. See the [Azure AI services page](https://www.microsoft.com/trustcenter/cloudservices/cognitiveservices) on the Microsoft Trust Center to learn more.

articles/ai-services/openai/how-to/quota.md

Lines changed: 4 additions & 1 deletion
@@ -7,7 +7,7 @@ author: mrbullwinkle
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: how-to
-ms.date: 05/31/2024
+ms.date: 06/18/2024
 ms.author: mbullwin
 ---

@@ -91,6 +91,9 @@ As each request is received, Azure OpenAI computes an estimated max processed-to

 As requests come into the deployment endpoint, the estimated max-processed-token count is added to a running token count of all requests that is reset each minute. If at any time during that minute, the TPM rate limit value is reached, then further requests will receive a 429 response code until the counter resets.

+> [!IMPORTANT]
+> The token count used in the rate limit calculation is an estimate based in part on the character count of the API request. The rate limit token estimate is not the same as the token calculation that is used for billing/determining that a request is below a model's input token limit. Due to the approximate nature of the rate limit token calculation, it is expected behavior that a rate limit can be triggered prior to what might be expected in comparison to an exact token count measurement for each request.
+
 RPM rate limits are based on the number of requests received over time. The rate limit expects that requests be evenly distributed over a one-minute period. If this average flow isn't maintained, then requests may receive a 429 response even though the limit isn't met when measured over the course of a minute. To implement this behavior, Azure OpenAI Service evaluates the rate of incoming requests over a small period of time, typically 1 or 10 seconds. If the number of requests received during that time exceeds what would be expected at the set RPM limit, then new requests will receive a 429 response code until the next evaluation period. For example, if Azure OpenAI is monitoring request rate on 1-second intervals, then rate limiting will occur for a 600-RPM deployment if more than 10 requests are received during each 1-second period (600 requests per minute = 10 requests per second).

 ### Rate limit best practices

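Because the rate-limit token count is an estimate derived partly from character count, callers can hit 429 responses sooner than a local tokenizer would predict; the usual mitigation is to honor the `retry-after` hint (when the service returns one) and back off. The sketch below is a generic client-side pattern, not official SDK guidance; the endpoint, deployment, and API version in the commented example are placeholders.

```python
import time
import requests  # any HTTP client works; requests is assumed here for brevity

def post_with_backoff(url: str, headers: dict, payload: dict, max_retries: int = 5):
    """POST to an endpoint, backing off whenever a 429 (throttled) response comes back."""
    delay = 1.0
    for _ in range(max_retries):
        response = requests.post(url, headers=headers, json=payload, timeout=60)
        if response.status_code != 429:
            response.raise_for_status()
            return response.json()
        # Prefer the service's retry-after hint when it is present.
        retry_after = response.headers.get("retry-after")
        wait = float(retry_after) if retry_after else delay
        time.sleep(wait)
        delay = min(delay * 2, 60)  # exponential backoff, capped at one minute
    raise RuntimeError(f"Still throttled after {max_retries} attempts")

# Example call shape (resource name, deployment, key, and api-version are placeholders):
# result = post_with_backoff(
#     "https://<resource>.openai.azure.com/openai/deployments/<deployment>/chat/completions?api-version=<version>",
#     headers={"api-key": "<key>", "Content-Type": "application/json"},
#     payload={"messages": [{"role": "user", "content": "Hello"}]},
# )
```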
articles/ai-services/openai/how-to/working-with-models.md

Lines changed: 2 additions & 2 deletions
@@ -4,7 +4,7 @@ titleSuffix: Azure OpenAI
 description: Learn about managing model deployment life cycle, updates, & retirement.
 ms.service: azure-ai-openai
 ms.topic: conceptual
-ms.date: 10/04/2023
+ms.date: 06/18/2023
 ms.custom: references_regions, build-2023, build-2023-dataai, devx-track-azurepowershell
 manager: nitinme
 author: mrbullwinkle #ChrisHMSFT
@@ -75,7 +75,7 @@ There are three distinct model deployment upgrade options:
 |`NoAutoUpgrade` | The model deployment never automatically upgrades. Once the retirement date is reached the model deployment stops working. You need to update your code referencing that deployment to point to a nonexpired model deployment. |

 > [!NOTE]
-> `null` is equivalent to `AutoUpgradeWhenExpired`. If the **Version update policy** option is not present in the properties for a model that supports model upgrades this indicates the value is currently `null`. Once you explicitly modify this value, the property is visible in the studio properties page as well as via the REST API.
+> `null` is equivalent to `OnceCurrentVersionExpired`. If the **Version update policy** option is not present in the properties for a model that supports model upgrades this indicates the value is currently `null`. Once you explicitly modify this value, the property is visible in the studio properties page as well as via the REST API.

 ### Examples

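A small sketch of what the corrected note means in practice when reading deployment properties: a missing or `null` **Version update policy** should be treated as `OnceCurrentVersionExpired`. The `versionUpgradeOption` key name follows how the deployment payload is usually described; treat the surrounding shape as an assumption and verify against the current REST reference.

```python
def effective_upgrade_policy(deployment_properties: dict) -> str:
    """Resolve the effective version-upgrade policy for a model deployment.

    Per the note above, a missing or null value behaves like
    'OnceCurrentVersionExpired' until it is explicitly set.
    """
    value = deployment_properties.get("versionUpgradeOption")  # assumed key name
    return value if value else "OnceCurrentVersionExpired"

# Illustrative payloads, not real API responses:
print(effective_upgrade_policy({"model": {"name": "gpt-35-turbo"}}))        # OnceCurrentVersionExpired
print(effective_upgrade_policy({"versionUpgradeOption": "NoAutoUpgrade"}))  # NoAutoUpgrade
```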

articles/ai-services/openai/includes/model-matrix/provisioned-models.md

Lines changed: 3 additions & 3 deletions
@@ -5,15 +5,15 @@ description: PTU-managed model availability by region.
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: include
-ms.date: 06/11/2024
+ms.date: 06/18/2024
 ---

 | **Region** | **gpt-4**, **0613** | **gpt-4**, **1106-Preview** | **gpt-4**, **0125-Preview** | **gpt-4**, **turbo-2024-04-09** | **gpt-4o**, **2024-05-13** | **gpt-4-32k**, **0613** | **gpt-35-turbo**, **1106** | **gpt-35-turbo**, **0125** |
 |:-------------------|:-------------------:|:---------------------------:|:---------------------------:|:-------------------------------:|:--------------------------:|:-----------------------:|:--------------------------:|:--------------------------:|
-| australiaeast |✅|✅|✅|✅| - |✅|✅|✅|
+| australiaeast |✅|✅|✅|✅|✅|✅|✅|✅|
 | brazilsouth |✅|✅|✅| - | - |✅|✅| - |
 | canadacentral |✅| - | - | - | - |✅| - |✅|
-| canadaeast |✅|✅| - |✅| - | - |✅| - |
+| canadaeast |✅|✅| - |✅|✅| - |✅| - |
 | eastus |✅|✅|✅|✅| - |✅|✅|✅|
 | eastus2 |✅|✅|✅|✅| - |✅|✅|✅|
 | francecentral |✅|✅|✅| - | - |✅| - |✅|

articles/ai-services/openai/whats-new.md

Lines changed: 2 additions & 2 deletions
@@ -10,7 +10,7 @@ ms.custom:
   - ignite-2023
   - references_regions
 ms.topic: whats-new
-ms.date: 06/13/2024
+ms.date: 06/18/2024
 recommendations: false
 ---

@@ -28,7 +28,7 @@ This article provides a summary of the latest releases and major documentation u

 * GPT-4o is now also available in:
     - Sweden Central for standard regional deployment.
-    - Japan East, Korea Central, Sweden Central, Switzerland North, & West US 3 for provisioned deployment.
+    - Australia East, Canada East, Japan East, Korea Central, Sweden Central, Switzerland North, & West US 3 for provisioned deployment.

 For the latest information on model availability, see the [models page](./concepts/models.md).


0 commit comments
