Commit 808855b

Merge branch 'main' into rel-comm-train
2 parents: 6088354 + 811a5f1
File tree

1,159 files changed: 13,964 additions and 10,339 deletions


.openpublishing.publish.config.json

Lines changed: 2 additions & 1 deletion
@@ -1257,6 +1257,7 @@
     "articles/stream-analytics/.openpublishing.redirection.stream-analytics.json",
     "articles/synapse-analytics/.openpublishing.redirection.synapse-analytics.json",
     "articles/virtual-machine-scale-sets/.openpublishing.redirection.virtual-machine-scale-sets.json",
-    "articles/virtual-machines/.openpublishing.redirection.virtual-machines.json"
+    "articles/virtual-machines/.openpublishing.redirection.virtual-machines.json",
+    "articles/operator-nexus/.openpublishing.redirection.operator-nexus.json"
   ]
 }
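As the hunk above shows, appending to a JSON array means the former last element gains a trailing comma, and JSON forbids a comma after the new last element. A minimal sketch (the two-element excerpt here is hypothetical, mirroring the edit) that would catch such a mistake at parse time:

```python
import json

# Hypothetical excerpt mirroring the edited redirection_files array.
config_text = """
{
  "redirection_files": [
    "articles/virtual-machines/.openpublishing.redirection.virtual-machines.json",
    "articles/operator-nexus/.openpublishing.redirection.operator-nexus.json"
  ]
}
"""

# json.loads raises json.JSONDecodeError if a stray trailing comma is left behind.
config = json.loads(config_text)
print(len(config["redirection_files"]))  # 2
```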

.openpublishing.redirection.azure-monitor.json

Lines changed: 5 additions & 0 deletions
@@ -6463,6 +6463,11 @@
       "source_path_from_root": "/articles/azure-monitor/logs/resource-expression.md",
       "redirect_url": "/azure/azure-monitor/logs/cross-workspace-query",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/azure-monitor/vm/vminsights-configure-workspace.md",
+      "redirect_url": "/azure/azure-monitor/vm/vminsights-enable-overview",
+      "redirect_document_id": false
     }
   ]
 }

.openpublishing.redirection.json

Lines changed: 15 additions & 0 deletions
@@ -9622,6 +9622,11 @@
       "redirect_url": "/azure/azure-functions/functions-reference-python?pivots=python-mode-decorators#triggers-and-inputs",
       "redirect_document_id": false
     },
+    {
+      "source_path_from_root": "/articles/azure-functions/update-java-versions.md",
+      "redirect_url": "/azure/azure-functions/update-language-versions",
+      "redirect_document_id": false
+    },
     {
       "source_path_from_root": "/articles/azure-government/documentation-government-k8.md",
       "redirect_url": "/azure/azure-government",
@@ -23447,6 +23452,11 @@
       "redirect_url": "/azure/lighthouse/concepts/architecture",
       "redirect_document_id": true
     },
+    {
+      "source_path_from_root": "/articles/lighthouse/how-to/partner-earned-credit.md",
+      "redirect_url": "/azure/cost-management-billing/manage/link-partner-id",
+      "redirect_document_id": false
+    },
     {
       "source_path_from_root": "/articles/service-fabric-mesh/index.yml",
       "redirect_url": "/previous-versions/azure/service-fabric-mesh/service-fabric-mesh-overview",
@@ -25737,6 +25747,11 @@
       "redirect_url": "/azure/update-manager/workbooks",
       "redirect_document_id": false
     },
+    {
+      "source_path": "articles/update-manager/whats-upcoming.md",
+      "redirect_url": "/azure/update-manager/whats-new",
+      "redirect_document_id": false
+    },
     {
       "source_path_from_root": "/articles/orbital/delete-contact.md",
       "redirect_url": "/azure/orbital/spacecraft-object",

.openpublishing.redirection.virtual-desktop.json

Lines changed: 60 additions & 0 deletions
@@ -239,6 +239,66 @@
       "source_path_from_root": "/articles/virtual-desktop/troubleshoot-statuses-checks.md",
       "redirect_url": "/azure/virtual-desktop/session-host-status-health-checks",
       "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/what-is-app-attach.md",
+      "redirect_url": "/azure/virtual-desktop/app-attach-overview",
+      "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/app-attach-faq.yml",
+      "redirect_url": "/azure/virtual-desktop/app-attach-overview",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/app-attach.md",
+      "redirect_url": "/azure/virtual-desktop/app-attach-test-msix-packages",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/app-attach-file-share.md",
+      "redirect_url": "/azure/virtual-desktop/app-attach-overview",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/create-netapp-files.md",
+      "redirect_url": "/azure/virtual-desktop/app-attach-overview",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/app-attach-azure-portal.md",
+      "redirect_url": "/azure/virtual-desktop/app-attach-setup",
+      "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/app-attach-powershell.md",
+      "redirect_url": "/azure/virtual-desktop/app-attach-setup",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/msix-app-attach-create-msix-image.md",
+      "redirect_url": "/azure/virtual-desktop/app-attach-create-msix-image",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/manage-app-groups.md",
+      "redirect_url": "/azure/virtual-desktop/publish-applications",
+      "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/manage-app-groups-powershell.md",
+      "redirect_url": "/azure/virtual-desktop/publish-applications",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/publish-apps.md",
+      "redirect_url": "/azure/virtual-desktop/publish-applications",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/sandbox.md",
+      "redirect_url": "/azure/virtual-desktop/publish-applications",
+      "redirect_document_id": false
     }
   ]
 }
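Many of the added redirects fan in to a single consolidated page (for example `app-attach-overview` and `publish-applications`). A small, hypothetical grouping pass over a sample of the pairs above makes the consolidation visible:

```python
from collections import defaultdict

# (source, target) pairs sampled from the hunk above.
pairs = [
    ("/articles/virtual-desktop/what-is-app-attach.md", "/azure/virtual-desktop/app-attach-overview"),
    ("/articles/virtual-desktop/app-attach-faq.yml", "/azure/virtual-desktop/app-attach-overview"),
    ("/articles/virtual-desktop/publish-apps.md", "/azure/virtual-desktop/publish-applications"),
    ("/articles/virtual-desktop/sandbox.md", "/azure/virtual-desktop/publish-applications"),
]

by_target = defaultdict(list)
for source, target in pairs:
    by_target[target].append(source)

for target, sources in sorted(by_target.items()):
    print(f"{target}: {len(sources)} consolidated pages")
```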

articles/active-directory-b2c/configure-tokens.md

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@ You can configure the token lifetime, including:
 
 - **Access and ID token lifetimes (minutes)** - The lifetime of the OAuth 2.0 bearer token and ID tokens. The default is 60 minutes (1 hour). The minimum (inclusive) is 5 minutes. The maximum (inclusive) is 1,440 minutes (24 hours).
 - **Refresh token lifetime (days)** - The maximum time period before which a refresh token can be used to acquire a new access token, if your application had been granted the `offline_access` scope. The default is 14 days. The minimum (inclusive) is one day. The maximum (inclusive) 90 days.
-- **Refresh token sliding window lifetime** - The refresh token sliding window type. `Bounded` indicates that the refresh token can be extended as specify in the **Lifetime length (days)**. `No expiry` indicates that the refresh token sliding window lifetime never expires.
+- **Refresh token sliding window lifetime** - The refresh token sliding window type. `Bounded` indicates that the refresh token can be extended as specified in the **Lifetime length (days)**. `No expiry` indicates that the refresh token sliding window lifetime never expires.
 - **Lifetime length (days)** - After this time period elapses the user is forced to reauthenticate, irrespective of the validity period of the most recent refresh token acquired by the application. The value must be greater than or equal to the **Refresh token lifetime** value.
 
 The following diagram shows the refresh token sliding window lifetime behavior.
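The bounded sliding-window behavior described in the changed bullet can be sketched as date arithmetic over the stated defaults (a 14-day refresh token lifetime; a 90-day window is assumed here for illustration). This is an illustrative model of the documented rule, not Azure AD B2C's implementation:

```python
from datetime import datetime, timedelta

def must_reauthenticate(first_sign_in, refresh_issued, now,
                        refresh_lifetime_days=14, window_days=90):
    """Bounded sliding window, as described above: reauthentication is
    forced when the most recent refresh token has expired, or when the
    window measured from the first sign-in has elapsed, whichever
    comes first."""
    refresh_expired = now > refresh_issued + timedelta(days=refresh_lifetime_days)
    window_elapsed = now > first_sign_in + timedelta(days=window_days)
    return refresh_expired or window_elapsed

start = datetime(2024, 1, 1)
# Day 10, refresh token issued on day 5: both limits still satisfied.
print(must_reauthenticate(start, start + timedelta(days=5), start + timedelta(days=10)))   # False
# Day 95: the 90-day window has elapsed even with a fresh refresh token.
print(must_reauthenticate(start, start + timedelta(days=94), start + timedelta(days=95)))  # True
```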

articles/ai-services/computer-vision/concept-describe-images-40.md

Lines changed: 3 additions & 3 deletions
@@ -13,8 +13,8 @@ ms.author: pafarley
 ms.custom: seodec18, ignite-2022, references_regions
 ---
 
-# Image captions (version 4.0 preview)
-Image captions in Image Analysis 4.0 (preview) are available through the **Caption** and **Dense Captions** features.
+# Image captions (version 4.0)
+Image captions in Image Analysis 4.0 are available through the **Caption** and **Dense Captions** features.
 
 Caption generates a one sentence description for all image contents. Dense Captions provides more detail by generating one sentence descriptions of up to 10 regions of the image in addition to describing the whole image. Dense Captions also returns bounding box coordinates of the described image regions. Both these features use the latest groundbreaking Florence based AI models.
 
@@ -122,7 +122,7 @@ The following JSON response illustrates what the Analysis 4.0 API returns when g
       }
     ]
   },
-  "modelVersion": "2023-02-01-preview",
+  "modelVersion": "2023-10-01",
   "metadata": {
     "width": 850,
     "height": 567

articles/ai-services/computer-vision/concept-object-detection-40.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ ms.author: pafarley
 ms.custom: seodec18, ignite-2022
 ---
 
-# Object detection (version 4.0 preview)
+# Object detection (version 4.0)
 
 Object detection is similar to [tagging](concept-tag-images-40.md), but the API returns the bounding box coordinates (in pixels) for each object found in the image. For example, if an image contains a dog, cat and person, the object detection operation will list those objects with their coordinates in the image. You can use this functionality to process the relationships between the objects in an image. It also lets you determine whether there are multiple instances of the same object in an image.
 

articles/ai-services/computer-vision/concept-ocr.md

Lines changed: 2 additions & 2 deletions
@@ -12,15 +12,15 @@ ms.date: 07/04/2023
 ms.author: pafarley
 ---
 
-# OCR for images (version 4.0 preview)
+# OCR for images (version 4.0)
 
 > [!NOTE]
 >
 > For extracting text from PDF, Office, and HTML documents and document images, use the [Document Intelligence Read OCR model](../../ai-services/document-intelligence/concept-read.md) optimized for text-heavy digital and scanned documents with an asynchronous API that makes it easy to power your intelligent document processing scenarios.
 
 OCR traditionally started as a machine-learning-based technique for extracting text from in-the-wild and non-document images like product labels, user-generated images, screenshots, street signs, and posters. For several scenarios, such as single images that aren't text-heavy, you need a fast, synchronous API or service. This allows OCR to be embedded in near real-time user experiences to enrich content understanding and follow-up user actions with fast turn-around times.
 
-## What is Computer Vision v4.0 Read OCR (preview)?
+## What is Computer Vision v4.0 Read OCR?
 
 The new Computer Vision Image Analysis 4.0 REST API offers the ability to extract printed or handwritten text from images in a unified performance-enhanced synchronous API that makes it easy to get all image insights including OCR results in a single API operation. The Read OCR engine is built on top of multiple deep learning models supported by universal script-based models for [global language support](./language-support.md).
 
articles/ai-services/computer-vision/concept-people-detection.md

Lines changed: 2 additions & 2 deletions
@@ -13,7 +13,7 @@ ms.date: 09/12/2022
 ms.author: pafarley
 ---
 
-# People detection (version 4.0 preview)
+# People detection (version 4.0)
 
 Version 4.0 of Image Analysis offers the ability to detect people appearing in images. The bounding box coordinates of each detected person are returned, along with a confidence score.
 
@@ -28,7 +28,7 @@ The following JSON response illustrates what the Analysis 4.0 API returns when d
 
 ```json
 {
-  "modelVersion": "2023-02-01-preview",
+  "modelVersion": "2023-10-01",
   "metadata": {
     "width": 300,
     "height": 231

articles/ai-services/computer-vision/concept-tag-images-40.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ ms.author: pafarley
 ms.custom: seodec18, ignite-2022
 ---
 
-# Image tagging (version 4.0 preview)
+# Image tagging (version 4.0)
 
 Image Analysis can return content tags for thousands of recognizable objects, living beings, scenery, and actions that appear in images. Tagging is not limited to the main subject, such as a person in the foreground, but also includes the setting (indoor or outdoor), furniture, tools, plants, animals, accessories, gadgets, and so on. Tags are not organized as a taxonomy and do not have inheritance hierarchies. When tags are ambiguous or not common knowledge, the API response provides hints to clarify the meaning of the tag in context of a known setting.
 
