
Commit dbc361c

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into as-monitor
2 parents: 45a1c20 + 385cc78


329 files changed: +3674, -2383 lines changed


.openpublishing.redirection.json

Lines changed: 0 additions & 5 deletions
@@ -10645,11 +10645,6 @@
       "redirect_url": "/azure/orbital/overview",
       "redirect_document_id": false
     },
-    {
-      "source_path_from_root": "/articles/load-balancer/cross-region-overview.md",
-      "redirect_url": "/azure/reliability/reliability-load-balancer",
-      "redirect_document_id": false
-    },
     {
       "source_path_from_root": "/articles/load-balancer/load-balancer-standard-availability-zones.md",
      "redirect_url": "/azure/reliability/reliability-load-balancer",

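For reference, every entry in this redirection file follows the shape visible in the hunk above. A minimal sketch of one entry is shown below; the path and URL values are illustrative placeholders, not taken from this commit:

  {
    "source_path_from_root": "/articles/<service>/<retired-article>.md",
    "redirect_url": "/azure/<service>/<target-article>",
    "redirect_document_id": false
  }
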
articles/active-directory-b2c/custom-policies-series-validate-user-input.md

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ Azure Active Directory B2C (Azure AD B2C) custom policy not only allows you to m
 
 ## Step 1 - Validate user input by limiting user input options
 
-If you know all the possible values that a user can enter for a given input, you can provide a finite set of values that a user must select from. You can use *DropdownSinglSelect*, *CheckboxMultiSelect*, and *RadioSingleSelect* [UserInputType](claimsschema.md#userinputtype) for this purpose. In this article, you'll use a *RadioSingleSelect* input type:
+If you know all the possible values that a user can enter for a given input, you can provide a finite set of values that a user must select from. You can use *DropdownSingleSelect*, *CheckboxMultiSelect*, and *RadioSingleSelect* [UserInputType](claimsschema.md#userinputtype) for this purpose. In this article, you'll use a *RadioSingleSelect* input type:
 
 1. In VS Code, open the file `ContosoCustomPolicy.XML`.

articles/active-directory-b2c/find-help-open-support-ticket.md

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@ If you're unable to find answers by using self-help resources, you can open an o
 
 1. Sign in to the [Azure portal](https://portal.azure.com).
 
-1. If you have access to multiple tenants, select the **Settings** icon in the top menu to switch to your Azure AD B2C tenant from the **Directories + subscriptions** menu.
+1. If you have access to multiple tenants, select the **Settings** icon in the top menu to switch to your Microsoft Entra tenant from the **Directories + subscriptions** menu. Currently, you can't submit support cases directly from your Azure AD B2C tenant.
 
 1. In the Azure portal, search for and select **Microsoft Entra ID**.

articles/active-directory-b2c/session-behavior.md

Lines changed: 3 additions & 0 deletions
@@ -25,6 +25,9 @@ With single sign-on, users sign in once with a single account and get access to
 
 When the user initially signs in to an application, Azure AD B2C persists a cookie-based session. Upon subsequent authentication requests, Azure AD B2C reads and validates the cookie-based session, and issues an access token without prompting the user to sign in again. If the cookie-based session expires or becomes invalid, the user is prompted to sign-in again.
 
+>[!NOTE]
+>If the user uses a browser that blocks third-party cookies, there are limitations with SSO due to limited access to the cookie-based session. The most user-visible impact is that there are more interactions required for sign-in. Additionally, the front channel sign-out doesn't immediately clear authentication state from federated applications. Check our recommended ways about [how to handle third-party cookie blocking in browsers](/entra/identity-platform/reference-third-party-cookies-spas).
+
 ## Prerequisites
 
 [!INCLUDE [active-directory-b2c-customization-prerequisites](../../includes/active-directory-b2c-customization-prerequisites.md)]

articles/ai-services/document-intelligence/concept-read.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ ms.author: lajanuar
 
 > [!NOTE]
 >
-> For extracting text from external images like labels, street signs, and posters, use the [Azure AI Vision v4.0 preview Read](../../ai-services/Computer-vision/concept-ocr.md) feature optimized for general, non-document images with a performance-enhanced synchronous API that makes it easier to embed OCR in your user experience scenarios.
+> For extracting text from external images like labels, street signs, and posters, use the [Azure AI Image Analysis v4.0 Read](../../ai-services/Computer-vision/concept-ocr.md) feature optimized for general, non-document images with a performance-enhanced synchronous API that makes it easier to embed OCR in your user experience scenarios.
 >
 
 Document Intelligence Read Optical Character Recognition (OCR) model runs at a higher resolution than Azure AI Vision Read and extracts print and handwritten text from PDF documents and scanned images. It also includes support for extracting text from Microsoft Word, Excel, PowerPoint, and HTML documents. It detects paragraphs, text lines, words, locations, and languages. The Read model is the underlying OCR engine for other Document Intelligence prebuilt models like Layout, General Document, Invoice, Receipt, Identity (ID) document, Health insurance card, W2 in addition to custom models.
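
As a quick orientation to the Read model described in that paragraph, a minimal sketch of an analyze request body follows. It assumes the prebuilt-read analyze operation (for example, POST {endpoint}/formrecognizer/documentModels/prebuilt-read:analyze?api-version=2023-07-31 with an Ocp-Apim-Subscription-Key header); the endpoint shape, API version, and sample URL are assumptions for illustration and aren't part of this commit. The asynchronous result then carries the extracted paragraphs, lines, words, and detected languages noted above.

  {
    "urlSource": "https://contoso.example.com/documents/sample-scanned-letter.pdf"
  }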

articles/ai-services/document-intelligence/faq.yml

Lines changed: 1 addition & 1 deletion
@@ -426,7 +426,7 @@ sections:
       - question: |
           How can I move my trained models from one environment (like beta) to another (like production)?
         answer: |
-          The Copy API enables this scenario by allowing you to copy custom models from one Document Intelligence account or into others, which can exist in any supported geographical region. Follow [this document](disaster-recovery.md) for detailed instructions. The copy operation is limited to copying models within the specific cloud environment the model was trained in. For instance, copying models from the public cloud to the Azure Government clod isn't supported.
+          The Copy API enables this scenario by allowing you to copy custom models from one Document Intelligence account or into others, which can exist in any supported geographical region. Follow [this document](disaster-recovery.md) for detailed instructions. The copy operation is limited to copying models within the specific cloud environment the model was trained in. For instance, copying models from the public cloud to the Azure Government cloud isn't supported.
 
       - question: |
           Why was I charged for Layout when running custom training?
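
For context on the Copy API referenced in that answer, here's a minimal sketch of the request body used to ask the target resource for a copy authorization, assuming the Document Intelligence (Form Recognizer) v3.x REST flow in which the target account issues an authorization that is then posted to the source model's copyTo operation. The operation names, API version, and field names are recalled for illustration only and should be verified against the linked disaster-recovery.md article:

  {
    "modelId": "my-copied-model",
    "description": "Copy of a custom model into the production resource"
  }

Under that assumed flow, the authorization object returned by the target resource is then submitted as the body of the source model's copyTo request.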

articles/ai-services/language-service/native-document-support/use-native-documents.md

Lines changed: 9 additions & 9 deletions
@@ -28,15 +28,15 @@ ms.author: lajanuar
 
 Azure AI Language is a cloud-based service that applies Natural Language Processing (NLP) features to text-based data. The native document support capability enables you to send API requests asynchronously, using an HTTP POST request body to send your data and HTTP GET request query string to retrieve the processed data.
 
-A native document refers to the file format used to create the original document such as Microsoft Word (docx) or a portable document file (pdf). Native document support eliminates the need for text preprocessing prior to using Azure AI Language resource capabilities. Currently, native document support is available for the following capabilities:
+A native document refers to the file format used to create the original document such as Microsoft Word (docx) or a portable document file (pdf). Native document support eliminates the need for text preprocessing before using Azure AI Language resource capabilities. Currently, native document support is available for the following capabilities:
 
 * [Personally Identifiable Information (PII)](../personally-identifiable-information/overview.md). The PII detection feature can identify, categorize, and redact sensitive information in unstructured text. The `PiiEntityRecognition` API supports native document processing.
 
 * [Document summarization](../summarization/overview.md). Document summarization uses natural language processing to generate extractive (salient sentence extraction) or abstractive (contextual word extraction) summaries for documents. Both `AbstractiveSummarization` and `ExtractiveSummarization` APIs support native document processing.
 
 ## Supported document formats
 
-Applications use native file formats to create, save, or open native documents. Currently **PII** and **Document summarization** capabilities supports the following native document formats:
+Applications use native file formats to create, save, or open native documents. Currently **PII** and **Document summarization** capabilities supports the following native document formats:
 
 |File type|File extension|Description|
 |---------|--------------|-----------|
@@ -69,7 +69,7 @@ A native document refers to the file format used to create the original document
 
 > [!NOTE]
 > The cURL package is pre-installed on most Windows 10 and Windows 11 and most macOS and Linux distributions. You can check the package version with the following commands:
-> Windows: `curl.exe -V`.
+> Windows: `curl.exe -V`
 > macOS `curl -V`
 > Linux: `curl --version`
 
@@ -78,7 +78,7 @@ A native document refers to the file format used to create the original document
 * [Windows](https://curl.haxx.se/windows/).
 * [Mac or Linux](https://learn2torials.com/thread/how-to-install-curl-on-mac-or-linux-(ubuntu)-or-windows).
 
-* An active [**Azure account**](https://azure.microsoft.com/free/cognitive-services/). If you don't have one, you can [**create a free account**](https://azure.microsoft.com/free/).
+* An active [**Azure account**](https://azure.microsoft.com/free/cognitive-services/). If you don't have one, you can [**create a free account**](https://azure.microsoft.com/free/).
 
 * An [**Azure Blob Storage account**](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM). You also need to [create containers](#create-azure-blob-storage-containers) in your Azure Blob Storage account for your source and target files:
 
@@ -128,7 +128,7 @@ Your Language resource needs granted access to your storage account before it ca
 
 * [**Shared access signature (SAS) tokens**](shared-access-signatures.md). User delegation SAS tokens are secured with Microsoft Entra credentials. SAS tokens provide secure, delegated access to resources in your Azure storage account.
 
-* [**Managed identity role-based access control (RBAC)**](managed-identities.md). Managed identities for Azure resources are service principals that create a Microsoft Entra identity and specific permissions for Azure managed resources
+* [**Managed identity role-based access control (RBAC)**](managed-identities.md). Managed identities for Azure resources are service principals that create a Microsoft Entra identity and specific permissions for Azure managed resources.
 
 For this project, we authenticate access to the `source location` and `target location` URLs with Shared Access Signature (SAS) tokens appended as query strings. Each token is assigned to a specific blob (file).
 
@@ -177,7 +177,7 @@ For this quickstart, you need a **source document** uploaded to your **source co
           "language": "en-US",
           "id": "Output-excel-file",
          "source": {
-            "location": "{your-source-container-with-SAS-URL}"
+            "location": "{your-source-blob-with-SAS-URL}"
          },
          "target": {
            "location": "{your-target-container-with-SAS-URL}"
@@ -189,8 +189,8 @@ For this quickstart, you need a **source document** uploaded to your **source co
       {
         "kind": "PiiEntityRecognition",
         "parameters":{
-          "excludePiiCategoriesredac" : ["PersonType", "Category2", "Category3"],
-          "redactionPolicy": "UseEntityTypeName"
+          "excludePiiCategories" : ["PersonType", "Category2", "Category3"],
+          "redactionPolicy": "UseRedactionCharacterWithRefId"
        }
      }
    ]
@@ -344,7 +344,7 @@ For this project, you need a **source document** uploaded to your **source conta
     "documents":[
       {
         "source":{
-          "location":"{your-source-container-SAS-URL}"
+          "location":"{your-source-blob-SAS-URL}"
        },
        "targets":
        {
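
Pulling the fragments from the two PII hunks above into one place, here's a minimal sketch of what the corrected analyze-documents request body looks like after this commit. The `analysisInput`/`tasks` wrapper names and overall nesting aren't shown in this diff and are assumed from the Language service quickstart, so confirm them against use-native-documents.md; the `{...SAS-URL}` values are the article's own placeholders.

  {
    "analysisInput": {
      "documents": [
        {
          "language": "en-US",
          "id": "Output-excel-file",
          "source": {
            "location": "{your-source-blob-with-SAS-URL}"
          },
          "target": {
            "location": "{your-target-container-with-SAS-URL}"
          }
        }
      ]
    },
    "tasks": [
      {
        "kind": "PiiEntityRecognition",
        "parameters": {
          "excludePiiCategories": ["PersonType", "Category2", "Category3"],
          "redactionPolicy": "UseRedactionCharacterWithRefId"
        }
      }
    ]
  }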

articles/ai-services/language-service/question-answering/how-to/azure-openai-integration.md

Lines changed: 2 additions & 2 deletions
@@ -6,7 +6,7 @@ ms.service: azure-ai-language
 author: jboback
 ms.author: jboback
 ms.topic: how-to
-ms.date: 12/19/2023
+ms.date: 02/09/2024
 ---
 
 # Connect Custom Question Answering with Azure OpenAI on your data
@@ -72,7 +72,7 @@ At the same time, customers often require a custom answer authoring experience t
 
 :::image type="content" source="../media/question-answering/chat-playground.png" alt-text="A screenshot of the playground page of the Azure OpenAI Studio with sections highlighted." lightbox="../media/question-answering/chat-playground.png":::
 
-You can now start exploring Azure OpenAI capabilities with a no-code approach through the chat playground. It's simply a text box where you can submit a prompt to generate a completion. From this page, you can quickly iterate and experiment with the capabilities. You can also launch a [web app](../../..//openai/concepts/use-your-data.md#using-the-web-app) to chat with the model over the web.
+You can now start exploring Azure OpenAI capabilities with a no-code approach through the chat playground. It's simply a text box where you can submit a prompt to generate a completion. From this page, you can quickly iterate and experiment with the capabilities. You can also launch a [web app](../../../openai/how-to/use-web-app.md) to chat with the model over the web.
 
 ## Next steps
 * [Using Azure OpenAI on your data](../../../openai/concepts/use-your-data.md)
