Commit 3c71e39

Author: Joey Chen
Merge branch 'MicrosoftDocs:main' into patch-3
2 parents 07b4810 + 682e09e commit 3c71e39

78 files changed: +563 −674 lines changed


articles/advisor/advisor-reference-cost-recommendations.md
Lines changed: 9 additions & 0 deletions

@@ -261,8 +261,17 @@ We noticed that your virtual network gateway has been idle for over 90 days. Thi
 
 Learn more about [Virtual network gateway - IdleVNetGateway (Repurpose or delete idle virtual network gateways)](https://aka.ms/aa_idlevpngateway_learnmore).
 
+### Consider migrating to Front Door Standard/Premium
+
+Your Front Door Classic tier contains a large number of domains or routing rules, which adds extra charges. Front Door Standard and Premium don't charge per additional domain or routing rule. Consider migrating to save costs.
+
+Learn more about [Front Door pricing](https://aka.ms/afd-pricing).
+
+### Consider using multiple endpoints under a single Front Door Standard/Premium profile
+
+We detected that your subscription contains multiple Front Door Standard/Premium profiles, each with a small number of endpoints. You can save on base fees by using multiple endpoints within one profile. You can use a maximum of 10 endpoints with the Standard tier and 25 endpoints with the Premium tier.
+
+Learn more about [Front Door endpoints](https://aka.ms/afd-endpoints).
 
 ## Reserved instances
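The endpoint-consolidation recommendation above is essentially a capacity check against the documented per-profile limits (10 endpoints on the Standard tier, 25 on Premium). A minimal sketch, using hypothetical profile names and endpoint counts:

```python
# Per-profile endpoint limits documented for Azure Front Door tiers.
ENDPOINT_LIMITS = {"Standard": 10, "Premium": 25}

def can_consolidate(profiles: dict, target_tier: str) -> bool:
    """Return True if the endpoints of all given profiles fit in a single
    profile of the target tier. `profiles` maps profile name -> endpoint count."""
    total = sum(profiles.values())
    return total <= ENDPOINT_LIMITS[target_tier]

# Hypothetical subscription: three Standard profiles with few endpoints each.
profiles = {"afd-web": 2, "afd-api": 3, "afd-static": 1}
print(can_consolidate(profiles, "Standard"))  # 6 endpoints fit in one Standard profile
print(can_consolidate(profiles, "Premium"))
```

This only models the tier limits stated above; actual pricing and migration steps are covered by the linked pages.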

articles/ai-services/document-intelligence/concept-retrieval-augumented-generation.md
Lines changed: 10 additions & 13 deletions

@@ -19,9 +19,9 @@ monikerRange: '>=doc-intel-3.1.0'
 
 ## Introduction
 
-Retrieval-Augmented Generation (RAG) is a document generative AI solution that combines a pretrained Large Language Model (LLM) like ChatGPT with an external data retrieval system to generate an enhanced response incorporating new data outside of the original training data. Adding an information retrieval system to your applications enables you to chat with your documents, generate captivating content, and access the power of Azure OpenAI models for your data. You also have more control over the data used by the LLM as it formulates a response.
+Retrieval-Augmented Generation (RAG) is a design pattern that combines a pretrained Large Language Model (LLM) like ChatGPT with an external data retrieval system to generate an enhanced response incorporating new data outside of the original training data. Adding an information retrieval system to your applications enables you to chat with your documents, generate captivating content, and access the power of Azure OpenAI models for your data. You also have more control over the data used by the LLM as it formulates a response.
 
-The Document Intelligence [Layout model](concept-layout.md) is an advanced machine-learning based document analysis API. With semantic chunking, the Layout model offers a comprehensive solution for advanced content extraction and document structure analysis capabilities. With the Layout model, you can easily extract text and structure to divide large bodies of text into smaller, meaningful chunks based on semantic content rather than arbitrary splits. The extracted information can be conveniently outputted to Markdown format, enabling you to define your semantic chunking strategy based on the provided building blocks.
+The Document Intelligence [Layout model](concept-layout.md) is an advanced machine-learning based document analysis API. The Layout model offers a comprehensive solution for advanced content extraction and document structure analysis capabilities. With the Layout model, you can easily extract text and structure to divide large bodies of text into smaller, meaningful chunks based on semantic content rather than arbitrary splits. The extracted information can be conveniently outputted to Markdown format, enabling you to define your semantic chunking strategy based on the provided building blocks.
 
 :::image type="content" source="media/rag/azure-rag-processing.png" alt-text="Screenshot depicting semantic chunking with RAG using Azure AI Document Intelligence.":::
 
@@ -47,7 +47,7 @@ Markdown is a structured and formatted markup language and a popular input for e
 
 * **Large learning model (LLM) compatibility**. The Layout model Markdown formatted output is LLM friendly and facilitates seamless integration into your workflows. You can turn any table in a document into Markdown format and avoid extensive effort parsing the documents for greater LLM understanding.
 
-**Text image processed with Document Intelligence Studio using Layout model**
+**Text image processed with Document Intelligence Studio and output to markdown using Layout model**
 
 :::image type="content" source="media/rag/markdown-text-output.png" alt-text="Screenshot of newspaper article processed by Layout model and outputted to Markdown.":::
 
@@ -103,13 +103,15 @@ You can follow the [Document Intelligence Studio quickstart](quickstarts/try-doc
 
 * [Java](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/documentintelligence/azure-ai-documentintelligence/src/samples/java/com/azure/ai/documentintelligence/AnalyzeLayoutMarkdownOutput.java)
 
+* [.NET](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/documentintelligence/Azure.AI.DocumentIntelligence/samples/Sample_ExtractLayout.md)
+
 ## Build document chat with semantic chunking
 
 * [Azure OpenAI on your data](../openai/concepts/use-your-data.md) enables you to run supported chat on your documents. Azure OpenAI on your data applies the Document Intelligence Layout model to extract and parse document data by chunking long text based on tables and paragraphs. You can also customize your chunking strategy using [Azure OpenAI sample scripts](https://github.com/microsoft/sample-app-aoai-chatGPT/tree/main/scripts) located in our GitHub repo.
 
-* Azure AI Document Intelligence is now integrated with [LangChain](https://python.langchain.com/docs/integrations/document_loaders/azure_document_intelligence) as one of its document loaders. You can use it to easily load the data and output to Markdown format. This [notebook](https://microsoft.github.io/SynapseML/docs/Explore%20Algorithms/AI%20Services/Quickstart%20-%20Document%20Question%20and%20Answering%20with%20PDFs/) shows a simple demo for RAG pattern with Azure AI Document Intelligence as document loader and Azure Search as retriever in LangChain.
+* Azure AI Document Intelligence is now integrated with [LangChain](https://python.langchain.com/docs/integrations/document_loaders/azure_document_intelligence) as one of its document loaders. You can use it to easily load the data and output to Markdown format. This [notebook](https://github.com/microsoft/Form-Recognizer-Toolkit/blob/main/SampleCode/Python/sample_rag_langchain.ipynb) shows a simple demo for RAG pattern with Azure AI Document Intelligence as document loader and Azure Search as retriever in LangChain.
 
-* The chat with your data solution accelerator[code sample](https://github.com/Azure-Samples/chat-with-your-data-solution-accelerator) demonstrates an end-to-end baseline RAG pattern sample. It uses Azure AI Search as a retriever and Azure AI Document Intelligence for document loading and semantic chunking.
+* The chat with your data solution accelerator [code sample](https://github.com/Azure-Samples/chat-with-your-data-solution-accelerator) demonstrates an end-to-end baseline RAG pattern sample. It uses Azure AI Search as a retriever and Azure AI Document Intelligence for document loading and semantic chunking.
 
 ## Use case
 
@@ -122,20 +124,15 @@ If you're looking for a specific section in a document, you can use semantic chu
 # pip install langchain langchain-community azure-ai-documentintelligence
 
 from azure.ai.documentintelligence import DocumentIntelligenceClient
-from azure.core.credentials import AzureKeyCredential
 
-endpoint = "https://<my-custom-subdomain>.cognitiveservices.azure.com/"
-credential = AzureKeyCredential("<api_key>")
+endpoint = "https://<my-custom-subdomain>.cognitiveservices.azure.com/"
+key = "<api_key>"
 
-document_intelligence_client = DocumentIntelligenceClient(
-    endpoint, credential)
-
 from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader
-
 from langchain.text_splitter import MarkdownHeaderTextSplitter
 
 # Initiate Azure AI Document Intelligence to load the document. You can either specify file_path or url_path to load the document.
-loader = AzureAIDocumentIntelligenceLoader(file_path="<path to your file>", api_key = doc_intelligence_key, api_endpoint = doc_intelligence_endpoint, api_model="prebuilt-layout")
+loader = AzureAIDocumentIntelligenceLoader(file_path="<path to your file>", api_key = key, api_endpoint = endpoint, api_model="prebuilt-layout")
 docs = loader.load()
 
 # Split the document into chunks based on markdown headers.
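The snippet above ends right where the Markdown output gets split on headers (LangChain's `MarkdownHeaderTextSplitter`). A dependency-free sketch of the same idea, splitting Markdown into header-scoped chunks, illustrates what the semantic chunking step produces:

```python
def split_on_headers(markdown: str, levels=("#", "##", "###")):
    """Split Markdown text into chunks, one per heading at the given levels.
    A minimal stand-in for LangChain's MarkdownHeaderTextSplitter."""
    chunks, current = [], {"header": None, "content": []}
    for line in markdown.splitlines():
        stripped = line.strip()
        if any(stripped.startswith(h + " ") for h in levels):
            # New heading starts a new chunk; flush the previous one.
            if current["content"] or current["header"]:
                chunks.append(current)
            current = {"header": stripped, "content": []}
        elif stripped:
            current["content"].append(stripped)
    chunks.append(current)
    return chunks

doc = "# Title\nIntro text.\n## Usage\nStep one.\nStep two.\n"
for chunk in split_on_headers(doc):
    print(chunk["header"], "->", " ".join(chunk["content"]))
```

Each chunk keeps its heading as context, which is what makes header-based splits "semantic" compared with fixed-size splits.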

articles/ai-services/document-intelligence/toc.yml
Lines changed: 1 addition & 0 deletions

@@ -172,6 +172,7 @@ items:
     href: concept-layout.md
 - name: 🆕 Add-on capabilities
   displayName: extract, formula, font, styles, fontStyle, ocr.highResolution, ocr.formula, high resolution, background color, inline, display
+  href: concept-add-on-capabilities.md
 - name: 🆕 Query field extraction
   displayName: queries, fields, OpenAI, chat
   href: concept-query-fields.md

articles/app-service/overview-managed-identity.md
Lines changed: 3 additions & 3 deletions

@@ -432,9 +432,9 @@ The **IDENTITY_ENDPOINT** is a local URL from which your app can request tokens.
 > | resource | Query | The Microsoft Entra resource URI of the resource for which a token should be obtained. This could be one of the [Azure services that support Microsoft Entra authentication](../active-directory/managed-identities-azure-resources/services-support-managed-identities.md#azure-services-that-support-azure-ad-authentication) or any other resource URI. |
 > | api-version | Query | The version of the token API to be used. Use `2019-08-01`. |
 > | X-IDENTITY-HEADER | Header | The value of the IDENTITY_HEADER environment variable. This header is used to help mitigate server-side request forgery (SSRF) attacks. |
-> | client_id | Query | (Optional) The client ID of the user-assigned identity to be used. Cannot be used on a request that includes `principal_id`, `mi_res_id`, or `object_id`. If all ID parameters (`client_id`, `principal_id`, `object_id`, and `mi_res_id`) are omitted, the system-assigned identity is used. |
-> | principal_id | Query | (Optional) The principal ID of the user-assigned identity to be used. `object_id` is an alias that may be used instead. Cannot be used on a request that includes `client_id`, `mi_res_id`, or `object_id`. If all ID parameters (`client_id`, `principal_id`, `object_id`, and `mi_res_id`) are omitted, the system-assigned identity is used. |
-> | mi_res_id | Query | (Optional) The Azure resource ID of the user-assigned identity to be used. Cannot be used on a request that includes `principal_id`, `client_id`, or `object_id`. If all ID parameters (`client_id`, `principal_id`, `object_id`, and `mi_res_id`) are omitted, the system-assigned identity is used. |
+> | client_id | Query | (Optional) The client ID of the user-assigned identity to be used. Cannot be used on a request that includes `principal_id`, `msi_res_id`, or `object_id`. If all ID parameters (`client_id`, `principal_id`, `object_id`, and `msi_res_id`) are omitted, the system-assigned identity is used. |
+> | principal_id | Query | (Optional) The principal ID of the user-assigned identity to be used. `object_id` is an alias that may be used instead. Cannot be used on a request that includes `client_id`, `msi_res_id`, or `object_id`. If all ID parameters (`client_id`, `principal_id`, `object_id`, and `msi_res_id`) are omitted, the system-assigned identity is used. |
+> | msi_res_id | Query | (Optional) The Azure resource ID of the user-assigned identity to be used. Cannot be used on a request that includes `principal_id`, `client_id`, or `object_id`. If all ID parameters (`client_id`, `principal_id`, `object_id`, and `msi_res_id`) are omitted, the system-assigned identity is used. |
 
 > [!IMPORTANT]
 > If you are attempting to obtain tokens for user-assigned identities, you must include one of the optional properties. Otherwise the token service will attempt to obtain a token for a system-assigned identity, which may or may not exist.
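The table above fully specifies the token request an app sends to `IDENTITY_ENDPOINT`. A sketch of assembling that request from the documented parameters (the helper function itself is hypothetical, not part of any SDK):

```python
from urllib.parse import urlencode

def build_token_request(identity_endpoint, identity_header, resource,
                        client_id=None, principal_id=None, msi_res_id=None):
    """Assemble the URL and headers for a managed-identity token request.
    At most one of client_id / principal_id / msi_res_id may be set;
    omitting all three targets the system-assigned identity."""
    ids = {"client_id": client_id, "principal_id": principal_id,
           "msi_res_id": msi_res_id}
    given = {k: v for k, v in ids.items() if v is not None}
    if len(given) > 1:
        raise ValueError("client_id, principal_id, and msi_res_id are mutually exclusive")
    query = {"resource": resource, "api-version": "2019-08-01", **given}
    url = f"{identity_endpoint}?{urlencode(query)}"
    headers = {"X-IDENTITY-HEADER": identity_header}  # SSRF mitigation per the table
    return url, headers

url, headers = build_token_request(
    "http://localhost:42356/msi/token", "abc123",
    resource="https://vault.azure.net",
    client_id="11111111-2222-3333-4444-555555555555")
```

In a real app, `identity_endpoint` and `identity_header` come from the `IDENTITY_ENDPOINT` and `IDENTITY_HEADER` environment variables, and the resulting request is issued with any HTTP client.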

articles/azure-arc/system-center-virtual-machine-manager/disaster-recovery.md
Lines changed: 9 additions & 17 deletions

@@ -3,7 +3,7 @@ title: Recover from accidental deletion of resource bridge VM
 description: Learn how to perform recovery operations for the Azure Arc resource bridge VM in Azure Arc-enabled System Center Virtual Machine Manager disaster scenarios.
 ms.topic: how-to
 ms.custom:
-ms.date: 12/06/2023
+ms.date: 12/28/2023
 ms.services: azure-arc
 ms.subservice: azure-arc-scvmm
 author: Farha-Bano
@@ -27,12 +27,14 @@ To recover from Arc resource bridge VM deletion, you need to deploy a new resour
 > DHCP-based Arc Resource Bridge deployment is no longer supported.<br><br>
 If you had deployed Arc Resource Bridge earlier using DHCP, you must clean up your deployment by removing your resources from Azure and do a [fresh onboarding](./quickstart-connect-system-center-virtual-machine-manager-to-arc.md).
 
-1. Copy the Azure region and resource IDs of the Arc resource bridge, custom location, and SCVMM Azure resources.
+### Recover Arc resource bridge from a Windows machine
 
-2. Download the [onboarding script](/azure/azure-arc/system-center-virtual-machine-manager/quickstart-connect-system-center-virtual-machine-manager-to-arc#download-the-onboarding-script) from the Azure portal and update the following section in the script, using the same information as the original resources in Azure.
+1. Copy the Azure region and resource IDs of the Arc resource bridge, custom location, and SCVMM management server Azure resources.
+
+2. Download [this script](https://download.microsoft.com/download/a/a/8/aa8687e4-1a30-485f-9de4-4f15fc576724/arcvmm-windows-dr.ps1) and update the following section in the script using the same information as the original resources in Azure.
 
    ```powershell
-   $location = <Azure region of the resources>
+   $location = <Azure region of the original Arc resource bridge>
    $applianceSubscriptionId = <subscription-id>
   $applianceResourceGroupName = <resource-group-name>
   $applianceName = <resource-bridge-name>
@@ -45,20 +47,10 @@ If you had deployed Arc Resource Bridge earlier using DHCP, you must clean up yo
   $vmmserverResourceGroupName = <resource-group-name>
  $vmmserverName= <SCVMM-name-in-azure>
  ```
+
+3. Run the updated script from the same location where the config YAML files are stored after the initial onboarding. This is most likely the same folder from where you ran the initial onboarding script, unless the config files were moved later to a different location. [Provide the inputs](quickstart-connect-system-center-virtual-machine-manager-to-arc.md#script-runtime) as prompted.
 
-3. [Run the onboarding script](/azure/azure-arc/system-center-virtual-machine-manager/quickstart-connect-system-center-virtual-machine-manager-to-arc#download-the-onboarding-script) again with the `-Force` parameter.
-
-   ```powershell-interactive
-   ./resource-bridge-onboarding-script.ps1 -Force
-   ```
-
-4. [Provide the inputs](/azure/azure-arc/system-center-virtual-machine-manager/quickstart-connect-system-center-virtual-machine-manager-to-arc#script-runtime) as prompted.
-
-5. In the same machine, run the following scripts, as applicable:
-   - [Download the script](https://download.microsoft.com/download/6/b/4/6b4a5009-fed8-46c2-b22b-b24a4d0a06e3/arcvmm-appliance-dr.ps1) if you're running the script from a Windows machine
-   - [Download the script](https://download.microsoft.com/download/0/5/c/05c2bcb8-87f8-4ead-9757-a87a0759071c/arcvmm-appliance-dr.sh) if you're running the script from a Linux machine
-
-6. Once the script is run successfully, the old Resource Bridge is recovered and the connection is re-established to the existing Azure-enabled SCVMM resources.
+4. Once the script runs successfully, the old resource bridge is recovered, and the connection is re-established to the existing Azure-enabled SCVMM resources.
 
 ## Next steps

articles/azure-monitor/essentials/data-collection-endpoint-overview.md
Lines changed: 2 additions & 2 deletions

@@ -26,8 +26,8 @@ This table describes the components of a data collection endpoint, related regio
 
 | Component | Description | Regionality considerations | Data collection rule configuration |
 |:---|:---|:---|:---|
-| Configuration access endpoint | The endpoint from which Azure Monitor Agent retrieves data collection rules (DCRs).<br>Example: `<unique-dce-identifier>.<regionname>-1.handler.control`. | Same region as the monitored resources. | Set on the **Basics** tab when you create a data collection rule using the portal. |
-| Logs ingestion endpoint | The endpoint that ingests logs into the data ingestion pipeline. Azure Monitor transforms the data and sends it to the defined destination Log Analytics workspace and table based on a DCR ID sent with the collected data.<br>Example: `<unique-dce-identifier>.<regionname>-1.ingest`. | Same region as the destination Log Analytics workspace. | Set on the **Resources** tab when you create a data collection rule using the portal. |
+| Logs ingestion endpoint | The endpoint that ingests logs into the data ingestion pipeline. Azure Monitor transforms the data and sends it to the defined destination Log Analytics workspace and table based on a DCR ID sent with the collected data.<br>Example: `<unique-dce-identifier>.<regionname>-1.ingest`. | Same region as the destination Log Analytics workspace. | Set on the **Basics** tab when you create a data collection rule using the portal. |
+| Configuration access endpoint | The endpoint from which Azure Monitor Agent retrieves data collection rules (DCRs).<br>Example: `<unique-dce-identifier>.<regionname>-1.handler.control`. | Same region as the monitored resources. | Set on the **Resources** tab when you create a data collection rule using the portal. |
 
 ## How to set up data collection endpoints based on your deployment
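Both hostnames in the table above follow the same shape. A small sketch deriving them from a DCE identifier and region, using only the patterns shown in the table's examples (the identifier and region values are hypothetical):

```python
def dce_hostnames(unique_dce_identifier: str, region: str) -> dict:
    """Derive the two data collection endpoint hostnames shown in the table."""
    return {
        "configuration_access": f"{unique_dce_identifier}.{region}-1.handler.control",
        "logs_ingestion": f"{unique_dce_identifier}.{region}-1.ingest",
    }

print(dce_hostnames("mydce-abc1", "eastus"))
```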

articles/azure-monitor/logs/cross-workspace-query.md
Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ description: Query and correlated data from multiple Log Analytics workspaces, a
 ms.topic: how-to
 author: guywi-ms
 ms.author: guywild
-ms.date: 05/30/2023
+ms.date: 12/28/2023
 # Customer intent: As a data analyst, I want to write KQL queries that correlate data from multiple Log Analytics workspaces, applications, or resources, to enable my analysis.
 
 ---

articles/azure-monitor/logs/ingest-logs-event-hub.md
Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ author: guywi-ms
 ms.author: guywild
 ms.reviewer: ilanawaitser
 ms.topic: tutorial
-ms.date: 09/20/2022
+ms.date: 12/28/2023
 ms.custom: references_regions
 
 # customer-intent: As a DevOps engineer, I want to ingest data from an event hub into a Log Analytics workspace so that I can monitor logs that I send to Azure Event Hubs.

articles/azure-monitor/logs/log-analytics-overview.md
Lines changed: 1 addition & 1 deletion

@@ -2,7 +2,7 @@
 title: Overview of Log Analytics in Azure Monitor
 description: This overview describes Log Analytics, which is a tool in the Azure portal used to edit and run log queries for analyzing data in Azure Monitor logs.
 ms.topic: conceptual
-ms.date: 06/28/2022
+ms.date: 12/28/2023
 
 ---
articles/azure-monitor/logs/log-analytics-workspace-insights-overview.md
Lines changed: 2 additions & 2 deletions

@@ -5,8 +5,8 @@ services: azure-monitor
 ms.topic: conceptual
 author: guywild
 ms.author: guywild
-ms.reviewer: noakuper
-ms.date: 06/27/2022
+ms.reviewer: osalzberg
+ms.date: 12/28/2023
 
 ---

0 commit comments