Commit 5061b30

Merge pull request #262031 from MicrosoftDocs/main

12/28/2023 AM Publish

2 parents: 035f398 + 33e6a5d

File tree

12 files changed: +57 −62 lines


articles/advisor/advisor-reference-cost-recommendations.md

Lines changed: 9 additions & 0 deletions
````diff
@@ -261,8 +261,17 @@ We noticed that your virtual network gateway has been idle for over 90 days. Thi
 
 Learn more about [Virtual network gateway - IdleVNetGateway (Repurpose or delete idle virtual network gateways)](https://aka.ms/aa_idlevpngateway_learnmore).
 
+### Consider migrating to Front Door Standard/Premium
+
+Your Front Door Classic tier contains a large number of domains or routing rules, which adds extra charges. Front Door Standard and Premium don't charge per additional domain or routing rule. Consider migrating to save costs.
+
+Learn more about [Front Door pricing](https://aka.ms/afd-pricing).
+
+### Consider using multiple endpoints under one Front Door Standard/Premium profile
+
+We detected that your subscription contains multiple Front Door Standard/Premium profiles, each with a small number of endpoints. You can save on base fees by consolidating endpoints into one profile. You can use a maximum of 10 endpoints with the Standard tier and 25 endpoints with the Premium tier.
+
+Learn more about [Front Door endpoints](https://aka.ms/afd-endpoints).
 
 ## Reserved instances
````

articles/ai-services/document-intelligence/concept-retrieval-augumented-generation.md

Lines changed: 10 additions & 13 deletions
````diff
@@ -19,9 +19,9 @@ monikerRange: '>=doc-intel-3.1.0'
 
 ## Introduction
 
-Retrieval-Augmented Generation (RAG) is a document generative AI solution that combines a pretrained Large Language Model (LLM) like ChatGPT with an external data retrieval system to generate an enhanced response incorporating new data outside of the original training data. Adding an information retrieval system to your applications enables you to chat with your documents, generate captivating content, and access the power of Azure OpenAI models for your data. You also have more control over the data used by the LLM as it formulates a response.
+Retrieval-Augmented Generation (RAG) is a design pattern that combines a pretrained Large Language Model (LLM) like ChatGPT with an external data retrieval system to generate an enhanced response that incorporates new data from outside the original training data. Adding an information retrieval system to your applications enables you to chat with your documents, generate captivating content, and access the power of Azure OpenAI models for your data. You also have more control over the data the LLM uses as it formulates a response.
 
-The Document Intelligence [Layout model](concept-layout.md) is an advanced machine-learning based document analysis API. With semantic chunking, the Layout model offers a comprehensive solution for advanced content extraction and document structure analysis capabilities. With the Layout model, you can easily extract text and structural to divide large bodies of text into smaller, meaningful chunks based on semantic content rather than arbitrary splits. The extracted information can be conveniently outputted to Markdown format, enabling you to define your semantic chunking strategy based on the provided building blocks.
+The Document Intelligence [Layout model](concept-layout.md) is an advanced machine-learning based document analysis API. It offers a comprehensive solution for advanced content extraction and document structure analysis. With the Layout model, you can easily extract text and structural elements to divide large bodies of text into smaller, meaningful chunks based on semantic content rather than arbitrary splits. The extracted information can be conveniently output to Markdown format, enabling you to define your semantic chunking strategy based on the provided building blocks.
 
 :::image type="content" source="media/rag/azure-rag-processing.png" alt-text="Screenshot depicting semantic chunking with RAG using Azure AI Document Intelligence.":::
 
@@ -47,7 +47,7 @@ Markdown is a structured and formatted markup language and a popular input for e
 
 * **Large language model (LLM) compatibility**. The Layout model's Markdown-formatted output is LLM friendly and facilitates seamless integration into your workflows. You can turn any table in a document into Markdown format and avoid extensive effort parsing the documents for greater LLM understanding.
 
-**Text image processed with Document Intelligence Studio using Layout model**
+**Text image processed with Document Intelligence Studio and output to Markdown using the Layout model**
 
 :::image type="content" source="media/rag/markdown-text-output.png" alt-text="Screenshot of newspaper article processed by Layout model and outputted to Markdown.":::
 
@@ -103,13 +103,15 @@ You can follow the [Document Intelligence Studio quickstart](quickstarts/try-doc
 
 * [Java](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/documentintelligence/azure-ai-documentintelligence/src/samples/java/com/azure/ai/documentintelligence/AnalyzeLayoutMarkdownOutput.java)
 
+* [.NET](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/documentintelligence/Azure.AI.DocumentIntelligence/samples/Sample_ExtractLayout.md)
+
 ## Build document chat with semantic chunking
 
 * [Azure OpenAI on your data](../openai/concepts/use-your-data.md) enables you to run supported chat on your documents. Azure OpenAI on your data applies the Document Intelligence Layout model to extract and parse document data by chunking long text based on tables and paragraphs. You can also customize your chunking strategy using [Azure OpenAI sample scripts](https://github.com/microsoft/sample-app-aoai-chatGPT/tree/main/scripts) located in our GitHub repo.
 
-* Azure AI Document Intelligence is now integrated with [LangChain](https://python.langchain.com/docs/integrations/document_loaders/azure_document_intelligence) as one of its document loaders. You can use it to easily load the data and output to Markdown format. This [notebook](https://microsoft.github.io/SynapseML/docs/Explore%20Algorithms/AI%20Services/Quickstart%20-%20Document%20Question%20and%20Answering%20with%20PDFs/) shows a simple demo for RAG pattern with Azure AI Document Intelligence as document loader and Azure Search as retriever in LangChain.
+* Azure AI Document Intelligence is now integrated with [LangChain](https://python.langchain.com/docs/integrations/document_loaders/azure_document_intelligence) as one of its document loaders. You can use it to easily load the data and output to Markdown format. This [notebook](https://github.com/microsoft/Form-Recognizer-Toolkit/blob/main/SampleCode/Python/sample_rag_langchain.ipynb) shows a simple demo of the RAG pattern with Azure AI Document Intelligence as document loader and Azure Search as retriever in LangChain.
 
-* The chat with your data solution accelerator[code sample](https://github.com/Azure-Samples/chat-with-your-data-solution-accelerator) demonstrates an end-to-end baseline RAG pattern sample. It uses Azure AI Search as a retriever and Azure AI Document Intelligence for document loading and semantic chunking.
+* The chat with your data solution accelerator [code sample](https://github.com/Azure-Samples/chat-with-your-data-solution-accelerator) demonstrates an end-to-end baseline RAG pattern sample. It uses Azure AI Search as a retriever and Azure AI Document Intelligence for document loading and semantic chunking.
 
 ## Use case
 
@@ -122,20 +124,15 @@ If you're looking for a specific section in a document, you can use semantic chu
 # pip install langchain langchain-community azure-ai-documentintelligence
 
 from azure.ai.documentintelligence import DocumentIntelligenceClient
-from azure.core.credentials import AzureKeyCredential
 
-endpoint = "https://<my-custom-subdomain>.cognitiveservices.azure.com/"
-credential = AzureKeyCredential("<api_key>")
+endpoint = "https://<my-custom-subdomain>.cognitiveservices.azure.com/"
+key = "<api_key>"
 
-document_intelligence_client = DocumentIntelligenceClient(
-    endpoint, credential)
-
 from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader
-
 from langchain.text_splitter import MarkdownHeaderTextSplitter
 
 # Initiate Azure AI Document Intelligence to load the document. You can either specify file_path or url_path to load the document.
-loader = AzureAIDocumentIntelligenceLoader(file_path="<path to your file>", api_key = doc_intelligence_key, api_endpoint = doc_intelligence_endpoint, api_model="prebuilt-layout")
+loader = AzureAIDocumentIntelligenceLoader(file_path="<path to your file>", api_key=key, api_endpoint=endpoint, api_model="prebuilt-layout")
 docs = loader.load()
 
 # Split the document into chunks based on markdown headers.
````
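The RAG pattern this article describes (retrieve external data, augment the prompt, then generate) can be sketched end to end in a few lines. The retriever and model below are stand-in stubs rather than Azure AI Search or Azure OpenAI calls, and all names are illustrative; only the control flow reflects the pattern.

```python
# A minimal sketch of the RAG control flow: retrieve -> augment -> generate.
# Stubs stand in for the real retriever (e.g., Azure AI Search) and LLM.

def retrieve(query, corpus, k=2):
    """Toy retriever: rank chunks by word overlap with the query."""
    def score(chunk):
        return len(set(query.lower().split()) & set(chunk.lower().split()))
    return sorted(corpus, key=score, reverse=True)[:k]

def generate(prompt):
    """Stub LLM: echoes the grounded prompt instead of calling a model."""
    return f"Answer based on: {prompt}"

corpus = [
    "Azure Cosmos DB offers a lifetime free tier.",
    "The Layout model outputs Markdown.",
    "Semantic chunking splits text by meaning.",
]
query = "What does the Layout model output?"
context = "\n".join(retrieve(query, corpus, k=1))  # 1. retrieve relevant chunks
prompt = f"{context}\n\nQuestion: {query}"         # 2. augment the prompt
print(generate(prompt))                            # 3. generate a grounded answer
```

In a production pipeline, `retrieve` would query a vector or keyword index built from the chunked documents, and `generate` would call a chat-completion model with the augmented prompt.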

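The `MarkdownHeaderTextSplitter` at the end of the snippet above divides the Layout model's Markdown output at headings. As a rough illustration of what header-based semantic chunking does (a toy re-implementation for demonstration, not the LangChain API):

```python
# Toy header-based chunker: each Markdown heading starts a new chunk.
# Illustrative only; LangChain's MarkdownHeaderTextSplitter is the real API.

def split_on_markdown_headers(markdown, levels=("#", "##", "###")):
    """Split Markdown text into one chunk per heading section."""
    chunks, header, lines = [], None, []
    for line in markdown.splitlines():
        stripped = line.strip()
        if any(stripped.startswith(h + " ") for h in levels):
            if header is not None or lines:
                chunks.append({"header": header, "content": "\n".join(lines).strip()})
            header, lines = stripped, []
        else:
            lines.append(line)
    chunks.append({"header": header, "content": "\n".join(lines).strip()})
    return chunks

doc = "# Title\nIntro text.\n## Section A\nBody A.\n## Section B\nBody B."
for chunk in split_on_markdown_headers(doc):
    print(chunk["header"], "->", chunk["content"])
```

Chunks produced this way keep semantically related text together, which is what makes the Layout model's Markdown output a convenient input for a RAG index.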
articles/azure-arc/system-center-virtual-machine-manager/disaster-recovery.md

Lines changed: 9 additions & 17 deletions
````diff
@@ -3,7 +3,7 @@ title: Recover from accidental deletion of resource bridge VM
 description: Learn how to perform recovery operations for the Azure Arc resource bridge VM in Azure Arc-enabled System Center Virtual Machine Manager disaster scenarios.
 ms.topic: how-to
 ms.custom:
-ms.date: 12/06/2023
+ms.date: 12/28/2023
 ms.services: azure-arc
 ms.subservice: azure-arc-scvmm
 author: Farha-Bano
@@ -27,12 +27,14 @@ To recover from Arc resource bridge VM deletion, you need to deploy a new resour
 > DHCP-based Arc Resource Bridge deployment is no longer supported.<br><br>
 If you had deployed Arc Resource Bridge earlier using DHCP, you must clean up your deployment by removing your resources from Azure and do a [fresh onboarding](./quickstart-connect-system-center-virtual-machine-manager-to-arc.md).
 
-1. Copy the Azure region and resource IDs of the Arc resource bridge, custom location, and SCVMM Azure resources.
+### Recover Arc resource bridge from a Windows machine
 
-2. Download the [onboarding script](/azure/azure-arc/system-center-virtual-machine-manager/quickstart-connect-system-center-virtual-machine-manager-to-arc#download-the-onboarding-script) from the Azure portal and update the following section in the script, using the same information as the original resources in Azure.
+1. Copy the Azure region and resource IDs of the Arc resource bridge, custom location, and SCVMM management server Azure resources.
+
+2. Download [this script](https://download.microsoft.com/download/a/a/8/aa8687e4-1a30-485f-9de4-4f15fc576724/arcvmm-windows-dr.ps1) and update the following section in the script, using the same information as the original resources in Azure.
 
 ```powershell
-$location = <Azure region of the resources>
+$location = <Azure region of the original Arc resource bridge>
 $applianceSubscriptionId = <subscription-id>
 $applianceResourceGroupName = <resource-group-name>
 $applianceName = <resource-bridge-name>
@@ -45,20 +47,10 @@ If you had deployed Arc Resource Bridge earlier using DHCP, you must clean up yo
 $vmmserverResourceGroupName = <resource-group-name>
 $vmmserverName= <SCVMM-name-in-azure>
 ```
+
+3. Run the updated script from the same location where the config YAML files are stored after the initial onboarding. This is most likely the folder from which you ran the initial onboarding script, unless the config files were later moved. [Provide the inputs](quickstart-connect-system-center-virtual-machine-manager-to-arc.md#script-runtime) as prompted.
 
-3. [Run the onboarding script](/azure/azure-arc/system-center-virtual-machine-manager/quickstart-connect-system-center-virtual-machine-manager-to-arc#download-the-onboarding-script) again with the `-Force` parameter.
-
-``` powershell-interactive
-./resource-bridge-onboarding-script.ps1 -Force
-```
-
-4. [Provide the inputs](/azure/azure-arc/system-center-virtual-machine-manager/quickstart-connect-system-center-virtual-machine-manager-to-arc#script-runtime) as prompted.
-
-5. In the same machine, run the following scripts, as applicable:
-   - [Download the script](https://download.microsoft.com/download/6/b/4/6b4a5009-fed8-46c2-b22b-b24a4d0a06e3/arcvmm-appliance-dr.ps1) if you're running the script from a Windows machine
-   - [Download the script](https://download.microsoft.com/download/0/5/c/05c2bcb8-87f8-4ead-9757-a87a0759071c/arcvmm-appliance-dr.sh) if you're running the script from a Linux machine
-
-6. Once the script is run successfully, the old Resource Bridge is recovered and the connection is re-established to the existing Azure-enabled SCVMM resources.
+4. Once the script runs successfully, the old Resource Bridge is recovered, and the connection is re-established to the existing Azure-enabled SCVMM resources.
 
 ## Next steps
````

articles/azure-monitor/essentials/data-collection-endpoint-overview.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -26,8 +26,8 @@ This table describes the components of a data collection endpoint, related regio
 
 | Component | Description | Regionality considerations | Data collection rule configuration |
 |:---|:---|:---|:---|
-| Configuration access endpoint | The endpoint from which Azure Monitor Agent retrieves data collection rules (DCRs).<br>Example: `<unique-dce-identifier>.<regionname>-1.handler.control`. | Same region as the monitored resources. | Set on the **Basics** tab when you create a data collection rule using the portal. |
-| Logs ingestion endpoint | The endpoint that ingests logs into the data ingestion pipeline. Azure Monitor transforms the data and sends it to the defined destination Log Analytics workspace and table based on a DCR ID sent with the collected data.<br>Example: `<unique-dce-identifier>.<regionname>-1.ingest`. | Same region as the destination Log Analytics workspace. | Set on the **Resources** tab when you create a data collection rule using the portal. |
+| Logs ingestion endpoint | The endpoint that ingests logs into the data ingestion pipeline. Azure Monitor transforms the data and sends it to the defined destination Log Analytics workspace and table based on a DCR ID sent with the collected data.<br>Example: `<unique-dce-identifier>.<regionname>-1.ingest`. | Same region as the destination Log Analytics workspace. | Set on the **Basics** tab when you create a data collection rule using the portal. |
+| Configuration access endpoint | The endpoint from which Azure Monitor Agent retrieves data collection rules (DCRs).<br>Example: `<unique-dce-identifier>.<regionname>-1.handler.control`. | Same region as the monitored resources. | Set on the **Resources** tab when you create a data collection rule using the portal. |
 
 ## How to set up data collection endpoints based on your deployment
````

articles/azure-monitor/logs/ingest-logs-event-hub.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -6,7 +6,7 @@ author: guywi-ms
 ms.author: guywild
 ms.reviewer: ilanawaitser
 ms.topic: tutorial
-ms.date: 09/20/2022
+ms.date: 12/28/2023
 ms.custom: references_regions
 
 # customer-intent: As a DevOps engineer, I want to ingest data from an event hub into a Log Analytics workspace so that I can monitor logs that I send to Azure Event Hubs.
````

articles/azure-resource-manager/bicep/deployment-stacks.md

Lines changed: 6 additions & 6 deletions
````diff
@@ -626,7 +626,7 @@ New-AzResourceGroupDeploymentStack `
   -TemplateFile "<bicep-file-name>" `
   -DenySettingsMode "DenyDelete" `
   -DenySettingsExcludedAction "Microsoft.Compute/virtualMachines/write Microsoft.StorageAccounts/delete" `
-  -DenySettingsExcludedPrincipal "<object-id>" "<object-id>"
+  -DenySettingsExcludedPrincipal "<object-id> <object-id>"
 ```
 
 # [CLI](#tab/azure-cli)
@@ -638,7 +638,7 @@ az stack group create \
   --template-file '<bicep-file-name>' \
   --deny-settings-mode 'denyDelete' \
   --deny-settings-excluded-actions 'Microsoft.Compute/virtualMachines/write Microsoft.StorageAccounts/delete' \
-  --deny-settings-excluded-principals '<object-id>' '<object-id>'
+  --deny-settings-excluded-principals '<object-id> <object-id>'
 ```
 
 # [Portal](#tab/azure-portal)
@@ -658,7 +658,7 @@ New-AzSubscriptionDeploymentStack `
   -TemplateFile "<bicep-file-name>" `
   -DenySettingsMode "DenyDelete" `
   -DenySettingsExcludedAction "Microsoft.Compute/virtualMachines/write Microsoft.StorageAccounts/delete" `
-  -DenySettingsExcludedPrincipal "<object-id>" "<object-id>"
+  -DenySettingsExcludedPrincipal "<object-id> <object-id>"
 ```
 
 Use the `DeploymentResourceGroupName` parameter to specify the resource group name at which the deployment stack is created. If a scope isn't specified, it uses the scope of the deployment stack.
@@ -672,7 +672,7 @@ az stack sub create \
   --template-file '<bicep-file-name>' \
   --deny-settings-mode 'denyDelete' \
   --deny-settings-excluded-actions 'Microsoft.Compute/virtualMachines/write Microsoft.StorageAccounts/delete' \
-  --deny-settings-excluded-principals '<object-id>' '<object-id>'
+  --deny-settings-excluded-principals '<object-id> <object-id>'
 ```
 
 Use the `deployment-resource-group` parameter to specify the resource group at which the deployment stack is created. If a scope isn't specified, it uses the scope of the deployment stack.
@@ -694,7 +694,7 @@ New-AzManagementGroupDeploymentStack `
   -TemplateFile "<bicep-file-name>" `
   -DenySettingsMode "DenyDelete" `
   -DenySettingsExcludedActions "Microsoft.Compute/virtualMachines/write Microsoft.StorageAccounts/delete" `
-  -DenySettingsExcludedPrincipal "<object-id>" "<object-id>"
+  -DenySettingsExcludedPrincipal "<object-id> <object-id>"
 ```
 
 Use the `DeploymentSubscriptionId` parameter to specify the subscription ID at which the deployment stack is created. If a scope isn't specified, it uses the scope of the deployment stack.
@@ -708,7 +708,7 @@ az stack mg create \
   --template-file '<bicep-file-name>' \
   --deny-settings-mode 'denyDelete' \
   --deny-settings-excluded-actions 'Microsoft.Compute/virtualMachines/write Microsoft.StorageAccounts/delete' \
-  --deny-settings-excluded-principals '<object-id>' '<object-id>'
+  --deny-settings-excluded-principals '<object-id> <object-id>'
 ```
 
 Use the `deployment-subscription` parameter to specify the subscription ID at which the deployment stack is created. If a scope isn't specified, it uses the scope of the deployment stack.
````
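Every hunk in this file makes the same subtle change: the excluded principals move from two separately quoted arguments to a single space-separated string inside one pair of quotes. For the Azure CLI examples, the difference shows up in how the shell parses the command line. A quick local sketch (`count_args` is a helper defined here, not an Azure CLI command, and the object IDs are placeholders):

```shell
# count_args reports how many separate arguments the shell passed to it.
count_args() { echo "$#"; }

count_args '11111111-aaaa' '22222222-bbbb'   # two separate arguments: prints 2
count_args '11111111-aaaa 22222222-bbbb'     # one space-separated string: prints 1
```

The deployment-stacks parameters expect the principal list as that single string, which is why the diff collapses the two quoted values into one.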

articles/communication-services/concepts/email/email-domain-and-sender-authentication.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -45,7 +45,7 @@ Email Communication Service resources are designed to enable domain validation s
 
 * [Get started with create and manage Email Communication Service in Azure Communication Service](../../quickstarts/email/create-email-communication-resource.md)
 
-* [Get started by connecting Email Communication Service with an Azure Communication Service resource](../../quickstarts/email/connect-email-communication-resource.md)
+* [Get started by connecting Email Communication Service with Azure Communication Service resource](../../quickstarts/email/connect-email-communication-resource.md)
 
 The following documents may be interesting to you:
````

articles/cosmos-db/vector-database.md

Lines changed: 9 additions & 9 deletions
````diff
@@ -71,6 +71,15 @@ The native vector search feature in our NoSQL API is under development. In the m
 - [.NET tutorial - recipe chatbot](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/C%23/CosmosDB-NoSQL_CognitiveSearch)
 - [.NET tutorial - recipe chatbot w/ Semantic Kernel](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/C%23/CosmosDB-NoSQL_CognitiveSearch_SemanticKernel)
 - [Python notebook tutorial - Azure product chatbot](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/Python/CosmosDB-NoSQL_CognitiveSearch)
+
+## Next step
+
+[30-day Free Trial without Azure subscription](https://azure.microsoft.com/try/cosmosdb/)
+
+[90-day Free Trial with Azure AI Advantage](ai-advantage.md)
+
+> [!div class="nextstepaction"]
+> [Use the Azure Cosmos DB lifetime free tier](free-tier.md)
 
 ## Vector database related concepts
 
@@ -112,12 +121,3 @@ The process of creating good prompts for a scenario is called prompt engineering
 ### Tokens
 
 Tokens are small chunks of text generated by splitting the input text into smaller segments. These segments can either be words or groups of characters, varying in length from a single character to an entire word. For instance, the word hamburger would be divided into tokens such as ham, bur, and ger while a short and common word like pear would be considered a single token. LLMs like ChatGPT, GPT-3.5, or GPT-4 break words into tokens for processing.
-
-## Next step
-
-[30-day Free Trial without Azure subscription](https://azure.microsoft.com/try/cosmosdb/)
-
-[90-day Free Trial with Azure AI Advantage](ai-advantage.md)
-
-> [!div class="nextstepaction"]
-> [Use the Azure Cosmos DB lifetime free tier](free-tier.md)
````
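The hamburger/pear example in the Tokens paragraph can be made concrete with a toy greedy longest-match tokenizer. Real LLM tokenizers use learned BPE vocabularies; the four-piece vocabulary below is invented purely for illustration.

```python
# Toy greedy tokenizer: at each position, match the longest known piece.
# The vocabulary is made up for this demo; real tokenizers learn theirs.

def greedy_tokenize(word, vocab):
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try the longest piece first
            if word[i:j] in vocab:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # unknown character: emit it alone
            i += 1
    return tokens

vocab = {"ham", "bur", "ger", "pear"}
print(greedy_tokenize("hamburger", vocab))  # ['ham', 'bur', 'ger']
print(greedy_tokenize("pear", vocab))       # ['pear']
```

Token counts rather than word counts are what models and usage meters operate on, which is why subword splits like this matter.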
