
Commit 7551960

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into subnet-prefixes

2 parents: a1657d6 + c1ac821

37 files changed (+633 −810 lines)

articles/ai-services/openai/concepts/assistants.md

Lines changed: 3 additions & 0 deletions
@@ -25,6 +25,9 @@ Assistants API supports persistent automatically managed threads. This means tha
 - [Code Interpreter](../how-to/code-interpreter.md)
 - [Function calling](../how-to/assistant-functions.md)
 
+> [!TIP]
+> There is no additional [pricing](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/) or [quota](../quotas-limits.md) for using Assistants unless you use the [code interpreter](../how-to/code-interpreter.md) tool.
+
 Assistant API is built on the same capabilities that power OpenAI’s GPT product. Some possible use cases range from AI-powered product recommender, sales analyst app, coding assistant, employee Q&A chatbot, and more. Start building on the no-code Assistants playground on the Azure OpenAI Studio or start building with the API.
 
 > [!IMPORTANT]

articles/ai-services/openai/faq.yml

Lines changed: 54 additions & 3 deletions
@@ -118,7 +118,7 @@ sections:
         answer:
           If the service performs processing, you will be charged even if the status code is not successful (not 200).
           Common examples of this are, a 400 error due to a content filter or input limit, or a 408 error due to a timeout. Charges will also occur when a `status 200` is received with a `finish_reason` of `content_filter`.
-          In this case the prompt did not have any issues, but the completion generated by the model was detected to violate the content filtering rules which results in the completion being filtered.
+          In this case the prompt did not have any issues, but the completion generated by the model was detected to violate the content filtering rules, which results in the completion being filtered.
 
           If the service doesn't perform processing, you won't be charged.
           For example, a 401 error due to authentication or a 429 error due to exceeding the Rate Limit.
@@ -228,7 +228,58 @@ sections:
           What are the known limitations of GPT-4 Turbo with Vision?
         answer: |
           See the [limitations](./concepts/gpt-with-vision.md#limitations) section of the GPT-4 Turbo with Vision concepts guide.
+  - name: Assistants
+    questions:
+      - question: |
+          Do you store any data used in the Assistants API?
+        answer: |
+          Yes. Unlike the Chat Completions API, Azure OpenAI Assistants is a stateful API, meaning it retains data. There are two types of data stored in the Assistants API:
+          * Stateful entities: Threads, messages, and runs created during Assistants use.
+          * Files: Uploaded during Assistants setup or as part of a message.
+      - question: |
+          Where is this data stored?
+        answer: |
+          Data is stored in a secure, Microsoft-managed storage account that is logically separated.
+      - question: |
+          How long is this data stored?
+        answer: |
+          Data persists in this system until you explicitly delete it. Use the [delete function](./assistants-reference-threads.md) with the thread ID of the thread you want to delete. Clearing the run in the Assistants playground doesn't delete threads; threads deleted with the delete function no longer appear in the thread page.
+      - question: |
+          Can I bring my own data store to use with Assistants?
+        answer: |
+          No. Currently Assistants supports only local files uploaded to the Assistants-managed storage. You cannot use your private storage account with Assistants.
+      - question: |
+          Is my data used by Microsoft for training models?
+        answer: |
+          No. Microsoft doesn't use your data to train models. See the [Responsible AI documentation](/legal/cognitive-services/openai/data-privacy?context=%2Fazure%2Fai-services%2Fopenai%2Fcontext%2Fcontext) for more information.
+      - question: |
+          Where is data stored geographically?
+        answer: |
+          Azure OpenAI Assistants endpoints are regional, and data is stored in the same region as the endpoint. For more information, see the [Azure data residency documentation](https://azure.microsoft.com/explore/global-infrastructure/data-residency/#overview).
+      - question: |
+          How am I charged for Assistants?
+        answer: |
+          Currently, when you use the Assistants API, you're billed for the following:
+          - Inference cost (input and output) of the base model you're using for each Assistant (for example, gpt-4-0125). If you've created multiple Assistants, you're charged for the base model attached to each Assistant.
+          - Code Interpreter sessions, if you've enabled the Code Interpreter tool. For example, if your assistant calls Code Interpreter simultaneously in two different threads, this creates two Code Interpreter sessions, each of which is charged. Each session is active by default for one hour, which means you pay this fee only once if your user keeps giving instructions to Code Interpreter in the same thread for up to one hour.
 
+          For more information, see the [pricing page](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/).
+      - question: |
+          Is there any additional pricing or quota for using Assistants?
+        answer: |
+          No. All [quotas](./quotas-limits.md) apply to using models with Assistants.
+      - question: |
+          Does the Assistants API support non-Azure OpenAI models?
+        answer: |
+          No. The Assistants API supports only Azure OpenAI models.
+      - question: |
+          Is the Assistants API generally available?
+        answer: |
+          The Assistants API is currently in public preview. Stay informed of our latest product updates by regularly visiting our [What's New](./whats-new.md) page.
+      - question: |
+          What are some examples or other resources I can use to learn about Assistants?
+        answer: |
+          See the [conceptual](./concepts/assistants.md), [quickstart](./assistants-quickstart.md), and [how-to](./how-to/assistant.md) articles for information on getting started and using Assistants. You can also check out Azure OpenAI Assistants code samples on [GitHub](https://github.com/Azure-Samples/azureai-samples/tree/main/scenarios/Assistants).
   - name: Web app
     questions:
       - question: |
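The Code Interpreter billing rule above (per-thread sessions, each active for one hour) reduces to simple session counting. A sketch under exactly those stated assumptions; the per-session price itself is on the pricing page and is deliberately not hard-coded here:

```python
SESSION_WINDOW_SECS = 3600  # each Code Interpreter session is active for one hour by default

def count_sessions(calls: list[tuple[str, int]]) -> int:
    """calls: (thread_id, epoch_seconds) pairs of Code Interpreter invocations.
    A new billable session starts when a thread has no session active within
    the last hour; simultaneous calls in different threads are separate sessions."""
    sessions = 0
    expires: dict[str, int] = {}  # thread_id -> session expiry time
    for thread_id, t in sorted(calls, key=lambda c: c[1]):
        if t >= expires.get(thread_id, -1):
            sessions += 1
            expires[thread_id] = t + SESSION_WINDOW_SECS
        # otherwise the thread's existing session is reused at no extra charge
    return sessions

# Two threads calling simultaneously -> two sessions;
# a follow-up in thread "a" 30 minutes later reuses its session.
calls = [("a", 0), ("b", 0), ("a", 1800)]
print(count_sessions(calls))  # 2
```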
@@ -260,7 +311,7 @@ sections:
       - question: |
           How can I customize or automate the index creation process?
         answer:
-          You can prepare the index yourself using a [script provided on GitHub](https://go.microsoft.com/fwlink/?linkid=2244395). Using this script will create an Azure AI Search index with all the information needed to better leverage your data, with your documents broken down into manageable chunks. Please see the README file with the data preparation code for details on how to run it.
+          You can prepare the index yourself using a [script provided on GitHub](https://go.microsoft.com/fwlink/?linkid=2244395). Using this script will create an Azure AI Search index with all the information needed to better use your data, with your documents broken down into manageable chunks. See the README file with the data preparation code for details on how to run it.
       - question: |
           How can I update my index?
         answer:
@@ -289,7 +340,7 @@ sections:
           If Semantic Search is enabled for my Azure AI Search resource, will it be automatically applied to Azure OpenAI on your data in the Azure OpenAI Studio?
         answer:
           When you select "Azure AI Search" as the data source, you can choose to apply semantic search.
-          If you select "Azure Blob Container" or "Upload files" as the data source, you can create the index as usual. Afterwards you would re-ingest the data using the "Azure AI Search" option to select the same index and apply Semantic Search. You will then be ready to chat on your data with semantic search applied.
+          If you select "Azure Blob Container" or "Upload files" as the data source, you can create the index as usual. Afterwards you would reingest the data using the "Azure AI Search" option to select the same index and apply Semantic Search. You will then be ready to chat on your data with semantic search applied.
       - question: |
           How can I add vector embeddings when indexing my data?
         answer:

articles/ai-services/openai/how-to/managed-identity.md

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ In the following sections, you'll use the Azure CLI to sign in, and obtain a bea
 - Azure CLI - [Installation Guide](/cli/azure/install-azure-cli)
 - The following Python libraries: os, requests, json, openai, azure-identity
 
-## Assign yourself to the Cognitive Services User role
+## Assign role
 
 Assign yourself either the [Cognitive Services OpenAI User](role-based-access-control.md#cognitive-services-openai-user) or [Cognitive Services OpenAI Contributor](role-based-access-control.md#cognitive-services-openai-contributor) role to allow you to use your account to make Azure OpenAI inference API calls rather than having to use key-based auth. After you make this change it can take up to 5 minutes before the change takes effect.
 
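The role assignment above is what lets a bearer token replace the `api-key` header. A minimal sketch of the header shape; the scope string is the standard Cognitive Services scope, and the placeholder token stands in for the `azure-identity` credential call (which needs a live sign-in and is shown only in a comment):

```python
# Scope used when requesting an Azure AD token for Azure OpenAI / Cognitive Services.
COGNITIVE_SCOPE = "https://cognitiveservices.azure.com/.default"

def auth_headers(token: str) -> dict[str, str]:
    """Bearer-token headers, used instead of the 'api-key' header of key-based auth."""
    return {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

# In a real script the token comes from azure-identity, e.g.:
#   from azure.identity import DefaultAzureCredential
#   token = DefaultAzureCredential().get_token(COGNITIVE_SCOPE).token
# A placeholder keeps this sketch self-contained and offline.
headers = auth_headers("eyJ...placeholder")
print(headers["Authorization"].startswith("Bearer "))  # True
```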

articles/azure-functions/consumption-plan.md

Lines changed: 1 addition & 1 deletion
@@ -38,7 +38,7 @@ You can also create function apps in a Consumption plan when you publish a Funct
 
 ## Multiple apps in the same plan
 
-The general recommendation is for each function app to have its own Consumption plan. However, if needed, function apps in the same region can be assigned to the same Consumption plan. Keep in mind that there is a [limit to the number of function apps that can run in a Consumption plan](functions-scale.md#service-limits). Function apps in a given plan are all scaled together, so any issues with scaling can affect all apps in the plan.
+The general recommendation is for each function app to have its own Consumption plan. However, if needed, function apps in the same region can be assigned to the same Consumption plan. Keep in mind that there is a [limit to the number of function apps that can run in a Consumption plan](functions-scale.md#service-limits). Function apps in the same plan still scale independently of each other.
 
 ## Next steps
 

articles/azure-monitor/agents/azure-monitor-agent-data-collection-endpoint.md

Lines changed: 2 additions & 2 deletions
@@ -35,7 +35,7 @@ Data Collection Endpoints public IP addresses are not part of the abovementioned
 | Microsoft Azure operated by 21Vianet | Replace '.com' above with '.cn' | Same as above | Same as above | Same as above| Same as above |
 
 >[!NOTE]
-> If you use private links on the agent, you must **only** add the [private data collection endpoints (DCEs)](../essentials/data-collection-endpoint-overview.md#components-of-a-data-collection-endpoint). The agent does not use the non-private endpoints listed above when using private links/data collection endpoints.
+> If you use private links on the agent, you must **only** add the [private data collection endpoints (DCEs)](../essentials/data-collection-endpoint-overview.md#components-of-a-dce). The agent does not use the non-private endpoints listed above when using private links/data collection endpoints.
 > The Azure Monitor Metrics (custom metrics) preview isn't available in Azure Government and Azure operated by 21Vianet clouds.
 
 ## Proxy configuration
@@ -265,7 +265,7 @@ New-AzConnectedMachineExtension -Name AzureMonitorLinuxAgent -ExtensionType Azur
 1. Add the **configuration endpoint URL** to fetch data collection rules to the allowlist for the gateway
    `Add-OMSGatewayAllowedHost -Host global.handler.control.monitor.azure.com`
    `Add-OMSGatewayAllowedHost -Host <gateway-server-region-name>.handler.control.monitor.azure.com`.
-   (If you're using private links on the agent, you must also add the [data collection endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-data-collection-endpoint).)
+   (If you're using private links on the agent, you must also add the [data collection endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce).)
 1. Add the **data ingestion endpoint URL** to the allowlist for the gateway
    `Add-OMSGatewayAllowedHost -Host <log-analytics-workspace-id>.ods.opinsights.azure.com`.
 1. Restart the **OMS Gateway** service to apply the changes
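The allowlist entries in the steps above follow a fixed naming pattern, so they can be generated from the gateway region and workspace ID before being pasted into `Add-OMSGatewayAllowedHost`. A sketch; the region and workspace ID below are placeholders:

```python
def gateway_allowlist(region: str, workspace_id: str) -> list[str]:
    """Hosts to allow on the OMS Gateway for Azure Monitor Agent: the global
    and regional configuration (control) endpoints plus the workspace's
    data ingestion endpoint."""
    return [
        "global.handler.control.monitor.azure.com",
        f"{region}.handler.control.monitor.azure.com",
        f"{workspace_id}.ods.opinsights.azure.com",
    ]

# Emit the corresponding PowerShell commands (placeholder values).
for host in gateway_allowlist("westus", "12345a01-b1cd-1234-e1f2-1234567g8h99"):
    print(f"Add-OMSGatewayAllowedHost -Host {host}")
```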

articles/azure-monitor/agents/azure-monitor-agent-manage.md

Lines changed: 2 additions & 2 deletions
@@ -60,8 +60,8 @@ The following prerequisites must be met prior to installing Azure Monitor Agent.
 - global.handler.control.monitor.azure.com
 - `<virtual-machine-region-name>`.handler.control.monitor.azure.com (example: westus.handler.control.monitor.azure.com)
 - `<log-analytics-workspace-id>`.ods.opinsights.azure.com (example: 12345a01-b1cd-1234-e1f2-1234567g8h99.ods.opinsights.azure.com)
-  (If you use private links on the agent, you must also add the [dce endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-data-collection-endpoint)).
-- **Disk Space**: Required disk space can vary greatly depending upon how an agent is utilized or if the agent is unable to communicate with the destinations where it is instructed to send monitoring data. By default the agent requires 10Gb of disk space to run and requires 500MB for agent installation. The following provides guidance for capacity planning:
+  (If you use private links on the agent, you must also add the [DCE endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce).)
+- **Disk Space**: Required disk space can vary greatly depending upon how an agent is utilized or if the agent is unable to communicate with the destinations where it is instructed to send monitoring data. By default the agent requires 10 GB of disk space to run. The following provides guidance for capacity planning:
 
 | Purpose | Environment | Path | Suggested Space |
 |:---|:---|:---|:---|

articles/azure-monitor/agents/azure-monitor-agent-windows-client.md

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ Here is a comparison between client installer and VM extension for Azure Monitor
 - global.handler.control.monitor.azure.com
 - `<virtual-machine-region-name>`.handler.control.monitor.azure.com (example: westus.handler.control.azure.com)
 - `<log-analytics-workspace-id>`.ods.opinsights.azure.com (example: 12345a01-b1cd-1234-e1f2-1234567g8h99.ods.opinsights.azure.com)
-  (If using private links on the agent, you must also add the [data collection endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-data-collection-endpoint))
+  (If using private links on the agent, you must also add the [data collection endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce))
 6. A data collection rule you want to associate with the devices. If it doesn't exist already, [create a data collection rule](./data-collection-rule-azure-monitor-agent.md#create-a-data-collection-rule). **Do not associate the rule to any resources yet**.
 7. Before using any PowerShell cmdlet, ensure cmdlet related PowerShell module is installed and imported.
 

articles/azure-monitor/agents/gateway.md

Lines changed: 1 addition & 1 deletion
@@ -205,7 +205,7 @@ To configure the Azure Monitor agent (installed on the gateway server) to use th
 2. Add the **configuration endpoint URL** to fetch data collection rules to the allowlist for the gateway
    `Add-OMSGatewayAllowedHost -Host global.handler.control.monitor.azure.com`
    `Add-OMSGatewayAllowedHost -Host <gateway-server-region-name>.handler.control.monitor.azure.com`
-   (If using private links on the agent, you must also add the [dce endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-data-collection-endpoint))
+   (If using private links on the agent, you must also add the [DCE endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce))
 3. Add the **data ingestion endpoint URL** to the allowlist for the gateway
    `Add-OMSGatewayAllowedHost -Host <log-analytics-workspace-id>.ods.opinsights.azure.com`
 3. Restart the **OMS Gateway** service to apply the changes

articles/azure-monitor/essentials/data-collection-endpoint-overview.md

Lines changed: 12 additions & 5 deletions
@@ -12,11 +12,19 @@ ms.reviwer: nikeist
 
 # Data collection endpoints in Azure Monitor
 
-A data collection endpoint (DCE) is a connection that the [Logs ingestion API](../logs/logs-ingestion-api-overview.md) uses to send collected data for processing and ingestion into Azure Monitor. [Azure Monitor Agent](../agents/agents-overview.md) also uses data collection endpoints to receive configuration files from Azure Monitor and to send collected log data for processing and ingestion.
+A data collection endpoint (DCE) is a connection where data sources send collected data for processing and ingestion into Azure Monitor. This article provides an overview of data collection endpoints and explains how to create and set them up based on your deployment.
 
-This article provides an overview of data collection endpoints and explains how to create and set them up based on your deployment.
+## When is a DCE required?
+Prior to March 31, 2024, a DCE was required for all data collection scenarios using a DCR that required an endpoint. Any DCR created after this date includes its own endpoints for logs and metrics. The URL for these endpoints can be found in the [`logsIngestion` and `metricsIngestion`](./data-collection-rule-structure.md#endpoints) properties of the DCR. These endpoints can be used instead of a DCE for any direct ingestion scenarios.
 
-## Components of a data collection endpoint
+Endpoints cannot be added to an existing DCR, but you can keep using any existing DCRs with existing DCEs. If you want to move to a DCR endpoint, you must create a new DCR to replace the existing one. A DCR with endpoints can also use a DCE. In this case, you can choose whether to use the DCE or the DCR endpoints for each of the clients that use the DCR.
+
+The following scenarios can currently use DCR endpoints. A DCE is required if private link is used.
+
+- [Logs ingestion API](../logs/logs-ingestion-api-overview.md).
+
+## Components of a DCE
 
 A data collection endpoint includes components required to ingest data into Azure Monitor and send configuration files to Azure Monitor Agent.
 
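The per-DCR `logsIngestion` and `metricsIngestion` endpoints mentioned in this article can be read from the DCR resource JSON returned by the ARM API. A sketch under the assumption that the endpoints sit under the DCR's `properties.endpoints` object, as the linked DCR structure article describes; the payload below is hypothetical, and real URLs are assigned by the service:

```python
import json

# Hypothetical (truncated) DCR resource payload; a real one comes from a GET
# on .../dataCollectionRules/<name> via the Azure Resource Manager API.
dcr_json = """
{
  "properties": {
    "endpoints": {
      "logsIngestion": "https://mydcr-abcd.eastus-1.ingest.monitor.azure.com",
      "metricsIngestion": "https://mydcr-abcd.eastus-1.metrics.ingest.monitor.azure.com"
    }
  }
}
"""

def ingestion_endpoints(payload: str) -> dict[str, str]:
    """Pull the per-DCR endpoints that can replace a DCE for direct ingestion."""
    props = json.loads(payload)["properties"]
    return props.get("endpoints", {})

eps = ingestion_endpoints(dcr_json)
print(eps["logsIngestion"])
```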

@@ -25,7 +33,7 @@ A data collection endpoint includes components required to ingest data into Azur
 This table describes the components of a data collection endpoint, related regionality considerations, and how to set up the data collection endpoint when you create a data collection rule using the portal:
 
 | Component | Description | Regionality considerations |Data collection rule configuration |
-|:---|:---|:---|
+|:---|:---|:---|:---|
 | Logs ingestion endpoint | The endpoint that ingests logs into the data ingestion pipeline. Azure Monitor transforms the data and sends it to the defined destination Log Analytics workspace and table based on a DCR ID sent with the collected data.<br>Example: `<unique-dce-identifier>.<regionname>-1.ingest`. |Same region as the destination Log Analytics workspace. |Set on the **Basics** tab when you create a data collection rule using the portal. |
 | Configuration access endpoint | The endpoint from which Azure Monitor Agent retrieves data collection rules (DCRs).<br>Example: `<unique-dce-identifier>.<regionname>-1.handler.control`. | Same region as the monitored resources. | Set on the **Resources** tab when you create a data collection rule using the portal.|
 
@@ -121,7 +129,6 @@ The sample data collection endpoint (DCE) below is for virtual machines with Azu
 
 - Data collection endpoints only support Log Analytics workspaces as a destination for collected data. [Custom metrics (preview)](../essentials/metrics-custom-overview.md) collected and uploaded via Azure Monitor Agent aren't currently controlled by DCEs.
 
-- Data collection endpoints are where [Logs ingestion API ingestion limits](../service-limits.md#logs-ingestion-api) are applied.
 
 ## Next steps
 - [Associate endpoints to machines](../agents/data-collection-rule-azure-monitor-agent.md#create-a-data-collection-rule)
