> There is no additional [pricing](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/) or [quota](../quotas-limits.md) for using Assistants unless you use the [code interpreter](../how-to/code-interpreter.md) tool.
The Assistants API is built on the same capabilities that power OpenAI's GPT products. Possible use cases include an AI-powered product recommender, a sales analyst app, a coding assistant, an employee Q&A chatbot, and more. Start building with the no-code Assistants playground in Azure OpenAI Studio, or start building with the API.
articles/ai-services/openai/faq.yml (+54 -3)
@@ -118,7 +118,7 @@ sections:
        answer:
          If the service performs processing, you will be charged even if the status code is not successful (not 200).
          Common examples of this are a 400 error due to a content filter or input limit, or a 408 error due to a timeout. Charges also occur when a `status 200` is received with a `finish_reason` of `content_filter`.
          In this case the prompt did not have any issues, but the completion generated by the model was detected to violate the content filtering rules, which results in the completion being filtered.
          If the service doesn't perform processing, you won't be charged.
          For example, a 401 error due to authentication or a 429 error due to exceeding the rate limit.
@@ -228,7 +228,58 @@ sections:
          What are the known limitations of GPT-4 Turbo with Vision?
        answer: |
          See the [limitations](./concepts/gpt-with-vision.md#limitations) section of the GPT-4 Turbo with Vision concepts guide.
  - name: Assistants
    questions:
      - question: |
          Do you store any data used in the Assistants API?
        answer: |
          Yes. Unlike the Chat Completions API, Azure OpenAI Assistants is a stateful API, meaning it retains data. There are two types of data stored in the Assistants API:
          * Stateful entities: threads, messages, and runs created during Assistants use.
          * Files: uploaded during Assistants setup or as part of a message.
      - question: |
          Where is this data stored?
        answer: |
          Data is stored in a secure, Microsoft-managed storage account that is logically separated.
      - question: |
          How long is this data stored?
        answer: |
          All used data persists in this system unless you explicitly delete it. Use the [delete function](./assistants-reference-threads.md) with the thread ID of the thread you want to delete. Clearing the run in the Assistants playground does not delete the thread; however, threads deleted with the delete function are no longer listed on the thread page.
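For illustration, a thread deletion is a `DELETE` call against the thread's ID. The sketch below only builds the request URL; the endpoint name, thread ID, and API version are hypothetical placeholders, so check the Assistants reference for the exact route your resource uses:

```python
# Sketch only: build the REST URL for deleting an Assistants thread.
# Endpoint, thread ID, and api-version are hypothetical placeholders.
def thread_delete_url(endpoint: str, thread_id: str,
                      api_version: str = "2024-02-15-preview") -> str:
    """Return the DELETE URL for a given Assistants thread."""
    return f"{endpoint}/openai/threads/{thread_id}?api-version={api_version}"

url = thread_delete_url("https://my-resource.openai.azure.com", "thread_abc123")
# You would then issue: requests.delete(url, headers={"api-key": "<your-key>"})
```

The same deletion is also exposed by the client SDKs, so in practice you would usually call the SDK's thread-delete method rather than building the URL by hand.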
      - question: |
          Can I bring my own data store to use with Assistants?
        answer: |
          No. Currently Assistants supports only local files uploaded to the Assistants-managed storage. You cannot use your private storage account with Assistants.
      - question: |
          Is my data used by Microsoft for training models?
        answer: |
          No. Your data is not used by Microsoft to train models. See the [Responsible AI documentation](/legal/cognitive-services/openai/data-privacy?context=%2Fazure%2Fai-services%2Fopenai%2Fcontext%2Fcontext) for more information.
      - question: |
          Where is data stored geographically?
        answer: |
          Azure OpenAI Assistants endpoints are regional, and data is stored in the same region as the endpoint. For more information, see the [Azure data residency documentation](https://azure.microsoft.com/explore/global-infrastructure/data-residency/#overview).
      - question: |
          How am I charged for Assistants?
        answer: |
          Currently, when you use the Assistants API, you're billed for the following:
          - Inference cost (input and output) of the base model you're using for each assistant (for example, gpt-4-0125). If you've created multiple assistants, you're charged for the base model attached to each assistant.
          - The Code Interpreter tool, if you've enabled it. For example, if your assistant calls Code Interpreter simultaneously in two different threads, two Code Interpreter sessions are created, each of which is charged. Each session is active by default for one hour, which means you pay this fee only once if your user keeps giving instructions to Code Interpreter in the same thread for up to one hour.
          For more information, see the [pricing page](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/).
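The Code Interpreter session accounting described above can be sketched as follows. This is an illustrative model of the stated rules only (one session per thread, active for one hour, simultaneous threads billed separately), not the service's actual billing logic:

```python
SESSION_SECONDS = 3600  # each Code Interpreter session is active for one hour

def billable_sessions(calls):
    """Count billable Code Interpreter sessions from (thread_id, unix_ts) calls.

    Calls in the same thread within an hour of that thread's session start
    share one session; simultaneous calls in different threads open separate
    sessions. Illustrative model only, not the service's billing code.
    """
    session_start = {}  # thread_id -> start time of its current session
    total = 0
    for thread_id, ts in sorted(calls, key=lambda c: c[1]):
        start = session_start.get(thread_id)
        if start is None or ts - start >= SESSION_SECONDS:
            session_start[thread_id] = ts  # a new session begins
            total += 1
    return total

# Two threads calling at the same time -> two sessions
print(billable_sessions([("t1", 0), ("t2", 0)]))  # 2
# Repeated instructions in one thread within the hour -> one session
print(billable_sessions([("t1", 0), ("t1", 1800), ("t1", 3000)]))  # 1
```

Under this model, a single user chatting in one thread for 45 minutes incurs one session fee, while the same conversation split across two threads incurs two.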
      - question: |
          Is there any additional pricing or quota for using Assistants?
        answer: |
          No. All [quotas](./quotas-limits.md) apply to using models with Assistants.
      - question: |
          Does the Assistants API support non-Azure OpenAI models?
        answer: |
          No. The Assistants API supports only Azure OpenAI models.
      - question: |
          Is the Assistants API generally available?
        answer: |
          The Assistants API is currently in public preview. Stay informed of our latest product updates by regularly visiting our [What's New](./whats-new.md) page.
      - question: |
          What are some examples or other resources I can use to learn about Assistants?
        answer: |
          See the [conceptual](./concepts/assistants.md), [quickstart](./assistants-quickstart.md), and [how-to](./how-to/assistant.md) articles for information on getting started and using Assistants. You can also check out Azure OpenAI Assistants code samples on [GitHub](https://github.com/Azure-Samples/azureai-samples/tree/main/scenarios/Assistants).
  - name: Web app
    questions:
      - question: |
@@ -260,7 +311,7 @@
      - question: |
          How can I customize or automate the index creation process?
        answer:
          You can prepare the index yourself using a [script provided on GitHub](https://go.microsoft.com/fwlink/?linkid=2244395). Using this script creates an Azure AI Search index with all the information needed to better use your data, with your documents broken down into manageable chunks. See the README file with the data preparation code for details on how to run it.
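The chunking that preparation step performs can be illustrated with a simplified sketch. The real GitHub script uses its own sizes and token-based logic, so the sizes and overlap below are placeholders for illustration only:

```python
def chunk_text(text: str, max_chars: int = 1024, overlap: int = 128):
    """Split a document into overlapping, manageable chunks for indexing.

    Simplified illustration of chunking for a search index; the actual
    data-preparation script uses its own chunk sizes and tokenization.
    """
    if max_chars <= overlap:
        raise ValueError("max_chars must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        start += max_chars - overlap  # slide forward, keeping some overlap
    return chunks

parts = chunk_text("word " * 500, max_chars=200, overlap=20)
```

Overlap between adjacent chunks helps a retrieval query match passages that would otherwise be cut in half at a chunk boundary.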
      - question: |
          How can I update my index?
        answer:
@@ -289,7 +340,7 @@
          If Semantic Search is enabled for my Azure AI Search resource, will it be automatically applied to Azure OpenAI on your data in the Azure OpenAI Studio?
        answer:
          When you select "Azure AI Search" as the data source, you can choose to apply semantic search.
          If you select "Azure Blob Container" or "Upload files" as the data source, you can create the index as usual. Afterwards, reingest the data using the "Azure AI Search" option to select the same index and apply semantic search. You will then be ready to chat on your data with semantic search applied.
      - question: |
          How can I add vector embeddings when indexing my data?
- The following Python libraries: os, requests, json, openai, azure-identity

## Assign role

Assign yourself either the [Cognitive Services OpenAI User](role-based-access-control.md#cognitive-services-openai-user) or [Cognitive Services OpenAI Contributor](role-based-access-control.md#cognitive-services-openai-contributor) role so that your account can make Azure OpenAI inference API calls rather than having to use key-based auth. After you make this change, it can take up to 5 minutes before it takes effect.
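Once the role is assigned, requests authenticate with a Microsoft Entra bearer token (obtained in practice via azure-identity, for example `DefaultAzureCredential`) instead of the `api-key` header. A minimal sketch of the header difference, with placeholder key and token values:

```python
from typing import Optional

def auth_headers(api_key: Optional[str] = None,
                 aad_token: Optional[str] = None) -> dict:
    """Build Azure OpenAI request headers for key-based or keyless auth.

    Keyless (Microsoft Entra) auth sends a standard bearer token instead of
    the api-key header. The key and token values here are placeholders.
    """
    if aad_token is not None:
        return {"Authorization": f"Bearer {aad_token}"}
    if api_key is not None:
        return {"api-key": api_key}
    raise ValueError("provide api_key or aad_token")

print(auth_headers(aad_token="<token-from-DefaultAzureCredential>"))
```

With the SDK you normally don't build headers yourself; you pass a credential or token provider to the client and it attaches the bearer token for you.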
articles/azure-functions/consumption-plan.md (+1 -1)
@@ -38,7 +38,7 @@ You can also create function apps in a Consumption plan when you publish a Funct
## Multiple apps in the same plan

The general recommendation is for each function app to have its own Consumption plan. However, if needed, function apps in the same region can be assigned to the same Consumption plan. Keep in mind that there is a [limit to the number of function apps that can run in a Consumption plan](functions-scale.md#service-limits). Function apps in the same plan still scale independently of each other.
articles/azure-monitor/agents/azure-monitor-agent-data-collection-endpoint.md (+2 -2)
@@ -35,7 +35,7 @@ Data Collection Endpoints public IP addresses are not part of the abovementioned
| Microsoft Azure operated by 21Vianet | Replace '.com' above with '.cn' | Same as above | Same as above | Same as above | Same as above |

>[!NOTE]
> If you use private links on the agent, you must **only** add the [private data collection endpoints (DCEs)](../essentials/data-collection-endpoint-overview.md#components-of-a-dce). The agent does not use the non-private endpoints listed above when using private links/data collection endpoints.
> The Azure Monitor Metrics (custom metrics) preview isn't available in Azure Government and Azure operated by 21Vianet clouds.
   (If you're using private links on the agent, you must also add the [data collection endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce).)
1. Add the **data ingestion endpoint URL** to the allowlist for the gateway
   (If you use private links on the agent, you must also add the [DCE endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce).)
- **Disk Space**: Required disk space can vary greatly depending upon how an agent is utilized or if the agent is unable to communicate with the destinations where it is instructed to send monitoring data. By default the agent requires 10 GB of disk space to run. The following provides guidance for capacity planning:

| Purpose | Environment | Path | Suggested Space |
   (If using private links on the agent, you must also add the [data collection endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce))
6. A data collection rule you want to associate with the devices. If it doesn't exist already, [create a data collection rule](./data-collection-rule-azure-monitor-agent.md#create-a-data-collection-rule). **Do not associate the rule with any resources yet**.
7. Before using any PowerShell cmdlet, ensure the related PowerShell module is installed and imported.
   (If using private links on the agent, you must also add the [DCE endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-dce))
3. Add the **data ingestion endpoint URL** to the allowlist for the gateway
articles/azure-monitor/essentials/data-collection-endpoint-overview.md (+12 -5)
@@ -12,11 +12,19 @@ ms.reviwer: nikeist
# Data collection endpoints in Azure Monitor

A data collection endpoint (DCE) is a connection where data sources send collected data for processing and ingestion into Azure Monitor. This article provides an overview of data collection endpoints and explains how to create and set them up based on your deployment.

## When is a DCE required?

Prior to March 31, 2024, a DCE was required for all data collection scenarios using a DCR that required an endpoint. Any DCR created after this date includes its own endpoints for logs and metrics. The URL for these endpoints can be found in the [`logsIngestion` and `metricsIngestion`](./data-collection-rule-structure.md#endpoints) properties of the DCR. These endpoints can be used instead of a DCE for any direct ingestion scenarios.

Endpoints cannot be added to an existing DCR, but you can keep using any existing DCRs with existing DCEs. If you want to move to a DCR endpoint, you must create a new DCR to replace the existing one. A DCR with endpoints can also use a DCE. In this case, you can choose whether to use the DCE or the DCR endpoints for each of the clients that use the DCR.

The following scenarios can currently use DCR endpoints. A DCE is required if private link is used.
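As a hypothetical illustration of where those properties live, a DCR created after that date exposes its ingestion URLs roughly like this (the property placement and URL shapes below are assumptions for illustration; see the DCR structure article for the authoritative schema):

```json
{
  "properties": {
    "endpoints": {
      "logsIngestion": "https://my-dcr-abc1.eastus-1.ingest.monitor.azure.com",
      "metricsIngestion": "https://my-dcr-abc1.eastus-1.metrics.ingest.monitor.azure.com"
    }
  }
}
```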
A data collection endpoint includes components required to ingest data into Azure Monitor and send configuration files to Azure Monitor Agent.
@@ -25,7 +33,7 @@ A data collection endpoint includes components required to ingest data into Azur
This table describes the components of a data collection endpoint, related regionality considerations, and how to set up the data collection endpoint when you create a data collection rule using the portal:
| Logs ingestion endpoint | The endpoint that ingests logs into the data ingestion pipeline. Azure Monitor transforms the data and sends it to the defined destination Log Analytics workspace and table based on a DCR ID sent with the collected data.<br>Example: `<unique-dce-identifier>.<regionname>-1.ingest`. |Same region as the destination Log Analytics workspace. |Set on the **Basics** tab when you create a data collection rule using the portal. |
| Configuration access endpoint | The endpoint from which Azure Monitor Agent retrieves data collection rules (DCRs).<br>Example: `<unique-dce-identifier>.<regionname>-1.handler.control`. | Same region as the monitored resources. | Set on the **Resources** tab when you create a data collection rule using the portal.|
@@ -121,7 +129,6 @@ The sample data collection endpoint (DCE) below is for virtual machines with Azu
- Data collection endpoints only support Log Analytics workspaces as a destination for collected data. [Custom metrics (preview)](../essentials/metrics-custom-overview.md) collected and uploaded via Azure Monitor Agent aren't currently controlled by DCEs.
## Next steps
- [Associate endpoints to machines](../agents/data-collection-rule-azure-monitor-agent.md#create-a-data-collection-rule)