articles/ai-services/content-moderator/includes/tool-deprecation.md (1 addition, 1 deletion)

@@ -13,7 +13,7 @@ ms.author: pafarley
> [!IMPORTANT]
-> Azure Content Moderator is being deprecated in February 2024, and will be retired by February 2027. It is being replaced by [Azure AI Content Safety](/azure/ai-services/content-safety/overview), which offers advanced AI features and enhanced performance.
+> Azure Content Moderator is deprecated as of February 2024 and will be retired by February 2027. It is replaced by [Azure AI Content Safety](/azure/ai-services/content-safety/overview), which offers advanced AI features and enhanced performance.
>
> Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers. Here's an overview of its features and capabilities:

articles/ai-services/content-moderator/index.yml (1 addition, 1 deletion)

@@ -1,7 +1,7 @@
### YamlMime:Landing
title: Content Moderator documentation # < 60 chars
-summary: "The Azure Content Moderator API checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. Content Moderator is being deprecated in February 2024, and will be retired by February 2027. It is being replaced by Azure AI Content Safety, which offers advanced AI features and enhanced performance. Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers."# < 160 chars
+summary: "Azure Content Moderator is deprecated as of February 2024 and will be retired by February 2027. It is replaced by Azure AI Content Safety, which offers advanced AI features and enhanced performance. Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers."# < 160 chars
metadata:
title: Content Moderator Documentation - Quickstarts, Tutorials, API Reference - Azure AI services | Microsoft Docs

articles/ai-services/openai/assistants-reference-runs.md (113 additions, 1 deletion)

@@ -5,7 +5,7 @@ description: Learn how to use Azure OpenAI's Python & REST API runs with Assista
manager: nitinme
ms.service: azure-ai-openai
ms.topic: conceptual
-ms.date: 02/01/2024
+ms.date: 04/16/2024
author: mrbullwinkle
ms.author: mbullwin
recommendations: false
@@ -585,3 +585,115 @@ Represent a step in execution of a run.
|`failed_at`| integer or null | The Unix timestamp (in seconds) for when the run step failed.|
|`completed_at`| integer or null | The Unix timestamp (in seconds) for when the run step completed.|
|`metadata`| map | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.|

## Stream a run result (preview)

Stream the result of executing a Run or resuming a Run after submitting tool outputs. To stream a result, pass `"stream": true` when you create the run or when you submit tool outputs. The response is a [Server-Sent Events](https://html.spec.whatwg.org/multipage/server-sent-events.html#server-sent-events) stream.
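
As a minimal REST sketch (not from this reference: the endpoint path, `api-version`, and the thread and assistant IDs below are illustrative assumptions), a streaming run can be created with Python `requests` and the raw Server-Sent Events printed as they arrive:

```python
import os

import requests

endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]  # for example, https://my-resource.openai.azure.com
thread_id = "thread_abc123"    # placeholder: an existing thread
assistant_id = "asst_abc123"   # placeholder: an existing assistant

# Create the run with "stream": true; the URL shape and api-version are
# assumptions based on the preview Assistants REST surface.
response = requests.post(
    f"{endpoint}/openai/threads/{thread_id}/runs?api-version=2024-02-15-preview",
    headers={"api-key": os.environ["AZURE_OPENAI_API_KEY"]},
    json={"assistant_id": assistant_id, "stream": True},
    stream=True,
)

# Each Server-Sent Event arrives as "event: <name>" and "data: <json>" lines.
for line in response.iter_lines():
    if line:
        print(line.decode("utf-8"))
```
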
### Streaming example

```python
from typing_extensions import override
from openai import AssistantEventHandler

# First, we create an EventHandler class to define
# how we want to handle the events in the response stream.
class EventHandler(AssistantEventHandler):
    @override
    def on_text_delta(self, delta, snapshot) -> None:
        # Print each streamed text fragment (delta) as it arrives.
        print(delta.value, end="", flush=True)
```
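
The rest of the original example is not shown in this diff. The following is a minimal usage sketch under assumed values (the `api_version`, model deployment name, and prompt are placeholders), showing how the handler above can be passed to the SDK's streaming helper:

```python
import os

from openai import AzureOpenAI

# Client setup; the API version shown is illustrative.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-15-preview",
)

# Create an assistant, a thread, and a user message to run against.
assistant = client.beta.assistants.create(
    model="gpt-4-1106-preview",  # assumption: your model deployment name
    instructions="You are a helpful assistant.",
)
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="Tell me about streaming runs."
)

# The stream helper sends the run request with streaming enabled and
# dispatches each event to the EventHandler defined above until the run finishes.
with client.beta.threads.runs.stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
    event_handler=EventHandler(),
) as stream:
    stream.until_done()
```
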
Events are emitted whenever a new object is created, transitions to a new state, or is being streamed in parts (deltas). For example, `thread.run.created` is emitted when a new run is created, `thread.run.completed` when a run completes, and so on. When an Assistant chooses to create a message during a run, we emit a `thread.message.created` event, a `thread.message.in_progress` event, many `thread.message.delta` events, and finally a `thread.message.completed` event.

|Name | Type | Description |
|--- |--- |--- |
|`thread.created`|`data` is a thread. | Occurs when a new thread is created. |
|`thread.run.created`|`data` is a run. | Occurs when a new run is created. |
|`thread.run.queued`|`data` is a run. | Occurs when a run moves to a `queued` status. |
|`thread.run.in_progress`|`data` is a run. | Occurs when a run moves to an `in_progress` status. |
|`thread.run.requires_action`|`data` is a run. | Occurs when a run moves to a `requires_action` status. |
|`thread.run.completed`|`data` is a run. | Occurs when a run is completed. |
|`thread.run.failed`|`data` is a run. | Occurs when a run fails. |
|`thread.run.cancelling`|`data` is a run. | Occurs when a run moves to a `cancelling` status. |
|`thread.run.cancelled`|`data` is a run. | Occurs when a run is cancelled. |
|`thread.run.expired`|`data` is a run. | Occurs when a run expires. |
|`thread.run.step.created`|`data` is a run step. | Occurs when a run step is created. |
|`thread.run.step.in_progress`|`data` is a run step. | Occurs when a run step moves to an `in_progress` state. |
|`thread.run.step.delta`|`data` is a run step delta. | Occurs when parts of a run step are being streamed. |
|`thread.run.step.completed`|`data` is a run step. | Occurs when a run step is completed. |
|`thread.run.step.failed`|`data` is a run step. | Occurs when a run step fails. |
|`thread.run.step.cancelled`|`data` is a run step. | Occurs when a run step is cancelled. |
|`thread.run.step.expired`|`data` is a run step. | Occurs when a run step expires. |
|`thread.message.created`|`data` is a message. | Occurs when a message is created. |
|`thread.message.in_progress`|`data` is a message. | Occurs when a message moves to an `in_progress` state. |
|`thread.message.delta`|`data` is a message delta. | Occurs when parts of a message are being streamed. |
|`thread.message.completed`|`data` is a message. | Occurs when a message is completed. |
|`thread.message.incomplete`|`data` is a message. | Occurs when a message ends before it is completed. |
|`error`|`data` is an error. | Occurs when an error occurs. This can happen due to an internal server error or a timeout. |
|`done`|`data` is `[DONE]`| Occurs when a stream ends. |
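
As an illustration of how these events surface in the Python SDK used in the streaming example above (the `RunEventLogger` name is hypothetical; the generic `on_event` hook and the `event.event`/`event.data` attributes come from the `openai` package, not from this table), a handler can log every event name listed here:

```python
from typing_extensions import override
from openai import AssistantEventHandler

class RunEventLogger(AssistantEventHandler):
    """Logs every streamed event by the names listed in the table above."""

    @override
    def on_event(self, event) -> None:
        # event.event holds the event name, for example "thread.run.completed";
        # event.data holds the associated object (run, run step, message, or delta).
        print(f"received {event.event}")
        if event.event == "thread.run.failed":
            # Failed runs carry error details on the run object.
            print(event.data.last_error)
```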

articles/ai-services/openai/concepts/use-your-data.md (28 additions, 45 deletions)

@@ -62,59 +62,15 @@ The [Integrated Vector Database in vCore-based Azure Cosmos DB for MongoDB](/azu
For some data sources such as uploading files from your local machine (preview) or data contained in a blob storage account (preview), Azure AI Search is used. When you choose the following data sources, your data is ingested into an Azure AI Search index.

->[!TIP]
->If you use Azure Cosmos DB (except for its vCore-based API for MongoDB), you may be eligible for the [Azure AI Advantage offer](/azure/cosmos-db/ai-advantage), which provides the equivalent of up to $6,000 in Azure Cosmos DB throughput credits.
-
-|Data source | Description |
+|Data ingested through Azure AI Search | Description |
|---------|---------|
|[Azure AI Search](/azure/search/search-what-is-azure-search)| Use an existing Azure AI Search index with Azure OpenAI On Your Data. |
-|[Azure Cosmos DB](/azure/cosmos-db/introduction)| Azure Cosmos DB's API for Postgres and vCore-based API for MongoDB offer natively integrated vector indexing; therefore, they don't require Azure AI Search. However, its other APIs do require Azure AI Search for vector indexing. Azure Cosmos DB for NoSQL's natively integrated vector database debuts in mid-2024. |
|Upload files (preview) | Upload files from your local machine to be stored in an Azure Blob Storage database, and ingested into Azure AI Search. |
|URL/Web address (preview) | Web content from the URLs is stored in Azure Blob Storage. |
|Azure Blob Storage (preview) | Upload files from Azure Blob Storage to be ingested into an Azure AI Search index. |

:::image type="content" source="../media/use-your-data/azure-databases-and-ai-search.png" lightbox="../media/use-your-data/azure-databases-and-ai-search.png" alt-text="Diagram of vector indexing services.":::

-# [Vector Database in Azure Cosmos DB for MongoDB](#tab/mongo-db)
-
-### Prerequisites
-*[vCore-based Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/introduction) account
-* A deployed [embedding model](../concepts/understand-embeddings.md)
-
-### Limitations
-* Only vCore-based Azure Cosmos DB for MongoDB is supported.
-* The search type is limited to [Integrated Vector Database in Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/vector-search) with an Azure OpenAI embedding model.
-* This implementation works best on unstructured and spatial data.
-
-### Data preparation
-
-Use the script provided on [GitHub](https://github.com/microsoft/sample-app-aoai-chatGPT/tree/main/scripts#data-preparation) to prepare your data.
-
-<!--### Add your data source in Azure OpenAI Studio
-
-To add vCore-based Azure Cosmos DB for MongoDB as a data source, you'll need an existing Azure Cosmos DB for MongoDB index containing your data, and a deployed Azure OpenAI Ada embeddings model that will be used for vector search.
-
-1. In the [Azure OpenAI portal](https://oai.azure.com/portal) chat playground, select **Add your data**. In the panel that appears, select ** vCore-based Azure Cosmos DB for MongoDB** as the data source.
-1. Select your Azure subscription and database account, then connect to your Azure Cosmos DB account by providing your Azure Cosmos DB account username and password.
-
-:::image type="content" source="../media/use-your-data/add-mongo-data-source.png" alt-text="A screenshot showing the screen for adding Mongo DB as a data source in Azure OpenAI Studio." lightbox="../media/use-your-data/add-mongo-data-source.png":::
-
-1. **Select Database**. In the dropdown menus, select the database name, database collection, and index name that you want to use as your data source. Select the embedding model deployment you would like to use for vector search on this data source, and acknowledge that you'll incur charges for using vector search. Then select **Next**.
-
-:::image type="content" source="../media/use-your-data/select-mongo-database.png" alt-text="A screenshot showing the screen for adding Mongo DB settings in Azure OpenAI Studio." lightbox="../media/use-your-data/select-mongo-database.png":::
--->
-
-### Index field mapping
-
-When you add your vCore-based Azure Cosmos DB for MongoDB data source, you can specify data fields to properly map your data for retrieval.
-
-* Content data (required): One or more provided fields to be used to ground the model on your data. For multiple fields, separate the values with commas, with no spaces.
-* File name/title/URL: Used to display more information when a document is referenced in the chat.
-* Vector fields (required): Select the field in your database that contains the vectors.
-
-:::image type="content" source="../media/use-your-data/mongo-index-mapping.png" alt-text="A screenshot showing the index field mapping options for Mongo DB." lightbox="../media/use-your-data/mongo-index-mapping.png":::

# [Azure AI Search](#tab/ai-search)
You might want to consider using an Azure AI Search index when you either want to:
@@ -179,6 +135,33 @@ If you want to implement additional value-based criteria for query execution, yo
# [Vector Database in Azure Cosmos DB for MongoDB](#tab/mongo-db)

### Prerequisites
*[vCore-based Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/introduction) account
* A deployed [embedding model](../concepts/understand-embeddings.md)

### Limitations
* Only vCore-based Azure Cosmos DB for MongoDB is supported.
* The search type is limited to [Integrated Vector Database in Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/vector-search) with an Azure OpenAI embedding model.
* This implementation works best on unstructured and spatial data.

### Data preparation

Use the script provided on [GitHub](https://github.com/microsoft/sample-app-aoai-chatGPT/tree/main/scripts#data-preparation) to prepare your data.

### Index field mapping

When you add your vCore-based Azure Cosmos DB for MongoDB data source, you can specify data fields to properly map your data for retrieval.

* Content data (required): One or more provided fields to be used to ground the model on your data. For multiple fields, separate the values with commas, with no spaces.
* File name/title/URL: Used to display more information when a document is referenced in the chat.
* Vector fields (required): Select the field in your database that contains the vectors.

:::image type="content" source="../media/use-your-data/mongo-index-mapping.png" alt-text="A screenshot showing the index field mapping options for Mongo DB." lightbox="../media/use-your-data/mongo-index-mapping.png":::
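
For illustration only, the sketch below shows how such a field mapping might be passed to a chat completions call through the `openai` SDK's `extra_body` parameter. The `data_sources` payload shape, parameter names, and all values here are assumptions meant to mirror the mapping options above, not a verbatim API reference:

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-15-preview",  # illustrative preview version
)

completion = client.chat.completions.create(
    model="gpt-4",  # assumption: your chat model deployment name
    messages=[{"role": "user", "content": "What do my documents say about indexing?"}],
    # The payload below is a sketch; field and parameter names are assumptions.
    extra_body={
        "data_sources": [
            {
                "type": "azure_cosmos_db",
                "parameters": {
                    "database_name": "my-database",
                    "container_name": "my-collection",
                    "index_name": "my-vector-index",
                    "fields_mapping": {
                        "content_fields": ["content"],        # Content data (required)
                        "title_field": "title",               # File name/title/URL
                        "vector_fields": ["contentVector"],   # Vector fields (required)
                    },
                },
            }
        ]
    },
)
print(completion.choices[0].message.content)
```
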
You might want to use Azure Blob Storage as a data source if you want to connect to existing Azure Blob Storage and use files stored in your containers.

articles/ai-services/openai/concepts/use-your-image-data.md (2 additions, 2 deletions)

@@ -8,7 +8,7 @@ ms.service: azure-ai-openai
ms.topic: quickstart
author: aahill
ms.author: aahi
-ms.date: 11/02/2023
+ms.date: 05/09/2024
recommendations: false
---
@@ -17,7 +17,7 @@ recommendations: false
Use this article to learn how to provide your own image data for GPT-4 Turbo with Vision, Azure OpenAI’s vision model. GPT-4 Turbo with Vision on your data allows the model to generate more customized and targeted answers using Retrieval Augmented Generation based on your own images and image metadata.
> [!IMPORTANT]
-> This article is for using your data on the GPT-4 Turbo with Vision model. If you are interested in using your data for text-based models, see [Use your text data](./use-your-data.md).
+> Once the GPT-4 Turbo with Vision preview model is deprecated, you will no longer be able to use Azure OpenAI on your image data. To implement a Retrieval Augmented Generation (RAG) solution with image data, see the sample on [GitHub](https://github.com/Azure-Samples/azure-search-openai-demo/).