Commit 534fd7a

Learn Build Service GitHub App authored and committed
Merging changes synced from https://github.com/MicrosoftDocs/azure-docs-pr (branch live)
2 parents 6166004 + 176cdfd commit 534fd7a

File tree

122 files changed, with 3,132 additions and 861 deletions


.openpublishing.redirection.json

Lines changed: 11 additions & 0 deletions
@@ -3430,6 +3430,12 @@
     "redirect_url": "/azure/ai-services/anomaly-detector/quickstarts/detect-data-anomalies-python",
     "redirect_document_id": false
   },
+  {
+    "source_path_from_root": "/articles/ai-studio/how-to/model-catalog.md",
+    "redirect_url": "/azure/ai-studio/how-to/model-catalog-overview",
+    "redirect_document_id": false
+  },
+
   {
     "source_path_from_root": "/articles/service-fabric/upgrade-managed-disks.md",
     "redirect_url": "/azure/service-fabric/service-fabric-scale-up-primary-node-type",
@@ -4024,6 +4030,11 @@
     "source_path_from_root":"/articles/aks/generation-2-vm-windows.md",
     "redirect_url":"/azure/aks/generation-2-vm",
     "redirect_document_id":false
+  },
+  {
+    "source_path_from_root":"/articles/cosmos-db/high-availability.md",
+    "redirect_url":"/azure/reliability/reliability-cosmos-db-nosql.md",
+    "redirect_document_id":false
   }
 ]
}
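Each redirect entry added in this commit follows the same three-field shape used throughout the file. As a minimal sketch (a hypothetical validator, not part of the Learn build service, which performs its own checks), the shape can be verified like this:

```python
import json

# Hypothetical helper: checks that a redirect entry has the three
# fields used throughout .openpublishing.redirection.json.
REQUIRED_KEYS = {"source_path_from_root", "redirect_url", "redirect_document_id"}

def validate_redirect(entry: dict) -> bool:
    """Return True if the entry matches the expected redirect shape."""
    return (
        REQUIRED_KEYS <= entry.keys()
        and entry["source_path_from_root"].startswith("/articles/")
        and entry["redirect_url"].startswith("/azure/")
        and isinstance(entry["redirect_document_id"], bool)
    )

# One of the entries added in this commit.
entry = json.loads("""{
    "source_path_from_root": "/articles/ai-studio/how-to/model-catalog.md",
    "redirect_url": "/azure/ai-studio/how-to/model-catalog-overview",
    "redirect_document_id": false
}""")
print(validate_redirect(entry))  # True
```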

articles/ai-services/content-moderator/includes/tool-deprecation.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ ms.author: pafarley
 
 
 > [!IMPORTANT]
-> Azure Content Moderator is being deprecated in February 2024, and will be retired by February 2027. It is being replaced by [Azure AI Content Safety](/azure/ai-services/content-safety/overview), which offers advanced AI features and enhanced performance.
+> Azure Content Moderator is deprecated as of February 2024 and will be retired by February 2027. It is replaced by [Azure AI Content Safety](/azure/ai-services/content-safety/overview), which offers advanced AI features and enhanced performance.
 >
 > Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers. Here's an overview of its features and capabilities:
 >

articles/ai-services/content-moderator/index.yml

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 ### YamlMime:Landing
 
 title: Content Moderator documentation # < 60 chars
-summary: "The Azure Content Moderator API checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. Content Moderator is being deprecated in February 2024, and will be retired by February 2027. It is being replaced by Azure AI Content Safety, which offers advanced AI features and enhanced performance. Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers." # < 160 chars
+summary: "Azure Content Moderator is deprecated as of February 2024 and will be retired by February 2027. It is replaced by Azure AI Content Safety, which offers advanced AI features and enhanced performance. Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers." # < 160 chars
 
 metadata:
   title: Content Moderator Documentation - Quickstarts, Tutorials, API Reference - Azure AI services | Microsoft Docs

articles/ai-services/openai/assistants-reference-runs.md

Lines changed: 113 additions & 1 deletion
@@ -5,7 +5,7 @@ description: Learn how to use Azure OpenAI's Python & REST API runs with Assista
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: conceptual
-ms.date: 02/01/2024
+ms.date: 04/16/2024
 author: mrbullwinkle
 ms.author: mbullwin
 recommendations: false
@@ -585,3 +585,115 @@ Represent a step in execution of a run.
 | `failed_at`| integer or null | The Unix timestamp (in seconds) for when the run step failed.|
 | `completed_at`| integer or null | The Unix timestamp (in seconds) for when the run step completed.|
 | `metadata`| map | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.|
+
+## Stream a run result (preview)
+
+Stream the result of executing a run, or of resuming a run after submitting tool outputs. You can stream events after:
+* [Create Thread and Run](#create-thread-and-run)
+* [Create Run](#create-run)
+* [Submit Tool Outputs](#submit-tool-outputs-to-run)
+
+To stream a result, pass `"stream": true` when you create a run. The response is a [server-sent events](https://html.spec.whatwg.org/multipage/server-sent-events.html#server-sent-events) stream.
+
+### Streaming example
+
+```python
+from typing_extensions import override
+from openai import AssistantEventHandler
+
+# First, create an EventHandler class to define
+# how to handle the events in the response stream.
+
+class EventHandler(AssistantEventHandler):
+    @override
+    def on_text_created(self, text) -> None:
+        print("\nassistant > ", end="", flush=True)
+
+    @override
+    def on_text_delta(self, delta, snapshot):
+        print(delta.value, end="", flush=True)
+
+    def on_tool_call_created(self, tool_call):
+        print(f"\nassistant > {tool_call.type}\n", flush=True)
+
+    def on_tool_call_delta(self, delta, snapshot):
+        if delta.type == 'code_interpreter':
+            if delta.code_interpreter.input:
+                print(delta.code_interpreter.input, end="", flush=True)
+            if delta.code_interpreter.outputs:
+                print("\n\noutput >", flush=True)
+                for output in delta.code_interpreter.outputs:
+                    if output.type == "logs":
+                        print(f"\n{output.logs}", flush=True)
+
+# Then, use the `stream` SDK helper with the `EventHandler`
+# class to create the run and stream the response. The
+# `client`, `thread`, and `assistant` objects are assumed
+# to have been created earlier.
+
+with client.beta.threads.runs.stream(
+    thread_id=thread.id,
+    assistant_id=assistant.id,
+    instructions="Please address the user as Jane Doe. The user has a premium account.",
+    event_handler=EventHandler(),
+) as stream:
+    stream.until_done()
+```
+
+## Message delta object
+
+Represents a message delta, such as any changed fields on a message during streaming.
+
+|Name | Type | Description |
+|--- |--- |--- |
+| `id` | string | The identifier of the message, which can be referenced in API endpoints. |
+| `object` | string | The object type, which is always `thread.message.delta`. |
+| `delta` | object | The delta containing the fields that have changed on the message. |
+
+## Run step delta object
+
+Represents a run step delta, such as any changed fields on a run step during streaming.
+
+|Name | Type | Description |
+|--- |--- |--- |
+| `id` | string | The identifier of the run step, which can be referenced in API endpoints. |
+| `object` | string | The object type, which is always `thread.run.step.delta`. |
+| `delta` | object | The delta containing the fields that have changed on the run step. |
+
+## Assistant stream events
+
+Represents an event emitted when streaming a run. Each event in a server-sent events stream has an `event` and a `data` property:
+
+```json
+event: thread.created
+data: {"id": "thread_123", "object": "thread", ...}
+```
+
+Events are emitted whenever a new object is created, transitions to a new state, or is streamed in parts (deltas). For example, `thread.run.created` is emitted when a new run is created, and `thread.run.completed` when a run completes. When an assistant chooses to create a message during a run, it emits a `thread.message.created` event, a `thread.message.in_progress` event, many `thread.message.delta` events, and finally a `thread.message.completed` event.
+
+|Name | Type | Description |
+|--- |--- |--- |
+| `thread.created` | `data` is a thread. | Occurs when a new thread is created. |
+| `thread.run.created` | `data` is a run. | Occurs when a new run is created. |
+| `thread.run.queued` | `data` is a run. | Occurs when a run moves to a `queued` status. |
+| `thread.run.in_progress` | `data` is a run. | Occurs when a run moves to an `in_progress` status. |
+| `thread.run.requires_action` | `data` is a run. | Occurs when a run moves to a `requires_action` status. |
+| `thread.run.completed` | `data` is a run. | Occurs when a run is completed. |
+| `thread.run.failed` | `data` is a run. | Occurs when a run fails. |
+| `thread.run.cancelling` | `data` is a run. | Occurs when a run moves to a `cancelling` status. |
+| `thread.run.cancelled` | `data` is a run. | Occurs when a run is cancelled. |
+| `thread.run.expired` | `data` is a run. | Occurs when a run expires. |
+| `thread.run.step.created` | `data` is a run step. | Occurs when a run step is created. |
+| `thread.run.step.in_progress` | `data` is a run step. | Occurs when a run step moves to an `in_progress` state. |
+| `thread.run.step.delta` | `data` is a run step delta. | Occurs when parts of a run step are streamed. |
+| `thread.run.step.completed` | `data` is a run step. | Occurs when a run step is completed. |
+| `thread.run.step.failed` | `data` is a run step. | Occurs when a run step fails. |
+| `thread.run.step.cancelled` | `data` is a run step. | Occurs when a run step is cancelled. |
+| `thread.run.step.expired` | `data` is a run step. | Occurs when a run step expires. |
+| `thread.message.created` | `data` is a message. | Occurs when a message is created. |
+| `thread.message.in_progress` | `data` is a message. | Occurs when a message moves to an `in_progress` state. |
+| `thread.message.delta` | `data` is a message delta. | Occurs when parts of a message are streamed. |
+| `thread.message.completed` | `data` is a message. | Occurs when a message is completed. |
+| `thread.message.incomplete` | `data` is a message. | Occurs when a message ends before it is completed. |
+| `error` | `data` is an error. | Occurs when an error occurs, such as an internal server error or a timeout. |
+| `done` | `data` is `[DONE]`. | Occurs when a stream ends. |
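The `event:`/`data:` line pairs shown above can be consumed without the SDK by pairing the two lines of each event. The following is a minimal sketch under that assumption (a hypothetical parser; real clients should use an SSE library or the SDK helper):

```python
import json

# Minimal sketch: pair `event:` and `data:` lines from a raw
# server-sent events stream, as emitted when a run is created
# with "stream": true. A blank line terminates each event.
def parse_sse(lines):
    """Yield (event_name, payload) tuples from raw SSE lines."""
    event = None
    for line in lines:
        if line.startswith("event: "):
            event = line[len("event: "):]
        elif line.startswith("data: "):
            data = line[len("data: "):]
            # The terminal `done` event carries the literal [DONE].
            payload = data if data == "[DONE]" else json.loads(data)
            yield event, payload
        elif line == "":
            event = None

raw = [
    "event: thread.created",
    'data: {"id": "thread_123", "object": "thread"}',
    "",
    "event: done",
    "data: [DONE]",
]
events = list(parse_sse(raw))
print(events[0][0])  # thread.created
```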

articles/ai-services/openai/concepts/use-your-data.md

Lines changed: 28 additions & 45 deletions
@@ -62,59 +62,15 @@ The [Integrated Vector Database in vCore-based Azure Cosmos DB for MongoDB](/azu
 
 For some data sources such as uploading files from your local machine (preview) or data contained in a blob storage account (preview), Azure AI Search is used. When you choose the following data sources, your data is ingested into an Azure AI Search index.
 
->[!TIP]
->If you use Azure Cosmos DB (except for its vCore-based API for MongoDB), you may be eligible for the [Azure AI Advantage offer](/azure/cosmos-db/ai-advantage), which provides the equivalent of up to $6,000 in Azure Cosmos DB throughput credits.
-
-|Data source | Description |
+|Data ingested through Azure AI Search | Description |
 |---------|---------|
 | [Azure AI Search](/azure/search/search-what-is-azure-search) | Use an existing Azure AI Search index with Azure OpenAI On Your Data. |
-| [Azure Cosmos DB](/azure/cosmos-db/introduction) | Azure Cosmos DB's API for Postgres and vCore-based API for MongoDB offer natively integrated vector indexing; therefore, they don't require Azure AI Search. However, its other APIs do require Azure AI Search for vector indexing. Azure Cosmos DB for NoSQL's natively integrated vector database debuts in mid-2024. |
 |Upload files (preview) | Upload files from your local machine to be stored in an Azure Blob Storage database, and ingested into Azure AI Search. |
 |URL/Web address (preview) | Web content from the URLs is stored in Azure Blob Storage. |
 |Azure Blob Storage (preview) | Upload files from Azure Blob Storage to be ingested into an Azure AI Search index. |
 
 :::image type="content" source="../media/use-your-data/azure-databases-and-ai-search.png" lightbox="../media/use-your-data/azure-databases-and-ai-search.png" alt-text="Diagram of vector indexing services.":::
 
-# [Vector Database in Azure Cosmos DB for MongoDB](#tab/mongo-db)
-
-### Prerequisites
-* [vCore-based Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/introduction) account
-* A deployed [embedding model](../concepts/understand-embeddings.md)
-
-### Limitations
-* Only vCore-based Azure Cosmos DB for MongoDB is supported.
-* The search type is limited to [Integrated Vector Database in Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/vector-search) with an Azure OpenAI embedding model.
-* This implementation works best on unstructured and spatial data.
-
-
-### Data preparation
-
-Use the script provided on [GitHub](https://github.com/microsoft/sample-app-aoai-chatGPT/tree/main/scripts#data-preparation) to prepare your data.
-
-<!--### Add your data source in Azure OpenAI Studio
-
-To add vCore-based Azure Cosmos DB for MongoDB as a data source, you'll need an existing Azure Cosmos DB for MongoDB index containing your data, and a deployed Azure OpenAI Ada embeddings model that will be used for vector search.
-
-1. In the [Azure OpenAI portal](https://oai.azure.com/portal) chat playground, select **Add your data**. In the panel that appears, select ** vCore-based Azure Cosmos DB for MongoDB** as the data source.
-1. Select your Azure subscription and database account, then connect to your Azure Cosmos DB account by providing your Azure Cosmos DB account username and password.
-
-    :::image type="content" source="../media/use-your-data/add-mongo-data-source.png" alt-text="A screenshot showing the screen for adding Mongo DB as a data source in Azure OpenAI Studio." lightbox="../media/use-your-data/add-mongo-data-source.png":::
-
-1. **Select Database**. In the dropdown menus, select the database name, database collection, and index name that you want to use as your data source. Select the embedding model deployment you would like to use for vector search on this data source, and acknowledge that you'll incur charges for using vector search. Then select **Next**.
-
-    :::image type="content" source="../media/use-your-data/select-mongo-database.png" alt-text="A screenshot showing the screen for adding Mongo DB settings in Azure OpenAI Studio." lightbox="../media/use-your-data/select-mongo-database.png":::
--->
-
-### Index field mapping
-
-When you add your vCore-based Azure Cosmos DB for MongoDB data source, you can specify data fields to properly map your data for retrieval.
-
-* Content data (required): One or more provided fields to be used to ground the model on your data. For multiple fields, separate the values with commas, with no spaces.
-* File name/title/URL: Used to display more information when a document is referenced in the chat.
-* Vector fields (required): Select the field in your database that contains the vectors.
-
-:::image type="content" source="../media/use-your-data/mongo-index-mapping.png" alt-text="A screenshot showing the index field mapping options for Mongo DB." lightbox="../media/use-your-data/mongo-index-mapping.png":::
-
 # [Azure AI Search](#tab/ai-search)
 
 You might want to consider using an Azure AI Search index when you either want to:
@@ -179,6 +135,33 @@ If you want to implement additional value-based criteria for query execution, yo
 
 [!INCLUDE [ai-search-ingestion](../includes/ai-search-ingestion.md)]
 
+
+# [Vector Database in Azure Cosmos DB for MongoDB](#tab/mongo-db)
+
+### Prerequisites
+* [vCore-based Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/introduction) account
+* A deployed [embedding model](../concepts/understand-embeddings.md)
+
+### Limitations
+* Only vCore-based Azure Cosmos DB for MongoDB is supported.
+* The search type is limited to [Integrated Vector Database in Azure Cosmos DB for MongoDB](/azure/cosmos-db/mongodb/vcore/vector-search) with an Azure OpenAI embedding model.
+* This implementation works best on unstructured and spatial data.
+
+
+### Data preparation
+
+Use the script provided on [GitHub](https://github.com/microsoft/sample-app-aoai-chatGPT/tree/main/scripts#data-preparation) to prepare your data.
+
+### Index field mapping
+
+When you add your vCore-based Azure Cosmos DB for MongoDB data source, you can specify data fields to properly map your data for retrieval.
+
+* Content data (required): One or more provided fields to be used to ground the model on your data. For multiple fields, separate the values with commas, with no spaces.
+* File name/title/URL: Used to display more information when a document is referenced in the chat.
+* Vector fields (required): Select the field in your database that contains the vectors.
+
+:::image type="content" source="../media/use-your-data/mongo-index-mapping.png" alt-text="A screenshot showing the index field mapping options for Mongo DB." lightbox="../media/use-your-data/mongo-index-mapping.png":::
+
 # [Azure Blob Storage (preview)](#tab/blob-storage)
 
 You might want to use Azure Blob Storage as a data source if you want to connect to existing Azure Blob Storage and use files stored in your containers.
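The index field mapping section above requires multiple content fields to be "separated with commas, with no spaces". A minimal sketch of that rule (a hypothetical helper for illustration, not part of the Azure OpenAI service):

```python
# Hypothetical parser for the comma-separated content-field value
# described in the index field mapping section: multiple fields,
# commas, no spaces.
def parse_content_fields(value: str) -> list[str]:
    """Split a comma-separated field list, rejecting spaces and empties."""
    fields = value.split(",")
    if any(f != f.strip() or not f for f in fields):
        raise ValueError("fields must be comma-separated with no spaces")
    return fields

print(parse_content_fields("title,description,body"))
```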

articles/ai-services/openai/concepts/use-your-image-data.md

Lines changed: 2 additions & 2 deletions
@@ -8,7 +8,7 @@ ms.service: azure-ai-openai
 ms.topic: quickstart
 author: aahill
 ms.author: aahi
-ms.date: 11/02/2023
+ms.date: 05/09/2024
 recommendations: false
 ---
 
@@ -17,7 +17,7 @@ recommendations: false
 Use this article to learn how to provide your own image data for GPT-4 Turbo with Vision, Azure OpenAI’s vision model. GPT-4 Turbo with Vision on your data allows the model to generate more customized and targeted answers using Retrieval Augmented Generation based on your own images and image metadata.
 
 > [!IMPORTANT]
-> This article is for using your data on the GPT-4 Turbo with Vision model. If you are interested in using your data for text-based models, see [Use your text data](./use-your-data.md).
+> Once the GPT-4 Turbo with Vision preview model is deprecated, you will no longer be able to use Azure OpenAI On your image data. To implement a Retrieval Augmented Generation (RAG) solution with image data, see the following sample on [GitHub](https://github.com/Azure-Samples/azure-search-openai-demo/).
 
 ## Prerequisites
