articles/ai-services/openai/how-to/integrate-synapseml.md
9 additions & 20 deletions
@@ -8,7 +8,7 @@ ms.service: cognitive-services
 ms.subservice: openai
 ms.custom: build-2023, build-2023-dataai
 ms.topic: how-to
-ms.date: 08/30/2023
+ms.date: 09/01/2023
 author: ChrisHMSFT
 ms.author: chrhoder
 recommendations: false
@@ -26,14 +26,19 @@ This tutorial shows how to apply large language models at a distributed scale by
 - Access granted to Azure OpenAI in your Azure subscription.
+
+  Currently, you must submit an application to access Azure OpenAI Service. To apply for access, complete <a href="https://aka.ms/oai/access" target="_blank">this form</a>. If you need assistance, open an issue on this repo to contact Microsoft.
+
 - An Azure OpenAI resource. [Create a resource](create-resource.md?pivots=web-portal#create-a-resource).
 - An Apache Spark cluster with SynapseML installed.
   - Create a [serverless Apache Spark pool](../../../synapse-analytics/get-started-analyze-spark.md#create-a-serverless-apache-spark-pool).
   - To install SynapseML for your Apache Spark cluster, see [Install SynapseML](#install-synapseml).
 
 > [!NOTE]
-> Currently, you must submit an application to access Azure OpenAI Service. To apply for access, complete <a href="https://aka.ms/oai/access" target="_blank">this form</a>. If you need assistance, open an issue on this repo to contact Microsoft.
+> This article is designed to work with the [Azure OpenAI Service legacy models](/azure/ai-services/openai/concepts/legacy-models) that support prompt-based completions like `Text-Davinci-003`. Newer models like the current `GPT-3.5 Turbo` and `GPT-4` model series are designed to work with the new chat completion API that expects a specially formatted array of messages as input.
+>
+> The Azure OpenAI SynapseML integration supports the latest models via the [OpenAIChatCompletion()](https://github.com/microsoft/SynapseML/blob/0836e40efd9c48424e91aa10c8aa3fbf0de39f31/cognitive/src/main/scala/com/microsoft/azure/synapse/ml/cognitive/openai/OpenAIChatCompletion.scala#L24) transformer, which isn't demonstrated in this article. After the [release of the GPT-3.5 Turbo Instruct model](https://techcommunity.microsoft.com/t5/azure-ai-services-blog/announcing-updates-to-azure-openai-service-models/ba-p/3866757), the newer model will be the preferred model to use with this article.
 
 We recommend that you [create an Azure Synapse workspace](../../../synapse-analytics/get-started-create-workspace.md). However, you can also use Azure Databricks, Azure HDInsight, Spark on Kubernetes, or the Python environment with the `pyspark` package.
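For the `pyspark` route mentioned in that context line, SynapseML has to be made available to the session. A minimal sketch of a local session that pulls the library from Maven follows; the package coordinate and version shown are assumptions, so check the SynapseML install documentation for the coordinate matching your Spark and Scala versions:

```python
from pyspark.sql import SparkSession

# Local pyspark session with SynapseML resolved from Maven.
# NOTE: the coordinate/version below are illustrative assumptions;
# use the coordinate published for your Spark/Scala version.
spark = (
    SparkSession.builder
    .appName("aoai-synapseml-demo")
    .config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.11.2")
    .getOrCreate()
)
```

On Synapse or Databricks you would instead attach SynapseML through the workspace's library configuration rather than in code.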
 The following image shows example output with completions for multiple prompts in a request:
-
-
 :::image type="content" source="../media/how-to/synapse-studio-request-batch-output.png" alt-text="Screenshot that shows completions for multiple prompts in a single request in Azure Synapse Analytics Studio." border="false":::
-
 
 > [!NOTE]
 > There's currently a limit of 20 prompts in a single request and a limit of 2048 tokens, or approximately 1500 words.
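Given the 20-prompt cap stated in that note, a caller with many prompts has to split them into request-sized groups first. A plain-Python sketch of that batching step (the helper name and shape are ours, not from the article):

```python
def chunk_prompts(prompts, max_per_request=20):
    """Split a list of prompts into groups no larger than the
    per-request limit (20, per the note above)."""
    return [prompts[i:i + max_per_request]
            for i in range(0, len(prompts), max_per_request)]

batches = chunk_prompts([f"prompt {i}" for i in range(45)])
print([len(b) for b in batches])  # → [20, 20, 5]
```

Each inner list can then be sent as one request; the token limit still applies per request, so very long prompts may need smaller groups.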
@@ -248,10 +249,6 @@ completed_autobatch_df = (df
 display(completed_autobatch_df)
 ```
-
 The following image shows example output for an automatic mini-batcher that transposes data to row format:
-
-
 :::image type="content" source="../media/how-to/synapse-studio-transpose-data-output.png" alt-text="Screenshot that shows completions for an automatic mini-batcher in Azure Synapse Analytics Studio." border="false":::
-
 
 ### Prompt engineering for translation
 
 Azure OpenAI can solve many different natural language tasks through _prompt engineering_. For more information, see [Learn how to generate or manipulate text](completions.md). In this example, you can prompt for language translation:
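The article's own Spark cell for this step is collapsed out of the diff above. As an illustration of the prompt pattern only (the phrasing is an assumption, not the article's verbatim prompt), a completion-style translation prompt can be built like this:

```python
def translation_prompt(text: str, target_language: str) -> str:
    # Completion-style prompt for a legacy model such as Text-Davinci-003.
    # The phrasing is illustrative, not the article's exact prompt.
    return f"Translate this sentence to {target_language}: {text}"

print(translation_prompt("Hello, how are you?", "French"))
# → Translate this sentence to French: Hello, how are you?
```

In the article, strings like these populate a prompt column of a Spark DataFrame that is then fed to the `OpenAICompletion` transformer.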
 The following image shows example output for language translation prompts:
-
-
 :::image type="content" source="../media/how-to/synapse-studio-language-translation-output.png" alt-text="Screenshot that shows completions for language translation prompts in Azure Synapse Analytics Studio." border="false":::
-
 
 ### Prompt for question answering
 
-Azure OpenAI also supports prompting the GPT-3 model for general-knowledge question answering:
+Azure OpenAI also supports prompting the `Text-Davinci-003` model for general-knowledge question answering:
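As with translation, the notebook cell that follows this line is collapsed in the diff. A hedged sketch of the question-answering prompt pattern for a completion model (the `Q:`/`A:` framing is an assumption, not the article's verbatim prompt):

```python
def qa_prompt(question: str) -> str:
    # "Q: ... A:" framing nudges a completion-style model such as
    # Text-Davinci-003 to answer directly; the framing is illustrative.
    return f"Q: {question}\nA:"

print(qa_prompt("What is the capital of France?"))
# → Q: What is the capital of France?
# → A:
```

The model's completion then continues the text after `A:`, which is why this layout works without the chat-message format required by newer models.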