
Commit a77bfb2

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into aca/certificate-updates
2 parents 46c0807 + 06147b3 commit a77bfb2

File tree: 46 files changed (+516 −84 lines)


articles/active-directory-b2c/custom-policies-series-sign-up-or-sign-in-federation.md

Lines changed: 2 additions & 1 deletion
@@ -225,7 +225,7 @@ Notice the claims transformations we defined in [step 3.2](#step-32---define-cla

 Just like in sign-in with a local account, you need to configure the [Microsoft Entra Technical Profiles](active-directory-technical-profile.md), which you use to connect to Microsoft Entra ID storage, to store or read a user social account.

-1. In the `ContosoCustomPolicy.XML` file, locate the `AAD-UserRead` technical profile and then add a new technical profile by using the following code:
+1. In the `ContosoCustomPolicy.XML` file, locate the `AAD-UserRead` technical profile and then add a new technical profile below it by using the following code:

     ```xml
     <TechnicalProfile Id="AAD-UserWriteUsingAlternativeSecurityId">

@@ -517,6 +517,7 @@ Use the following steps to add a combined local and social account:

     ```xml
     <OutputClaim ClaimTypeReferenceId="authenticationSource" DefaultValue="localIdpAuthentication" AlwaysUseDefaultValue="true" />
     ```
+    Make sure you also add the `authenticationSource` claim in the output claims collection of the `UserSignInCollector` self-asserted technical profile.

 1. In the `UserJourneys` section, add a new user journey, `LocalAndSocialSignInAndSignUp` by using the following code:

articles/ai-services/openai/includes/create-resource-portal.md

Lines changed: 3 additions & 1 deletion
@@ -89,7 +89,9 @@ As an option, you can add a private endpoint for access to your resource. Select

 1. Confirm your configuration settings, and select **Create**.

-   The Azure portal displays a notification when the new resource is available.
+1. The Azure portal displays a notification when the new resource is available. Select **Go to resource**.
+
+   :::image type="content" source="../media/create-resource/create-resource-go-to-resource.png" alt-text="Screenshot showing the Go to resource button in the Azure portal.":::

 ## Deploy a model

(binary image file added: 94.6 KB)

articles/ai-services/use-key-vault.md

Lines changed: 12 additions & 13 deletions
@@ -206,7 +206,7 @@ namespace key_vault_console_app

 ## Run the application

-Run the application by selecting the **Debug** button at the top of Visual studio. Your key and endpoint secrets will be retrieved from your key vault.
+Run the application by selecting the **Debug** button at the top of Visual Studio. Your key and endpoint secrets will be retrieved from your key vault.

 ## Send a test Language service call (optional)

@@ -376,18 +376,17 @@ In your project, add the following dependencies to your `pom.xml` file.

 ```xml
 <dependencies>
-
-    <dependency>
-        <groupId>com.azure</groupId>
-        <artifactId>azure-security-keyvault-secrets</artifactId>
-        <version>4.2.3</version>
-    </dependency>
-    <dependency>
-        <groupId>com.azure</groupId>
-        <artifactId>azure-identity</artifactId>
-        <version>1.2.0</version>
-    </dependency>
-</dependencies>
+    <dependency>
+        <groupId>com.azure</groupId>
+        <artifactId>azure-security-keyvault-secrets</artifactId>
+        <version>4.2.3</version>
+    </dependency>
+    <dependency>
+        <groupId>com.azure</groupId>
+        <artifactId>azure-identity</artifactId>
+        <version>1.2.0</version>
+    </dependency>
+</dependencies>
 ```

 ## Import the example code
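The first hunk above only fixes a typo in the step that runs the app to pull the key and endpoint secrets from a key vault. For illustration, here's a minimal Python sketch of that retrieval pattern (a hypothetical sketch, not the article's C# or Java sample; the vault name and secret names are placeholders):

```python
# Hypothetical sketch: fetch Language service secrets from Azure Key Vault.
# Vault and secret names are placeholders, not values from the article.

def vault_url(vault_name: str) -> str:
    # Key Vault endpoints follow the https://<name>.vault.azure.net pattern.
    return f"https://{vault_name}.vault.azure.net"

def get_language_secrets(vault_name: str):
    # Local imports so the sketch parses without the Azure SDK installed;
    # using it requires `pip install azure-identity azure-keyvault-secrets`.
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    client = SecretClient(vault_url=vault_url(vault_name),
                          credential=DefaultAzureCredential())
    key = client.get_secret("languageKey").value            # placeholder secret name
    endpoint = client.get_secret("languageEndpoint").value  # placeholder secret name
    return key, endpoint

url = vault_url("contoso-vault")  # https://contoso-vault.vault.azure.net
```

`DefaultAzureCredential` picks up the same sign-in (for example, `az login`) that the article's C# sample relies on.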

articles/ai-studio/how-to/configure-managed-network.md

Lines changed: 5 additions & 1 deletion
@@ -77,7 +77,11 @@ The following diagram shows a managed VNet configured to __allow only approved o

 ### Connectivity to other services

 * Azure AI services provisioned with Azure AI hub and Azure AI Search attached with Azure AI hub should be public.
-* The "Add your data" feature in the Azure AI Studio playground doesn't support private storage account.
+* The "Add your data" feature in the Azure AI Studio playground doesn't support using a virtual network or private endpoint on the following resources:
+  * Azure AI Search
+  * Azure OpenAI
+  * Storage resource

 ## Configure a managed virtual network to allow internet outbound

articles/ai-studio/how-to/prompt-flow-tools/llm-tool.md

Lines changed: 176 additions & 8 deletions
@@ -17,14 +17,51 @@ author: lgayhardt

 [!INCLUDE [Azure AI Studio preview](../../includes/preview-ai-studio.md)]

-To use large language models (LLMs) for natural language processing, you use the prompt flow LLM tool.
+The large language model (LLM) tool in prompt flow enables you to take advantage of widely used large language models like [OpenAI](https://platform.openai.com/), [Azure OpenAI Service](../../../ai-services/openai/overview.md), and models in the [Azure AI Studio model catalog](../model-catalog.md) for natural language processing.
+
+> [!NOTE]
+> The previous version of the LLM tool is being deprecated. Upgrade to the latest [promptflow-tools](https://pypi.org/project/promptflow-tools/) package to use the new LLM tools.
+
+Prompt flow provides a few different large language model APIs:
+
+- [Completion](https://platform.openai.com/docs/api-reference/completions): OpenAI's completion models generate text based on provided prompts.
+- [Chat](https://platform.openai.com/docs/api-reference/chat): OpenAI's chat models facilitate interactive conversations with text-based inputs and responses.

 > [!NOTE]
-> For embeddings to convert text into dense vector representations for various natural language processing tasks, see [Embedding tool](embedding-tool.md).
+> Don't use non-ASCII characters in the resource group name of your Azure OpenAI resource; prompt flow doesn't support that case.

 ## Prerequisites

-Prepare a prompt as described in the [Prompt tool](prompt-tool.md#prerequisites) documentation. The LLM tool and Prompt tool both support [Jinja](https://jinja.palletsprojects.com/en/3.1.x/) templates. For more information and best practices, see [Prompt engineering techniques](../../../ai-services/openai/concepts/advanced-prompt-engineering.md).
+Create OpenAI resources, Azure OpenAI resources, or a MaaS deployment with LLM models (for example, Llama 2, Mistral, or Cohere) in the Azure AI Studio model catalog:
+
+- **OpenAI**:
+
+  - Sign up for an account on the [OpenAI website](https://openai.com/).
+  - Sign in and [find your personal API key](https://platform.openai.com/account/api-keys).
+
+- **Azure OpenAI**:
+
+  - [Create Azure OpenAI resources](../../../ai-services/openai/how-to/create-resource.md).
+
+- **MaaS deployment**:
+
+  - [Create a MaaS deployment for models in the Azure AI Studio model catalog](../../concepts/deployments-overview.md#deploy-models-with-model-as-a-service).
+
+    You can create a serverless connection to use this MaaS deployment.
+
+## Connections
+
+Set up connections to the provisioned resources in prompt flow.
+
+| Type | Name | API key | API base | API type | API version |
+|-------------|----------|----------|----------|----------|-------------|
+| OpenAI | Required | Required | - | - | - |
+| Azure OpenAI | Required | Required | Required | Required | Required |
+| Serverless | Required | Required | Required | - | - |
+
+> [!TIP]
+> - To use the Microsoft Entra ID auth type for an Azure OpenAI connection, you need to assign either the `Cognitive Services OpenAI User` or `Cognitive Services OpenAI Contributor` role to the user or user-assigned managed identity.
+> - Learn more about [how to specify that a flow run uses your user identity](../create-manage-runtime.md#create-an-automatic-runtime-on-a-flow-page).
+> - Learn more about [how to configure Azure OpenAI Service with managed identities](../../../ai-services/openai/how-to/managed-identity.md).

 ## Build with the LLM tool
@@ -35,7 +72,7 @@ Prepare a prompt as described in the [Prompt tool](prompt-tool.md#prerequisites)
 1. Select the connection to one of your provisioned resources. For example, select **Default_AzureOpenAI**.
 1. From the **Api** dropdown list, select **chat** or **completion**.
-1. Enter values for the LLM tool input parameters described in the [Text completion inputs table](#inputs). If you selected the **chat** API, see the [Chat inputs table](#chat-inputs). If you selected the **completion** API, see the [Text completion inputs table](#text-completion-inputs). For information about how to prepare the prompt input, see [Prerequisites](#prerequisites).
+1. Enter values for the LLM tool input parameters described in the [Text completion inputs table](#inputs). If you selected the **chat** API, see the [Chat inputs table](#chat-inputs). If you selected the **completion** API, see the [Text completion inputs table](#text-completion-inputs). For information about how to prepare the prompt input, see [How to write a prompt](#how-to-write-a-prompt).
 1. Add more tools to your flow, as needed. Or select **Run** to run the flow.
 1. The outputs are described in the [Outputs table](#outputs).
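As an aside, the tool inputs these steps reference map onto a chat request payload roughly like the following sketch (an illustration only, not prompt flow's actual implementation; parameter names follow the inputs table documented below, and the values are hypothetical):

```python
# Illustrative sketch: assemble the kind of chat request payload that the
# LLM tool's inputs map onto. Not prompt flow's actual code.

def build_chat_payload(messages, *, temperature=1.0, top_p=1.0,
                       max_tokens=None, presence_penalty=0,
                       frequency_penalty=0, tools=None, tool_choice=None,
                       response_format=None):
    payload = {
        "messages": messages,
        "temperature": temperature,
        "top_p": top_p,
        "presence_penalty": presence_penalty,
        "frequency_penalty": frequency_penalty,
    }
    # Inputs whose default is null are omitted when unset.
    for name, value in [("max_tokens", max_tokens), ("tools", tools),
                        ("tool_choice", tool_choice),
                        ("response_format", response_format)]:
        if value is not None:
            payload[name] = value
    return payload

payload = build_chat_payload(
    [{"role": "system", "content": "You are a helpful assistant."},
     {"role": "user", "content": "Hello"}],
    temperature=0.7,
    max_tokens=256,
)
```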

@@ -74,15 +111,146 @@ The following input parameters are available.
 | presence\_penalty | float | The value that controls the model's behavior regarding repeating phrases. Default is 0. | No |
 | frequency\_penalty | float | The value that controls the model's behavior regarding generating rare phrases. Default is 0. | No |
 | logit\_bias | dictionary | The logit bias for the language model. Default is empty dictionary. | No |
+| tool\_choice | object | The value that controls which tool is called by the model. Default is null. | No |
+| tools | list | A list of tools for which the model can generate JSON inputs. Default is null. | No |
+| response\_format | object | An object that specifies the format the model must output. Default is null. | No |

 ## Outputs

 The output varies depending on the API you selected for inputs.

-| API | Return type | Description |
-|------------|-------------|------------------------------------------|
-| Completion | string | The text of one predicted completion. |
-| Chat | string | The text of one response of conversation. |
+| Return type | Description |
+|-------------|------------------------------------------|
+| string | The text of one predicted completion or one conversation response. |
+
+## How to write a prompt
+
+Prepare a prompt as described in the [Prompt tool](prompt-tool.md#prerequisites) documentation. The LLM tool and Prompt tool both support [Jinja](https://jinja.palletsprojects.com/en/3.1.x/) templates. For more information and best practices, see [Prompt engineering techniques](../../../ai-services/openai/concepts/advanced-prompt-engineering.md).
+
+For example, a chat prompt distinguishes between different roles, such as "system", "user", "assistant", and "tool". The "system", "user", and "assistant" roles can have "name" and "content" properties. The "tool" role, however, should have "tool_call_id" and "content" properties. For an example of a tool chat prompt, see [Sample 3](#sample-3).
+
+### Sample 1
+
+```jinja
+# system:
+You are a helpful assistant.
+
+{% for item in chat_history %}
+# user:
+{{item.inputs.question}}
+# assistant:
+{{item.outputs.answer}}
+{% endfor %}
+
+# user:
+{{question}}
+```
+
+In the LLM tool, the prompt is transformed to match the [OpenAI messages](https://platform.openai.com/docs/api-reference/chat/create#chat-create-messages) structure before it's sent to the OpenAI chat API.
+
+```json
+[
+  {
+    "role": "system",
+    "content": "You are a helpful assistant."
+  },
+  {
+    "role": "user",
+    "content": "<question-of-chat-history-round-1>"
+  },
+  {
+    "role": "assistant",
+    "content": "<answer-of-chat-history-round-1>"
+  },
+  ...
+  {
+    "role": "user",
+    "content": "<question>"
+  }
+]
+```
+
+### Sample 2
+
+```jinja
+# system:
+{# For role naming customization, the following syntax is used #}
+## name:
+Alice
+## content:
+You are a bot that can tell good jokes.
+```
+
+In the LLM tool, the prompt is transformed to match the [OpenAI messages](https://platform.openai.com/docs/api-reference/chat/create#chat-create-messages) structure before it's sent to the OpenAI chat API.
+
+```json
+[
+  {
+    "role": "system",
+    "name": "Alice",
+    "content": "You are a bot that can tell good jokes."
+  }
+]
+```
+
+### Sample 3
+
+This sample illustrates how to write a tool chat prompt.
+
+```jinja
+# system:
+You are a helpful assistant.
+# user:
+What is the current weather like in Boston?
+# assistant:
+{# The assistant message with 'tool_calls' must be followed by messages with role 'tool'. #}
+## tool_calls:
+{{llm_output.tool_calls}}
+# tool:
+{#
+Messages with role 'tool' must be a response to a preceding message with 'tool_calls'.
+Additionally, the 'tool_call_id' values should match the ids of the assistant message's 'tool_calls'.
+#}
+## tool_call_id:
+{{llm_output.tool_calls[0].id}}
+## content:
+{{tool-answer-of-last-question}}
+# user:
+{{question}}
+```
+
+In the LLM tool, the prompt is transformed to match the [OpenAI messages](https://platform.openai.com/docs/api-reference/chat/create#chat-create-messages) structure before it's sent to the OpenAI chat API.
+
+```json
+[
+  {
+    "role": "system",
+    "content": "You are a helpful assistant."
+  },
+  {
+    "role": "user",
+    "content": "What is the current weather like in Boston?"
+  },
+  {
+    "role": "assistant",
+    "content": null,
+    "function_call": null,
+    "tool_calls": [
+      {
+        "id": "<tool-call-id-of-last-question>",
+        "type": "function",
+        "function": "<function-to-call-of-last-question>"
+      }
+    ]
+  },
+  {
+    "role": "tool",
+    "tool_call_id": "<tool-call-id-of-last-question>",
+    "content": "<tool-answer-of-last-question>"
+  }
+  ...
+  {
+    "role": "user",
+    "content": "<question>"
+  }
+]
+```
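The role-markup transformation the samples above describe can be sketched in a few lines of Python. This is a simplified, unofficial stand-in for prompt flow's actual parser: it handles only the plain `# role:` headers from Sample 1, not the `## name:` or `## tool_calls:` sub-properties.

```python
# Simplified sketch: convert "# role:" chat-prompt markup into an
# OpenAI-style messages list. Not prompt flow's real implementation.

ROLES = {"system", "user", "assistant", "tool"}

def to_messages(prompt: str) -> list:
    messages, role, lines = [], None, []

    def flush():
        # Emit the message collected so far for the current role.
        if role is not None:
            messages.append({"role": role, "content": "\n".join(lines).strip()})

    for line in prompt.splitlines():
        header = line.strip().lower()
        if header.startswith("# ") and header[2:].rstrip(":") in ROLES:
            flush()
            role, lines = header[2:].rstrip(":"), []
        else:
            lines.append(line)
    flush()
    return messages

msgs = to_messages("# system:\nYou are a helpful assistant.\n# user:\nHello")
```

Running it on the Sample 1 markup (after Jinja rendering) yields the same system/user message list shown above.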

 ## Next steps
articles/ai-studio/how-to/prompt-flow-tools/prompt-flow-tools-overview.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ The following table provides an index of tools in prompt flow.

 | Tool name | Description | Package name |
 |------|-----------|-------------|
-| [LLM](./llm-tool.md) | Use large language models (LLM) with the Azure OpenAI Service for tasks such as text completion or chat. | [promptflow-tools](https://pypi.org/project/promptflow-tools/) |
+| [LLM](./llm-tool.md) | Use large language models (LLM) for tasks such as text completion or chat. | [promptflow-tools](https://pypi.org/project/promptflow-tools/) |
 | [Prompt](./prompt-tool.md) | Craft a prompt by using Jinja as the templating language. | [promptflow-tools](https://pypi.org/project/promptflow-tools/) |
 | [Python](./python-tool.md) | Run Python code. | [promptflow-tools](https://pypi.org/project/promptflow-tools/) |
 | [Azure OpenAI GPT-4 Turbo with Vision](./azure-open-ai-gpt-4v-tool.md) | Use an Azure OpenAI GPT-4 Turbo with Vision model deployment to analyze images and provide textual responses to questions about them. | [promptflow-tools](https://pypi.org/project/promptflow-tools/) |
(binary image file changed: −46.7 KB)

articles/ai-studio/tutorials/deploy-chat-web-app.md

Lines changed: 6 additions & 0 deletions
@@ -79,6 +79,12 @@ In the next section, you'll add your data to the model to help it answer questio

 Follow these steps to add your data to the playground to help the assistant answer questions about your products. You're not changing the deployed model itself. Your data is stored separately and securely in your Azure subscription.

+> [!IMPORTANT]
+> The "Add your data" feature in the Azure AI Studio playground doesn't support using a virtual network or private endpoint on the following resources:
+> * Azure AI Search
+> * Azure OpenAI
+> * Storage resource
+
 1. If you aren't already in the playground, select **Build** from the top menu and then select **Playground** from the collapsible left menu.
 1. On the **Assistant setup** pane, select **Add your data (preview)** > **+ Add a data source**.

articles/azure-cache-for-redis/cache-managed-identity.md

Lines changed: 1 addition & 1 deletion
@@ -191,4 +191,4 @@ If you're not using managed identity and instead authorizing a storage account w

 ## Related content

 - [Learn more](cache-overview.md#service-tiers) about Azure Cache for Redis features
-- [What are managed identifies](../active-directory/managed-identities-azure-resources/overview.md)
+- [What are managed identities](../active-directory/managed-identities-azure-resources/overview.md)
