You can use the project client to get a configured and authenticated `ChatCompletionsClient` or `EmbeddingsClient`:
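In Python, this might look like the following sketch (hedged: it assumes the `azure-ai-projects` and `azure-identity` packages, and the project connection string is a placeholder you copy from your project's overview page):

```python
def get_inference_clients(conn_str):
    """Return (chat, embeddings) clients for an Azure AI Foundry project.

    Sketch only: assumes the azure-ai-projects and azure-identity
    packages; conn_str is a placeholder project connection string.
    """
    # Local imports keep the sketch self-contained for readers who
    # haven't installed the Azure SDK packages yet.
    from azure.ai.projects import AIProjectClient
    from azure.identity import DefaultAzureCredential

    project = AIProjectClient.from_connection_string(
        conn_str=conn_str,
        credential=DefaultAzureCredential(),
    )
    # Both clients come back configured and authenticated for the
    # model deployments in your project.
    chat = project.inference.get_chat_completions_client()
    embeddings = project.inference.get_embeddings_client()
    return chat, embeddings
```

From there, `chat.complete(...)` and `embeddings.embed(...)` work as they would on standalone `azure-ai-inference` clients.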
> [!NOTE]
> Leading whitespace is automatically trimmed from input strings.

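As a plain-Python analogy (not the library's actual implementation, and the exact trimming rules may differ), the effect is similar to dedenting and stripping the template string:

```python
import textwrap

raw = """
    system:
    You are a helpful assistant.

    user:
    Write me a poem about flowers.
"""

# Drop the common leading indentation and the surrounding blank lines,
# similar in spirit to the trimming applied to input strings.
trimmed = textwrap.dedent(raw).strip()
print(trimmed)
```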
::: zone-end
::: zone pivot="programming-language-csharp"
::: zone-end
This code outputs a list of messages that you can then pass to a chat completion call.
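As a rough illustration of that message shape (plain Python with a hypothetical template, not the SDK's renderer):

```python
# Hypothetical two-message template; {first_name} and {last_name} stand
# in for template variables filled at render time.
template = [
    ("system", "You are a helpful assistant for {first_name} {last_name}."),
    ("user", "Write me a poem about flowers."),
]

values = {"first_name": "Jane", "last_name": "Doe"}

# A chat completion call accepts a list of {"role", "content"} messages.
messages = [
    {"role": role, "content": content.format(**values)}
    for role, content in template
]
print(messages)
```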
You can also load prompts from a [`Prompty`](https://prompty.ai) file, which lets you load the model name and parameters from the `.prompty` file as well:
::: zone pivot="programming-language-python"
```Python
from azure.ai.inference.prompts import PromptTemplate

prompt_template = PromptTemplate.from_prompty(file_path="<path-to-your-prompty-file>")
messages = prompt_template.create_messages(first_name="Jane", last_name="Doe")

response = chat.complete(
    messages=messages,
    model=prompt_template.model_name,
    **prompt_template.parameters,
)
```

::: zone-end

::: zone pivot="programming-language-csharp"

Install the Azure AI Inference package:

```dotnet
dotnet add package Azure.AI.Inference
```

You can render a prompt template from an inline string.

::: zone-end

If you have an Azure AI Search resource connected to your project, you can also use the project client to create an Azure AI Search client using the project connection.
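A hedged Python sketch of that pattern (assumes the `azure-ai-projects` and `azure-search-documents` packages; the connection attribute names `endpoint_url` and `key` are assumptions and may differ across SDK versions):

```python
def get_search_client(project, index_name):
    """Build a SearchClient from the project's default Azure AI Search
    connection. Sketch only; attribute names are assumptions."""
    from azure.ai.projects.models import ConnectionType
    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient

    # Ask the project for its default Azure AI Search connection,
    # including the credentials needed to authenticate.
    conn = project.connections.get_default(
        connection_type=ConnectionType.AZURE_AI_SEARCH,
        include_credentials=True,
    )
    return SearchClient(
        endpoint=conn.endpoint_url,               # assumed attribute name
        index_name=index_name,
        credential=AzureKeyCredential(conn.key),  # assumed attribute name
    )
```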