
Commit ba01299

Merge pull request #1461 from qubitron/patch-1
Update sdk-overview.md
2 parents 85aa905 + 45cbaca commit ba01299

File tree

1 file changed: +20 −8 lines changed

articles/ai-studio/how-to/develop/sdk-overview.md

Lines changed: 20 additions & 8 deletions
@@ -83,13 +83,14 @@ Be sure to check out the [reference](https://aka.ms/aifoundrysdk/reference) and

## Azure OpenAI Service

-If you have existing code that uses the OpenAI SDK, you can use the project client to create an ```AzureOpenAI``` client that uses your project's Azure OpenAI connection.
+The [Azure OpenAI Service](../../../ai-services/openai/overview.md) allows you to access OpenAI models the same day they launch. It integrates with the rest of Azure and provides enterprise scale and controls.
+
+If you have code that uses the OpenAI SDK, you can easily target your code to use the Azure OpenAI service. First, install the OpenAI SDK:

```
pip install openai
```

-Now use the project client to return an ```AzureOpenAI``` client with the desired API version and make a chat completions call.
-
+If you have existing code that uses the OpenAI SDK, you can use the project client to create an ```AzureOpenAI``` client that uses your project's Azure OpenAI connection:

```Python
openai = project.inference.get_azure_openai_client(api_version="2024-06-01")
response = openai.chat.completions.create(
@@ -102,8 +103,8 @@ response = openai.chat.completions.create(

print(response.choices[0].message.content)
```
+If you’re already using the [Azure OpenAI SDK](../../../ai-services/openai/chatgpt-quickstart.md) directly against the Azure OpenAI Service, the project provides a convenient way to use Azure OpenAI Service capabilities alongside the rest of the AI Foundry capabilities.

-For more on using the Azure OpenAI client library, including how to use it directly against the Azure OpenAI Service, check out the [Azure OpenAI chat quickstart](../../../ai-services/openai/chatgpt-quickstart.md).

## Azure AI model inference service

@@ -168,6 +169,7 @@ prompt_template = PromptTemplate.from_string(prompt_template="""
messages = prompt_template.create_messages(first_name="Jane", last_name="Doe")
print(messages)
```
+NOTE: leading whitespace is automatically trimmed from input strings.
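The trimming called out in the NOTE can be illustrated in plain Python. This sketch uses `textwrap.dedent` as a stand-in for whatever the SDK does internally, so the mechanism shown is an assumption, not the SDK's actual code:

```python
import textwrap

# A triple-quoted prompt written inside a function body usually carries
# the source file's indentation on every line.
template = """
    system:
    You are a helpful assistant.

    user:
    Hello, {{first_name}}!
"""

# Remove the common leading whitespace, mimicking the trimming the
# prompt template applies to input strings.
trimmed = textwrap.dedent(template).strip()
print(trimmed.splitlines()[0])  # first line is now flush-left: system:
```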

This code outputs messages that you can then pass to a chat completions call:

@@ -178,8 +180,6 @@ This code outputs messages that you can then pass to a chat completions call:
]
```

-NOTE: leading whitespace is automatically trimmed from input strings.
-
You can also load prompts from a [`Prompty`](https://prompty.ai) file, enabling you to also load the model name and parameters from the `.prompty` file:

```Python
@@ -243,8 +243,11 @@ To get access to agents, [sign-up for the private preview](https://nam.dcv.ms/nz
## Evaluation

You can use the project client to easily connect to the Azure AI evaluation service and the models needed to run your evaluators.
+```
+pip install azure-ai-evaluation
+```

-Using the ```project.scope``` parameter, we can easily instantiate a ```ViolenceEvaluator```:
+Using the ```project.scope``` parameter, we can instantiate a ```ViolenceEvaluator```:
```Python
from azure.ai.evaluation import ViolenceEvaluator
from azure.identity import DefaultAzureCredential
@@ -258,13 +261,20 @@ violence_eval = ViolenceEvaluator(
violence_score = violence_eval(query="what's the capital of france", response="Paris")
print(violence_score)
```
+NOTE: to run violence evaluators, your project needs to be in East US 2, Sweden Central, US North Central, or France Central.

To learn more, check out [Evaluation using the SDK](evaluate-sdk.md).

## Tracing

To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project and follow the instructions to create or attach Application Insights.

+
+Install the Azure Monitor OpenTelemetry package:
+```
+pip install azure-monitor-opentelemetry
+```
+
+Use the following code to enable instrumentation of the Azure AI Inference SDK and logging to your AI project:
```Python
# Enable instrumentation of AI packages (inference, agents, openai, langchain)
project.telemetry.enable()
@@ -286,7 +296,7 @@ Client libraries:
* [Azure AI services SDKs](../../../ai-services/reference/sdk-package-resources.md?context=/azure/ai-studio/context/context)
* [Azure AI services REST APIs](../../../ai-services/reference/rest-api-resources.md?context=/azure/ai-studio/context/context)

-Azure AI services
+Management libraries:
* [Azure AI Services Python Management Library](/python/api/overview/azure/mgmt-cognitiveservices-readme)
* [Azure AI Search Python Management Library](/python/api/azure-mgmt-search/azure.mgmt.search)

@@ -304,6 +314,8 @@ Prompt flow
* [pfazure CLI](https://microsoft.github.io/promptflow/reference/pfazure-command-reference.html)
* [pfazure Python library](https://microsoft.github.io/promptflow/reference/python-library-reference/promptflow-azure/promptflow.azure.html)

+Semantic Kernel
+* [Semantic Kernel Overview](https://learn.microsoft.com/semantic-kernel/overview/)
Agentic frameworks

* [LlamaIndex](llama-index.md)
