Commit db9b110

Merge branch 'semantickernel1224' of https://github.com/lgayhardt/azure-ai-docs-pr into semantickernel1224
2 parents: 0d5204c + 735c5db

File tree

1 file changed: 8 additions, 4 deletions

articles/ai-studio/how-to/develop/semantic-kernel.md

Lines changed: 8 additions & 4 deletions
@@ -29,6 +29,10 @@ In this article, you learn how to use [Semantic Kernel](/semantic-kernel/overvie
 ```bash
 pip install semantic-kernel
 ```
+- In this example, we are working with the Azure AI model inference API, hence we install the relevant azure dependencies. You can do it with:
+```bash
+pip install semantic-kernel[azure]
+```
 
 ## Configure the environment
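For context on what the new `[azure]` extra pulls in, here is a minimal sketch of creating the Azure AI model inference chat completion service used later in the article. The `AzureAIInferenceChatCompletion` connector ships with the extra; the constructor parameter names shown (`ai_model_id`, `endpoint`, `api_key`) are assumptions, so check the article's configuration section for the exact setup.

```python
# Sketch only: the [azure] extra installs the Azure AI model inference
# connectors used later in the article. Parameter names below are assumptions;
# see the article's "Configure the environment" section for the real setup.
from semantic_kernel.connectors.ai.azure_ai_inference import (
    AzureAIInferenceChatCompletion,
)

chat_completion_service = AzureAIInferenceChatCompletion(
    ai_model_id="<deployment-name>",        # model deployed in your Azure AI project
    endpoint="<model-inference-endpoint>",  # assumed parameter name
    api_key="<api-key>",                    # or configure Microsoft Entra ID auth instead
)
```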

@@ -148,7 +152,7 @@ Alternatively, you can stream the response from the service:
 chat_history = ChatHistory()
 chat_history.add_user_message("Hello, how are you?")
 
-response = chat_completion.get_streaming_chat_message_content(
+response = chat_completion_service.get_streaming_chat_message_content(
     chat_history=chat_history,
     settings=execution_settings,
 )
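The renamed `chat_completion_service.get_streaming_chat_message_content` call is not awaited in the snippet above, which matches its use as an async generator of partial messages. A hedged sketch of consuming it (the chunk handling is illustrative, not the article's exact code):

```python
# Assumes chat_history, execution_settings, and chat_completion_service are
# already defined as in the earlier steps of the article, and that this runs
# inside an async function.
response = chat_completion_service.get_streaming_chat_message_content(
    chat_history=chat_history,
    settings=execution_settings,
)

async for chunk in response:
    # Each chunk is a partial assistant message; print it as it arrives.
    print(str(chunk), end="")
print()
```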
@@ -167,7 +171,7 @@ You can create a long-running conversation by using a loop:
 
 ```python
 while True:
-    response = await chat_completion.get_chat_message_content(
+    response = await chat_completion_service.get_chat_message_content(
        chat_history=chat_history,
        settings=execution_settings,
    )
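For the long-running conversation loop this hunk patches, a fuller sketch under stated assumptions (the `input()` prompt, the exit check, and the history updates are illustrative additions, not the article's code):

```python
# Runs inside an async function; chat_history, execution_settings, and
# chat_completion_service come from the earlier steps of the article.
while True:
    user_input = input("User > ")
    if user_input.strip().lower() == "exit":   # illustrative exit condition
        break

    chat_history.add_user_message(user_input)

    response = await chat_completion_service.get_chat_message_content(
        chat_history=chat_history,
        settings=execution_settings,
    )

    print(f"Assistant > {response}")
    # Keep the reply in the history so the next turn has full context.
    chat_history.add_assistant_message(str(response))
```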
@@ -180,7 +184,7 @@ If you're streaming the response, you can use the following code:
 
 ```python
 while True:
-    response = chat_completion.get_streaming_chat_message_content(
+    response = chat_completion_service.get_streaming_chat_message_content(
        chat_history=chat_history,
        settings=execution_settings,
    )
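The streaming variant of the same loop; a sketch that accumulates the chunks before writing the reply back to the history (the chunk handling and exit check are illustrative):

```python
# Runs inside an async function; the streaming call returns an async generator.
while True:
    user_input = input("User > ")
    if user_input.strip().lower() == "exit":
        break

    chat_history.add_user_message(user_input)

    response = chat_completion_service.get_streaming_chat_message_content(
        chat_history=chat_history,
        settings=execution_settings,
    )

    full_reply = ""
    async for chunk in response:
        print(str(chunk), end="")   # show partial output as it streams
        full_reply += str(chunk)
    print()

    chat_history.add_assistant_message(full_reply)
```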
@@ -209,7 +213,7 @@ The following code shows how to get embeddings from the service:
 
 ```python
 embeddings = await embedding_generation_service.generate_embeddings(
-    text=["My favorite color is blue.", "I love to eat pizza."],
+    texts=["My favorite color is blue.", "I love to eat pizza."],
 )
 
 for embedding in embeddings:
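Around the corrected `texts=` parameter, a hedged end-to-end sketch: the `AzureAIInferenceTextEmbedding` connector comes with the `[azure]` extra, but its constructor parameters here are assumptions, and the loop simply inspects each returned vector.

```python
# Sketch only; the connector's constructor parameter names are assumptions.
from semantic_kernel.connectors.ai.azure_ai_inference import (
    AzureAIInferenceTextEmbedding,
)

embedding_generation_service = AzureAIInferenceTextEmbedding(
    ai_model_id="<embedding-deployment-name>",
    endpoint="<model-inference-endpoint>",
    api_key="<api-key>",
)

# Runs inside an async function; note the plural `texts=` keyword from the fix.
embeddings = await embedding_generation_service.generate_embeddings(
    texts=["My favorite color is blue.", "I love to eat pizza."],
)

for embedding in embeddings:
    # Each item is an embedding vector; print its dimension and first values.
    print(len(embedding), embedding[:5])
```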
