
Commit 800041c

Merge pull request #1344 from sdgilley/sdg-release-update-code-qs-tutorial
switch to code refs
2 parents: 3506255 + 4a0a83c

3 files changed: 16 additions, 570 deletions


.openpublishing.publish.config.json

Lines changed: 6 additions & 0 deletions
```diff
@@ -104,6 +104,12 @@
       "branch": "master",
       "branch_mapping": {}
     },
+    {
+      "path_to_root": "azureai-samples-nov2024",
+      "url": "https://github.com/Azure-Samples/azureai-samples",
+      "branch": "dantaylo/nov2024",
+      "branch_mapping": {}
+    },
     {
       "path_to_root": "azureml-examples-main",
       "url": "https://github.com/azure/azureml-examples",
```
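The entry above registers Azure-Samples/azureai-samples (branch `dantaylo/nov2024`) as a dependent repository cloned to `azureai-samples-nov2024`, which the new `:::code` references in the tutorial rely on. A minimal stdlib-only sketch of appending such an entry to the publish config — the field names come from the diff above, while the surrounding config slice and helper are illustrative, not part of the Open Publishing tooling:

```python
import json

# Dependent-repository entry added in this commit (fields from the diff above).
new_repo = {
    "path_to_root": "azureai-samples-nov2024",
    "url": "https://github.com/Azure-Samples/azureai-samples",
    "branch": "dantaylo/nov2024",
    "branch_mapping": {},
}

# Hypothetical slice of .openpublishing.publish.config.json for illustration.
config = {
    "dependent_repositories": [
        {
            "path_to_root": "azureml-examples-main",
            "url": "https://github.com/azure/azureml-examples",
            "branch": "main",
            "branch_mapping": {},
        },
    ]
}

def add_dependent_repo(config: dict, repo: dict) -> dict:
    """Append the repo entry unless a clone with the same path_to_root exists."""
    repos = config["dependent_repositories"]
    if not any(r["path_to_root"] == repo["path_to_root"] for r in repos):
        repos.append(repo)
    return config

add_dependent_repo(config, new_repo)
print(json.dumps(config, indent=2))
```

Keying on `path_to_root` keeps the helper idempotent, since each clone location must be unique in the config.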

articles/ai-studio/quickstarts/get-started-code.md

Lines changed: 4 additions & 58 deletions
````diff
@@ -42,28 +42,7 @@ pip install azure-ai-projects azure-ai-inference azure-identity
 
 Create a file named **chat.py**. Copy and paste the following code into it.
 
-```python
-from azure.ai.projects import AIProjectClient
-from azure.identity import DefaultAzureCredential
-
-project_connection_string = "<your-connection-string-goes-here>"
-
-project = AIProjectClient.from_connection_string(
-    conn_str=project_connection_string,
-    credential=DefaultAzureCredential()
-)
-
-chat = project.inference.get_chat_completions_client()
-response = chat.complete(
-    model="gpt-4o-mini",
-    messages=[
-        {"role": "system", "content": "You are an AI assistant that speaks like a techno punk rocker from 2350. Be cool but not too cool. Ya dig?"},
-        {"role": "user", "content": "Hey, can you help me with my taxes? I'm a freelancer."},
-    ]
-)
-
-print(response.choices[0].message.content)
-```
+:::code language="python" source="~/azureai-samples-nov2024/scenarios/inference/chat-app/chat-simple.py":::
 
 ## Insert your connection string
 
````
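The added `:::code` reference pulls the snippet from the dependent repository registered in the publish config, rooted at its `path_to_root`. A rough stdlib-only sketch of how such a `~/<path_to_root>/<file>` source path could resolve against clone locations — the clone directory and helper are assumptions for illustration, not the actual Open Publishing build logic:

```python
from pathlib import PurePosixPath

# path_to_root keys come from .openpublishing.publish.config.json;
# the clone location on disk is a hypothetical example.
DEPENDENT_REPOS = {
    "azureai-samples-nov2024": "/_dependent/azureai-samples-nov2024",
}

def resolve_code_source(source: str) -> str:
    """Resolve a ':::code source=...' path of the form ~/<path_to_root>/<file>."""
    if not source.startswith("~/"):
        raise ValueError("expected a repo-rooted path starting with '~/'")
    parts = PurePosixPath(source[2:]).parts
    root, rest = parts[0], parts[1:]
    return str(PurePosixPath(DEPENDENT_REPOS[root], *rest))

print(resolve_code_source(
    "~/azureai-samples-nov2024/scenarios/inference/chat-app/chat-simple.py"))
```

An unknown `path_to_root` raises `KeyError`, which mirrors why the config change in this commit has to land together with the doc change.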
````diff
@@ -89,35 +68,11 @@ The script uses hardcoded input and output messages. In a real app you'd take in
 
 Let's change the script to take input from a client application and generate a system message using a prompt template.
 
-1. Remove the last line of the script that prints a response.
+1. Remove the last line of the script that prints a response.
 
 1. Now define a `get_chat_response` function that takes messages and context, generates a system message using a prompt template, and calls a model. Add this code to your **chat.py** file:
 
-    ```python
-    from azure.ai.inference.prompts import PromptTemplate
-
-    def get_chat_response(messages, context):
-        # create a prompt template from an inline string (using mustache syntax)
-        prompt_template = PromptTemplate.from_message(prompt_template="""
-            system:
-            You are an AI assistant that speaks like a techno punk rocker from 2350. Be cool but not too cool. Ya dig? Refer to the user by their first name, try to work their last name into a pun.
-
-            The user's first name is {{first_name}} and their last name is {{last_name}}.
-            """)
-
-        # generate system message from the template, passing in the context as variables
-        system_message = prompt_template.render(data=context)
-
-        # add the prompt messages to the user messages
-        response = chat.complete(
-            model="gpt-4o-mini",
-            messages=system_message + messages,
-            temperature=1,
-            frequency_penalty=0.5,
-            presence_penalty=0.5)
-
-        return response
-    ```
+    :::code language="python" source="~/azureai-samples-nov2024/scenarios/inference/chat-app/chat-template.py" id="chat_function":::
 
 > [!NOTE]
 > The prompt template uses mustache format.
````
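The note kept in the diff mentions mustache format: `{{first_name}}`-style placeholders are filled from the context dict. A tiny stdlib-only sketch of that substitution, handling simple variables only — this is an illustration, not the azure-ai-inference `PromptTemplate` implementation:

```python
import re

def render_mustache(template: str, data: dict) -> str:
    """Replace {{name}} placeholders with values from data (simple variables only)."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(data.get(m.group(1), "")),
        template,
    )

msg = render_mustache(
    "The user's first name is {{first_name}} and their last name is {{last_name}}.",
    {"first_name": "Jessie", "last_name": "Irwin"},
)
print(msg)  # → The user's first name is Jessie and their last name is Irwin.
```

Missing keys render as empty strings here; real mustache engines also support sections and escaping, which this sketch omits.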
````diff
@@ -126,16 +81,7 @@ Let's change the script to take input from a client application and generate a s
 
 1. Now simulate passing information from a frontend application to this function. Add the following code to the end of your **chat.py** file. Feel free to play with the message and add your own name.
 
-    ```python
-    response = get_chat_response(
-        messages=[{"role": "user", "content": "what city has the best food in the world?"}],
-        context = {
-            "first_name": "Jessie",
-            "last_name": "Irwin"
-        }
-    )
-    print(response.choices[0].message.content)
-    ```
+    :::code language="python" source="~/azureai-samples-nov2024/scenarios/inference/chat-app/chat-template.py" id="create_response":::
 
 Run the script to see the response from the model with this new input.
 
````
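Both new references select named regions of chat-template.py via `id="chat_function"` and `id="create_response"`; in docs builds such regions are conventionally delimited in the sample file by comment tags like `# <chat_function>` / `# </chat_function>`. A sketch of extracting one such region — the tag syntax and sample content here are assumptions for illustration:

```python
import re

# Hypothetical sample file content with snippet-id comment markers.
SAMPLE = """\
# <chat_function>
def get_chat_response(messages, context):
    ...
# </chat_function>
# <create_response>
response = get_chat_response(messages=[], context={})
# </create_response>
"""

def extract_snippet(text: str, snippet_id: str) -> str:
    """Return the lines between # <id> and # </id> comment markers."""
    pattern = rf"#\s*<{snippet_id}>\n(.*?)#\s*</{snippet_id}>"
    match = re.search(pattern, text, flags=re.DOTALL)
    if match is None:
        raise KeyError(snippet_id)
    return match.group(1)

print(extract_snippet(SAMPLE, "create_response"))
```

Keeping the markers in the sample repo is what lets this commit delete the inline blocks: the published page and the runnable sample now share one source of truth.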
