
Commit 9547ace

Merge pull request #2008 from sdgilley/sdg-quickstart
change branch for quickstart
2 parents: b13e75b + 28df1be

File tree

2 files changed: 8 additions, 8 deletions


.openpublishing.publish.config.json

Lines changed: 3 additions & 3 deletions

```diff
@@ -111,9 +111,9 @@
       "branch_mapping": {}
     },
     {
-      "path_to_root": "azureai-samples-nov2024",
-      "url": "https://github.com/Azure-Samples/azureai-samples",
-      "branch": "dantaylo/nov2024",
+      "path_to_root": "azureai-samples-temp",
+      "url": "https://github.com/sdgilley/azureai-samples",
+      "branch": "sdg-add-quickstart",
       "branch_mapping": {}
     },
     {
```
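For context, entries like the one edited above typically live in the config file's dependent-repositories array, which maps an external samples repo into the docs build so that `:::code source="~/<path_to_root>/...":::` references resolve. A sketch of the updated entry in that shape (the array name and surrounding structure are assumptions from the standard OpenPublishing schema; only the three changed fields and `branch_mapping` appear in the diff itself):

```json
{
  "dependent_repositories": [
    {
      "path_to_root": "azureai-samples-temp",
      "url": "https://github.com/sdgilley/azureai-samples",
      "branch": "sdg-add-quickstart",
      "branch_mapping": {}
    }
  ]
}
```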

articles/ai-studio/quickstarts/get-started-code.md

Lines changed: 5 additions & 5 deletions

````diff
@@ -42,7 +42,7 @@ pip install azure-ai-projects azure-ai-inference azure-identity
 
 Create a file named **chat.py**. Copy and paste the following code into it.
 
-:::code language="python" source="~/azureai-samples-nov2024/scenarios/inference/chat-app/chat-simple.py":::
+:::code language="python" source="~/azureai-samples-temp/scenarios/inference/chat-app/chat-simple.py":::
 
 ## Insert your connection string
 
@@ -70,9 +70,9 @@ Let's change the script to take input from a client application and generate a s
 
 1. Remove the last line of the script that prints a response.
 
-1. Now define a `get_chat_response` function that takes messages and context, generates a system message using a prompt template, and calls a model. Add this code to your **chat.py** file:
+1. Now define a `get_chat_response` function that takes messages and context, generates a system message using a prompt template, and calls a model. Add this code to your existing **chat.py** file:
 
-    :::code language="python" source="~/azureai-samples-nov2024/scenarios/inference/chat-app/chat-template.py" id="chat_function":::
+    :::code language="python" source="~/azureai-samples-temp/scenarios/inference/chat-app/chat-template.py" id="chat_function":::
 
    > [!NOTE]
    > The prompt template uses mustache format.
@@ -81,9 +81,9 @@ Let's change the script to take input from a client application and generate a s
 
 1. Now simulate passing information from a frontend application to this function. Add the following code to the end of your **chat.py** file. Feel free to play with the message and add your own name.
 
-    :::code language="python" source="~/azureai-samples-nov2024/scenarios/inference/chat-app/chat-template.py" id="create_response":::
+    :::code language="python" source="~/azureai-samples-temp/scenarios/inference/chat-app/chat-template.py" id="create_response":::
 
-    Run the script to see the response from the model with this new input.
+    Run the revised script to see the response from the model with this new input.
 
 ```bash
 python chat.py
````
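The `get_chat_response` step described in the quickstart diff lives in the referenced `chat-template.py` sample, which isn't shown here. As a rough, self-contained sketch of the same pattern (render a system message from a mustache-style template, then prepend it to the conversation), here is a plain-Python stand-in; the `render` helper, the template text, and the names used are all hypothetical, and the actual sample calls the `azure-ai-inference` chat client instead of returning the message list:

```python
# Hypothetical sketch of the quickstart's pattern: build a system message
# from a mustache-style template, then assemble the full message list a
# chat model would receive. The Azure model call is intentionally omitted.
import re

def render(template: str, context: dict) -> str:
    """Substitute {{name}} placeholders, mustache-style (sketch only)."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(context.get(m.group(1), "")),
        template,
    )

# Hypothetical template text; the real sample's template differs.
SYSTEM_TEMPLATE = (
    "You are a helpful assistant. The user's name is {{first_name}} {{last_name}}."
)

def get_chat_response(messages: list, context: dict) -> list:
    """Return the system message plus the client-supplied messages."""
    system_message = {"role": "system", "content": render(SYSTEM_TEMPLATE, context)}
    # In the real sample this list is passed to a chat completions client;
    # here we return it so the assembled shape is visible.
    return [system_message] + messages

full = get_chat_response(
    [{"role": "user", "content": "Hi! Write me a haiku about clouds."}],
    {"first_name": "Jessie", "last_name": "Irwin"},
)
print(full[0]["content"])
```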

0 commit comments
