Commit ed13b46 — Edit pass (1 parent a78f02a)

1 file changed: docs/intelligentapps/promptbuilder.md (+37 additions, −20 deletions)

MetaDescription: Get Started with creating, iterating and optimizing your prompt
---
# Prompt engineering in AI Toolkit

Prompt builder in AI Toolkit streamlines the prompt engineering workflow. It can generate starter prompts, help you iterate and refine with each run, break down complex tasks through prompt chaining and structured outputs, and provide easy access to code for seamless Large Language Model (LLM) integration via APIs.

![Getting started with prompt builder](./images/promptbuilder/promptbuilder.gif)

## Create, edit, and test prompts

To access the prompt builder, use either of these options:

- In the AI Toolkit view, select **Prompt Builder**
- Select **Try in Prompt Builder** from a model card in the model catalog

To test a prompt in the prompt builder, follow these steps:

1. In **Models**, select a model from the dropdown list, or select **Browse models** to add another model from the model catalog.

    ![select a model](./images/promptbuilder/s1_models.png)

1. Enter a **User prompt** and optionally enter a **System prompt**.

    The *user prompt* is the input that you want to send to the model. The optional *system prompt* is used to provide instructions with relevant context to guide the model response.

    > [!TIP]
    > If you don't know how to input these prompts, you can describe your project idea in natural language and let the AI-powered feature generate prompts for you to experiment with.
    > ![generate prompts with natural language](./images/promptbuilder/generate_prompt.gif)

1. Select **Run** to send the prompts to the selected model.

1. Optionally, select **Add Prompts** to add more user and assistant prompts to the conversation, or select **Use Response as Assistant Prompt** to include the model's response as history and context that further guides the model's behavior.

1. Repeat the previous steps to iterate over your prompts, observing the model response and making changes to the prompts.
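Conceptually, these steps build up a chat message history. A minimal sketch of how that history grows, using the common system/user/assistant role convention (the prompt text here is illustrative, not from AI Toolkit):

```python
# The message history that "Add Prompts" and "Use Response as
# Assistant Prompt" build up, sketched as a plain list. Role names
# follow the common chat-completions convention.
messages = [
    {"role": "system", "content": "You are a concise technical assistant."},
    {"role": "user", "content": "Summarize what a system prompt does."},
]

# "Use Response as Assistant Prompt": append the model's reply so it
# becomes context for the next run.
model_reply = "A system prompt sets instructions that guide every response."
messages.append({"role": "assistant", "content": model_reply})

# "Add Prompts": add the next user turn; the full list is what is
# sent to the model the next time you select Run.
messages.append({"role": "user", "content": "Give an example system prompt."})

print([m["role"] for m in messages])  # ['system', 'user', 'assistant', 'user']
```

Each run sends the whole list, which is why appending the previous response steers later answers.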

## Structured output

Structured output support helps you design prompts to deliver outputs in a structured, predictable format.

![Use structured output](./images/promptbuilder/structured_output.gif)

To get structured output in the prompt builder, follow these steps:

1. Select the **Format** dropdown in the **Response** area, and select **json_schema**.

1. Select **Prepare schema**, and then select **Select local file** to use your own schema, or select **Use an example** to use a predefined schema.

    If you proceed with an example, you can select a schema from the dropdown list.

1. Select **Run** to send the prompts to the selected model.

1. You can also edit the schema by selecting **Edit**.

    ![edit schema](./images/promptbuilder/edit_schema.png)
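A local schema file is ordinary JSON Schema. A minimal sketch of the kind of file you could point **Select local file** at (the "recipe" fields are illustrative, not one of AI Toolkit's predefined examples):

```python
import json

# An illustrative named JSON Schema for structured output: the model
# is constrained to return an object with these fields.
recipe_schema = {
    "name": "recipe",
    "schema": {
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "servings": {"type": "integer"},
            "ingredients": {"type": "array", "items": {"type": "string"}},
        },
        "required": ["title", "ingredients"],
    },
}

# Serialize exactly as you would write it to a .json file on disk.
schema_text = json.dumps(recipe_schema, indent=2)
print(json.loads(schema_text)["name"])  # recipe
```

Tighter schemas (required fields, item types) make the model's output easier to parse downstream.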

## Integrate prompt engineering into your application

After experimenting with models and prompts, you can get into coding right away with the automatically generated Python code.

![view code](./images/promptbuilder/view_code.gif)

To view the Python code, follow these steps:

1. Select **View Code**.

1. For models hosted on GitHub, select the inference SDK you want to use.

    AI Toolkit generates the code for the model you selected by using the provider's client SDK. For models hosted by GitHub, you can choose which inference SDK you want to use: the [Azure AI Inference SDK](https://learn.microsoft.com/python/api/overview/azure/ai-inference-readme?view=azure-python-preview) or the SDK from the model provider, such as the [OpenAI SDK](https://platform.openai.com/docs/libraries) or the [Mistral API](https://docs.mistral.ai/api).

1. The generated code snippet is shown in a new editor, where you can copy it into your application.

> To authenticate with the model, you usually need an API key from the provider. To access models hosted by GitHub, [generate a personal access token](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens) (PAT) in your GitHub settings.
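Whichever SDK you pick, the generated snippet ultimately sends a chat-completions style request authenticated with your key or PAT. A minimal stdlib sketch of that request body, assuming the PAT is in a `GITHUB_TOKEN` environment variable (the endpoint and model name below are illustrative placeholders, not values from the generated code):

```python
import json
import os

# Illustrative placeholders; the generated snippet fills in the real
# endpoint, model name, and SDK-specific client calls for you.
endpoint = "https://models.inference.ai.azure.com/chat/completions"
token = os.environ.get("GITHUB_TOKEN", "<your-pat>")

headers = {
    "Authorization": f"Bearer {token}",  # the PAT authenticates the call
    "Content-Type": "application/json",
}
payload = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
}

# The chosen SDK performs the equivalent of POSTing this body to the
# endpoint; no request is actually sent in this sketch.
body = json.dumps(payload)
print(json.loads(body)["messages"][1]["role"])  # user
```

The system and user prompts you refined in the prompt builder map directly onto the `messages` list.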

## Next steps

- [Run an evaluation job](/docs/intelligentapps/evaluation.md) with the popular evaluators