articles/machine-learning/prompt-flow/how-to-integrate-with-llm-app-devops.md
:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-directory-and-yaml.png" alt-text="Screenshot of a YAML file in VS Code highlighting the default input and flow directory." lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-directory-and-yaml.png":::
You can then trigger a single flow run for testing by using the prompt flow CLI or SDK in the terminal as follows.
# [Azure CLI](#tab/cli)
To trigger a run from the working directory, run the following code:
```
pf flow test --flow <directory-name>
```
The return values of the test functions are the flow and node outputs.
---
The return values are the test logs and outputs.
:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-test-output.png" alt-text="Screenshot of the flow test output in Python." lightbox="./media/how-to-integrate-with-llm-app-devops/flow-test-output.png":::
---
The following screenshot shows example test logs and outputs.
:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-test-output-cli.png" alt-text="Screenshot of the flow test output in PowerShell." lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-test-output-cli.png":::
### Submit runs to the cloud from a local repository
1. Continue to use local iterative development.
    After analyzing the results of experiments, team members can return to the local environment and code repository for more development and fine-tuning, and iteratively submit subsequent runs to the cloud. This iterative approach ensures consistent enhancement until the team is satisfied with the quality for production.
1. Use one-step deployment to production in the studio.
    Once the team is fully confident in the quality of the flow, they can seamlessly deploy it as an online endpoint in a robust cloud environment. Deployment as an online endpoint can be based on a run snapshot, allowing stable and secure serving, further resource allocation and usage tracking, and log monitoring in the cloud.
    :::image type="content" source="media/how-to-integrate-with-llm-app-devops/deploy-from-snapshot.png" alt-text="Screenshot of deploying flow from a run snapshot." lightbox="media/how-to-integrate-with-llm-app-devops/deploy-from-snapshot.png":::
    The Azure Machine Learning studio **Deploy** wizard helps you easily configure your deployment.
    :::image type="content" source="media/how-to-integrate-with-llm-app-devops/deploy-wizard.png" alt-text="Screenshot of the deploy wizard." lightbox="media/how-to-integrate-with-llm-app-devops/deploy-wizard.png":::