
Commit 572c013: touchups
Parent: c51ee6e

File tree: 5 files changed (+14, −11 lines)


articles/machine-learning/prompt-flow/how-to-integrate-with-llm-app-devops.md

Lines changed: 14 additions & 11 deletions
@@ -118,7 +118,7 @@ If you prefer to work directly in code, or use Jupyter, PyCharm, Visual Studio,
 
 :::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-directory-and-yaml.png" alt-text="Screenshot of a YAML file in VS Code highlighting the default input and flow directory." lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-directory-and-yaml.png":::
 
-You can then trigger a single flow run for testing by using the prompt flow CLI or SDK as follows.
+You can then trigger a single flow run for testing by using the prompt flow CLI or SDK in the terminal as follows.
 
 # [Azure CLI](#tab/cli)
 
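For context on the flow directory shown in the screenshot above: it contains a `flow.dag.yaml` file that defines the flow's inputs, outputs, and nodes. A minimal hypothetical example (node and file names are illustrative, not from this commit) might look like:

```yaml
# Hypothetical minimal flow.dag.yaml, for illustration only.
inputs:
  text:
    type: string
    default: "hello"
outputs:
  answer:
    type: string
    reference: ${summarize.output}
nodes:
- name: summarize
  type: python
  source:
    type: code
    path: summarize.py
  inputs:
    text: ${inputs.text}
```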
@@ -128,10 +128,6 @@ To trigger a run from the working directory, run the following code:
 pf flow test --flow <directory-name>
 ```
 
-The following screenshot shows example test logs and outputs.
-
-:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-test-output-cli.png" alt-text="Screenshot of the flow test output in PowerShell." lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-test-output-cli.png":::
-
 # [Python SDK](#tab/python)
 
 ```python
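If you script these test invocations, a small helper can assemble the CLI call. This is a sketch under the assumption that the `pf` CLI is installed and on your PATH; the helper name and the `my-flow` directory are hypothetical:

```python
import subprocess  # only needed if you actually run the command


def build_pf_test_command(flow_dir, node=None, inputs=None):
    """Assemble the argv for `pf flow test` (hypothetical convenience helper)."""
    cmd = ["pf", "flow", "test", "--flow", flow_dir]
    if node:
        # Test a single node instead of the whole flow.
        cmd += ["--node", node]
    if inputs:
        # Inputs are passed as key=value pairs after --inputs.
        cmd += ["--inputs", *(f"{k}={v}" for k, v in inputs.items())]
    return cmd


# Example: subprocess.run(build_pf_test_command("my-flow"), check=True)
print(build_pf_test_command("my-flow", node="summarize", inputs={"text": "hi"}))
```

Keeping the argv as a list (rather than a shell string) avoids quoting issues when input values contain spaces.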
@@ -153,12 +149,19 @@ node_result = pf_client.test(flow=flow_path, inputs=node_inputs, node=node_name)
 print(f"Node outputs: {node_result}")
 ```
 
-The return values of the test functions are the flow and node outputs.
+---
+
+The return values are the test logs and outputs.
 
 :::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-test-output.png" alt-text="Screenshot of the flow test output in Python." lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-test-output.png":::
 
 ---
 
+The following screenshot shows example test logs and outputs.
+
+:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-test-output-cli.png" alt-text="Screenshot of the flow test output in PowerShell." lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-test-output-cli.png":::
+
+
 <a name="submitting-runs-to-the-cloud-from-local-repository"></a>
 ### Submit runs to the cloud from a local repository
 
@@ -503,17 +506,17 @@ The prompt flow SDK/CLI and the VS Code Prompt flow extension facilitate easy co
 
 1. Continue to use local iterative development.
 
-After they analyze the results of experiments, team members can return to the local environment and code repository for more development and fine-tuning, and iteratively submit subsequent runs to the cloud. This iterative approach ensures consistent enhancement until the team is satisfied with the quality for production.
+After analyzing the results of experiments, team members can return to the local environment and code repository for more development and fine-tuning, and iteratively submit subsequent runs to the cloud. This iterative approach ensures consistent enhancement until the team is satisfied with the quality for production.
 
 1. Use one-step deployment to production in the studio.
 
-Once the team is fully confident in the quality of the flow, they can seamlessly deploy it as an online endpoint in a robust cloud environment by using the **Deploy** wizard in Azure Machine Learning studio.
+Once the team is fully confident in the quality of the flow, they can seamlessly deploy it as an online endpoint in a robust cloud environment. Deployment as an online endpoint can be based on a run snapshot, allowing stable and secure serving, further resource allocation and usage tracking, and log monitoring in the cloud.
 
-:::image type="content" source="media/how-to-integrate-with-llm-app-devops/deploy-wizard.png" alt-text="Screenshot of deploy wizard." lightbox = "media/how-to-integrate-with-llm-app-devops/deploy-wizard.png":::
+:::image type="content" source="media/how-to-integrate-with-llm-app-devops/deploy-from-snapshot.png" alt-text="Screenshot of deploying flow from a run snapshot." lightbox = "media/how-to-integrate-with-llm-app-devops/deploy-from-snapshot.png":::
 
-Deployment as an online endpoint can be based on a run snapshot, allowing stable and secure serving, further resource allocation and usage tracking, and log monitoring in the cloud.
+The Azure Machine Learning studio **Deploy** wizard helps you easily configure your deployment.
 
-:::image type="content" source="media/how-to-integrate-with-llm-app-devops/deploy-from-snapshot.png" alt-text="Screenshot of deploying flow from a run snapshot." lightbox = "media/how-to-integrate-with-llm-app-devops/deploy-from-snapshot.png":::
+:::image type="content" source="media/how-to-integrate-with-llm-app-devops/deploy-wizard.png" alt-text="Screenshot of deploy wizard." lightbox = "media/how-to-integrate-with-llm-app-devops/deploy-wizard.png":::
 
 ## Related content
 
