articles/machine-learning/prompt-flow/how-to-create-manage-runtime.md (+5 -5)
@@ -38,7 +38,7 @@ To use the runtime, assigning the `AzureML Data Scientist` role of workspace to
## Permissions/roles for deployments
-After deploying a prompt flow, the endpoint must be assigned the `AzureML Data Scientist` role to the workspace for successful inferencing. This can be done at any point after the endpoint has been created.
+After deploying a prompt flow, the endpoint must be assigned the `AzureML Data Scientist` role on the workspace for successful inferencing. This assignment can be done at any point after the endpoint has been created.
## Create runtime in UI
@@ -61,7 +61,7 @@ Automatic is the default option for runtime, you can start automatic runtime (pr
:::image type="content" source="./media/how-to-create-manage-runtime/runtime-create-automatic-init.png" alt-text="Screenshot of prompt flow on the start automatic with default settings on flow page. " lightbox = "./media/how-to-create-manage-runtime/runtime-create-automatic-init.png":::
-2. Start with advanced settings, you can customize the VM size used by the runtime. You can also customize the idle time, which will delete runtime automatically if it isn't in use to save code. Meanwhile, you can set the user assigned manage identity used by automatic runtime, it will be used to pull base image (please make sure user assigned manage identity have ACR pull permission) and install packages. If you don't set it, we'll use user identity as default. Learn more about [how to create update user assigned identities to workspace](../how-to-identity-based-service-authentication.md#to-create-a-workspace-with-multiple-user-assigned-identities-use-one-of-the-following-methods).
+2. If you start with advanced settings, you can customize the VM size used by the runtime. You can also customize the idle time, which automatically deletes the runtime if it isn't in use, to save compute cost. You can also set the user-assigned managed identity used by the automatic runtime; it's used to pull the base image (make sure the user-assigned managed identity has ACR pull permission) and to install packages. If you don't set it, the user identity is used by default. Learn more about [how to create and update user-assigned identities for a workspace](../how-to-identity-based-service-authentication.md#to-create-a-workspace-with-multiple-user-assigned-identities-use-one-of-the-following-methods).
:::image type="content" source="./media/how-to-create-manage-runtime/runtime-creation-automatic-settings.png" alt-text="Screenshot of prompt flow on the start automatic with advanced setting on flow page. " lightbox = "./media/how-to-create-manage-runtime/runtime-creation-automatic-settings.png":::
@@ -74,7 +74,7 @@ If you don't have a compute instance, create a new one: [Create and manage an Az
1. Select the compute instance you want to use as the runtime.
:::image type="content" source="./media/how-to-create-manage-runtime/runtime-creation-ci-runtime-select-ci.png" alt-text="Screenshot of add compute instance runtime with select compute instance highlighted. " lightbox = "./media/how-to-create-manage-runtime/runtime-creation-ci-runtime-select-ci.png":::
Because compute instances are isolated by user, you can only see your own compute instances or the ones assigned to you. To learn more, see [Create and manage an Azure Machine Learning compute instance](../how-to-create-compute-instance.md).
-1. Authenticate on the compute instance. You only need to do auth one time per region in six month.
+1. Authenticate on the compute instance. You only need to authenticate once per region every six months.
:::image type="content" source="./media/how-to-create-manage-runtime/runtime-creation-authentication.png" alt-text="Screenshot of doing the authentication on compute instance. " lightbox = "./media/how-to-create-manage-runtime/runtime-creation-authentication.png":::
1. Select create new custom application or existing custom application as runtime.
1. Select create new custom application as runtime.
@@ -126,7 +126,7 @@ You can also customize environment used to run this flow.
:::image type="content" source="./media/how-to-create-manage-runtime/runtime-create-automatic-save-install.png" alt-text="Screenshot of save and install packages for automatic runtime (preview) on flow page. " lightbox = "./media/how-to-create-manage-runtime/runtime-create-automatic-save-install.png":::
-#### Add packages in private feed in Azure devops
+#### Add packages in private feed in Azure DevOps
If you want to use a private feed in Azure DevOps, add the Managed Identity to the Azure DevOps organization. To learn more, see [Use service principals & managed identities](/azure/devops/integrate/get-started/authentication/service-principal-managed-identity)
@@ -140,7 +140,7 @@ You need add `{private}` to your private feed url. Such as if you want to instal
test_package
```
-- By default, we'll use latest prompt flow image as base image. If you want to use a different base image, you can build custom base image learn more, see [Customize environment with docker context for runtime](how-to-customize-environment-runtime.md#customize-environment-with-docker-context-for-runtime), then you can use put it under `environment` in `flow.dag.yaml` file in flow folder. You need `reset` runtime to use the new base image, this takes several minutes as it pulls the new base image and install packages again.
+- By default, the latest prompt flow image is used as the base image. If you want to use a different base image, you can build a custom base image (see [Customize environment with docker context for runtime](how-to-customize-environment-runtime.md#customize-environment-with-docker-context-for-runtime)) and then put it under `environment` in the `flow.dag.yaml` file in the flow folder. You need to `reset` the runtime to use the new base image; this takes several minutes because it pulls the new base image and installs packages again.
:::image type="content" source="./media/how-to-create-manage-runtime/runtime-creation-automatic-image-flow-dag.png" alt-text="Screenshot of customize environment for automatic runtime on flow page. " lightbox = "./media/how-to-create-manage-runtime/runtime-creation-automatic-image-flow-dag.png":::
articles/machine-learning/prompt-flow/how-to-customize-environment-runtime.md (+3 -3)
@@ -117,11 +117,11 @@ To learn more about environment CLI, see [Manage environments](../how-to-manage-
## Customize environment with flow folder for automatic runtime (preview)
-In `flow.dag.yaml` file in prompt flow folder, you can use `environment` section we can define the environment for the flow. It include two parts:
-- image: which is the base image for the flow, if ommitted, it will use the latest version of prompt flow base image `mcr.microsoft.com/azureml/promptflow/promptflow-runtime-stable:<newest_version>`. If you want to customize the environment, you can use the image you created in previous section.
+In the `flow.dag.yaml` file in the prompt flow folder, you can use the `environment` section to define the environment for the flow. It includes two parts:
+- image: the base image for the flow. If omitted, the latest version of the prompt flow base image `mcr.microsoft.com/azureml/promptflow/promptflow-runtime-stable:<newest_version>` is used. If you want to customize the environment, you can use the image you created in the previous section.
- You can also specify packages in `requirements.txt`. Both the automatic runtime and flow deployment from the UI use the environment defined in the `flow.dag.yaml` file.
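For reference, a minimal sketch of such an `environment` section is shown below; the `python_requirements_txt` key name and the image tag are assumptions for illustration, so verify them against the current prompt flow schema.

```yaml
# Sketch only: environment section of flow.dag.yaml.
# python_requirements_txt is an assumed key name; verify against the current schema.
environment:
  image: mcr.microsoft.com/azureml/promptflow/promptflow-runtime-stable:<newest_version>
  python_requirements_txt: requirements.txt   # packages to install on top of the base image
```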
-:::image type="content" source="./media/how-to-create-manage-runtime/runtime-creation-automatic-image-flow-dag.png" alt-text="Screenshot of customize environment for automatic runtime on flow page. " lightbox = "./media/how-to-create-manage-runtime/runtime-creation-automatic-image-flow-dag.png":::
+:::image type="content" source="./media/how-to-customize-environment-runtime/runtime-creation-automatic-image-flow-dag.png" alt-text="Screenshot of customize environment for automatic runtime on flow page. " lightbox = "./media/how-to-customize-environment-runtime/runtime-creation-automatic-image-flow-dag.png":::
If you want to use private feeds in Azure DevOps, see [Add packages in private feed in Azure DevOps](./how-to-create-manage-runtime.md#add-packages-in-private-feed-in-azure-devops).
articles/machine-learning/prompt-flow/how-to-deploy-for-real-time-inference.md (+2 -2)
@@ -60,11 +60,11 @@ When you deploy prompt flow to managed online endpoint in UI, by default the dep
If you are using a custom environment to create the compute instance runtime, you can find the image on the environment detail page in Azure Machine Learning studio. To learn more, see [Customize environment with docker context for runtime](how-to-customize-environment-runtime.md#customize-environment-with-docker-context-for-runtime).
-:::image type="content" source="./media/how-to-customize-environment-runtime/runtime-creation-image-environment.png" alt-text="Screenshot of image name in environment detail page. " lightbox = "./media/how-to-customize-environment-runtime/runtime-creation-image-environment.png":::
+:::image type="content" source="./media/how-to-deploy-for-real-time-inference/runtime-creation-image-environment.png" alt-text="Screenshot of image name in environment detail page. " lightbox = "./media/how-to-deploy-for-real-time-inference/runtime-creation-image-environment.png":::
Then you also need to specify the image in the `environment` section of the `flow.dag.yaml` file in the flow folder.
-:::image type="content" source="./media/how-to-create-manage-runtime/runtime-creation-automatic-image-flow-dag.png" alt-text="Screenshot of customize environment for automatic runtime on flow page. " lightbox = "./media/how-to-create-manage-runtime/runtime-creation-automatic-image-flow-dag.png":::
+:::image type="content" source="./media/how-to-deploy-for-real-time-inference/runtime-creation-automatic-image-flow-dag.png" alt-text="Screenshot of customize environment for automatic runtime on flow page. " lightbox = "./media/how-to-deploy-for-real-time-inference/runtime-creation-automatic-image-flow-dag.png":::
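For illustration only, a hedged sketch of how the `environment` section might reference a custom image; the registry, image, and tag placeholders are not values from this article, and `python_requirements_txt` is an assumed key name.

```yaml
# Hypothetical sketch: point the flow at the image of your custom environment.
environment:
  image: <your_registry>.azurecr.io/<your_custom_image>:<tag>   # image name from the environment detail page
  python_requirements_txt: requirements.txt
```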
> [!NOTE]
> If you are using private feeds in Azure DevOps, you need to [build the image with private feeds](./how-to-create-manage-runtime.md#add-packages-in-private-feed-in-azure-devops) first and select a custom environment to deploy in the UI.
articles/machine-learning/prompt-flow/how-to-integrate-with-llm-app-devops.md (+12 -12)
@@ -16,7 +16,7 @@ ms.date: 11/02/2023
# Integrate prompt flow with LLM-based application DevOps
-In this article, you'll learn about the integration of prompt flow with LLM-based application DevOps in Azure Machine Learning. Prompt flow offers a developer-friendly and easy-to-use code-first experience for flow developing and iterating with your entire LLM-based application development workflow.
+In this article, you learn about the integration of prompt flow with LLM-based application DevOps in Azure Machine Learning. Prompt flow offers a developer-friendly and easy-to-use code-first experience for developing and iterating on flows throughout your entire LLM-based application development workflow.
It provides a **prompt flow SDK and CLI**, a **VS Code extension**, and the new **flow folder explorer** UI to facilitate local development of flows, local triggering of flow runs and evaluation runs, and transitioning flows from local to cloud (Azure Machine Learning workspace) environments.
@@ -49,10 +49,10 @@ Overview of the flow folder structure and the key files it contains:
-**flow.dag.yaml**: This primary flow definition file, in YAML format, includes information about inputs, outputs, nodes, tools, and variants used in the flow. It's integral for authoring and defining the prompt flow.
-**Source code files (.py, .jinja2)**: The flow folder also includes user-managed source code files, which are referred to by the tools/nodes in the flow.
- Files in Python (.py) format can be referenced by the python tool for defining custom python logic.
-- Files in Jinja2 (.jinja2) format can be referenced by the prompt tool or LLM tool for defining prompt context.
--**Non-source files**: The flow folder can also contain non-source files such as utility files and data files that can be included in the source files.
+- Files in Jinja 2 (.jinja2) format can be referenced by the prompt tool or LLM tool for defining prompt context.
+-**Non-source files**: The flow folder can also contain nonsource files such as utility files and data files that can be included in the source files.
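For orientation, a skeletal `flow.dag.yaml` sketch follows; the node, file, and input names are purely illustrative and the exact node fields are assumptions, but it shows how inputs, outputs, and nodes reference the source files described above.

```yaml
# Illustrative skeleton only; names and exact node fields are assumptions.
inputs:
  url:
    type: string
outputs:
  category:
    type: string
    reference: ${classify_with_llm.output}
nodes:
- name: classify_with_llm
  type: llm
  source:
    type: code
    path: classify_with_llm.jinja2   # a .jinja2 prompt file in the flow folder
  inputs:
    url: ${inputs.url}
```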
-Once the flow is created, you can navigate to the Flow Authoring Page to view and operate the flow files in the right file explorer. This allows you to view, edit, and manage your files. Any modifications made to the files will be directly reflected in the file share storage.
+Once the flow is created, you can navigate to the flow authoring page to view and operate on the flow files in the file explorer on the right, where you can view, edit, and manage your files. Any modifications made to the files are directly reflected in the file share storage.
:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-file-explorer.png" alt-text="Screenshot of standard flow highlighting the files explorer. " lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-file-explorer.png":::
@@ -68,7 +68,7 @@ Alternatively, you can access all the flow folders directly within the Azure Mac
## Versioning prompt flow in code repository
-To check in your flow into your code repository, you can easily export the flow folder from the flow authoring page to your local system. This will download a package containing all the files from the explorer to your local machine, which you can then check into your code repository.
+To check your flow into your code repository, export the flow folder from the flow authoring page to your local system. This downloads a package containing all the files from the explorer to your local machine, which you can then check into your code repository.
:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/flow-export.png" alt-text="Screenshot of showing the download button in the file explorer." lightbox = "./media/how-to-integrate-with-llm-app-devops/flow-export.png":::
@@ -154,7 +154,7 @@ connections:
deployment_name: <deployment_name>
```
-You can specify the connection and deployment name for each tool in the flow. If you don't specify the connection and deployment name, it will use the one connection and deployment on the `flow.dag.yaml` file. To format of connections:
+You can specify the connection and deployment name for each tool in the flow. If you don't specify them, the connection and deployment defined in the `flow.dag.yaml` file are used. The format of connections is as follows:
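The format block itself isn't visible in this hunk; a minimal sketch of the expected shape, with a hypothetical node name and connection, is:

```yaml
# Hypothetical sketch: per-node connection overrides in run.yaml.
connections:
  classify_with_llm:                       # node name in the flow
    connection: my_azure_open_ai_connection
    deployment_name: <deployment_name>
```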
-runtime=runtime, # if ommited, it will use the automatic runtime, you can also specify the runtime name, specif automatic will also use the automatic runtime.
+runtime=runtime, # if omitted, the automatic runtime is used; you can also specify a runtime name, and specifying "automatic" also uses the automatic runtime.
# resources = resources, # only work for automatic runtime, will be ignored if you specify the runtime name.
column_mapping={
"url": "${data.url}"
@@ -222,7 +222,7 @@ column_mapping:
prediction: ${run.outputs.category}
# define cloud resource
-# if ommited, it will use the automatic runtime, you can also specify the runtime name, specif automatic will also use the automatic runtime.
+# if omitted, the automatic runtime is used; you can also specify a runtime name, and specifying "automatic" also uses the automatic runtime.
runtime: <runtime_name>
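Pulling these fields together, a hedged sketch of a complete evaluation `run.yaml` might look like the following; the flow path, data path, run name, and runtime name are placeholders, not values from this article.

```yaml
# Illustrative run.yaml sketch; paths and names are placeholders.
flow: ./eval-flow                          # folder containing flow.dag.yaml
data: ./data/samples.jsonl                 # batch input data
run: my_base_run                           # the run whose outputs are evaluated
column_mapping:
  groundtruth: ${data.answer}
  prediction: ${run.outputs.category}
# if omitted, the automatic runtime is used; otherwise specify a runtime name
runtime: my-runtime
```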
@@ -275,7 +275,7 @@ eval_run = pf.run(
"groundtruth": "${data.answer}",
"prediction": "${run.outputs.category}",
},
-runtime=runtime, # if ommited, it will use the automatic runtime, you can also specify the runtime name, specif automatic will also use the automatic runtime.
+runtime=runtime, # if omitted, the automatic runtime is used; you can also specify a runtime name, and specifying "automatic" also uses the automatic runtime.
# resources = resources, # only work for automatic runtime, will be ignored if you specify the runtime name.
connections=connections
)
@@ -355,7 +355,7 @@ To use the extension:
1. Open a prompt flow folder in VS Code Desktop.
2. Open the `flow.dag.yaml` file in notebook view.
3. Use the visual editor to make any necessary changes to your flow, such as tuning the prompts in variants or adding more tools.
-4. To test your flow, select the **Run Flow** button at the top of the visual editor. This will trigger a flow test.
+4. To test your flow, select the **Run Flow** button at the top of the visual editor. This triggers a flow test.
:::image type="content" source="./media/how-to-integrate-with-llm-app-devops/run-flow-visual-editor.png" alt-text="Screenshot of VS Code showing running the flow in the visual editor. " lightbox = "./media/how-to-integrate-with-llm-app-devops/run-flow-visual-editor.png":::
@@ -453,7 +453,7 @@ The introduction of the prompt flow **SDK/CLI** and the **Visual Studio Code Ext
- The first step of this collaborative process involves using a code repository as the base for your project code, which includes the prompt flow code.
- This centralized repository enables efficient organization, tracking of all code changes, and collaboration among team members.
-- Once the repository is set up, team members can leverage the VSC extension for local authoring and single input testing of the flow.
+- Once the repository is set up, team members can use the VS Code extension for local authoring and single-input testing of the flow.
- This standardized integrated development environment fosters collaboration among multiple members working on different aspects of the flow.
:::image type="content" source="media/how-to-integrate-with-llm-app-devops/prompt-flow-local-develop.png" alt-text="Screenshot of local development. " lightbox = "media/how-to-integrate-with-llm-app-devops/prompt-flow-local-develop.png":::
1. Cloud-based experimental batch testing and evaluation - prompt flow CLI/SDK and workspace portal UI
@@ -467,7 +467,7 @@ The introduction of the prompt flow **SDK/CLI** and the **Visual Studio Code Ext
:::image type="content" source="media/how-to-integrate-with-llm-app-devops/cloud-run-list.png" alt-text="Screenshot of run list in workspace. " lightbox = "media/how-to-integrate-with-llm-app-devops/cloud-run-list.png":::
:::image type="content" source="media/how-to-integrate-with-llm-app-devops/cloud-run-compare.png" alt-text="Screenshot of run comparison in workspace. " lightbox = "media/how-to-integrate-with-llm-app-devops/cloud-run-compare.png":::
1. Local iterative development or one-step UI deployment for production
-- Following the analysis of experiments, team members can return to the code repository for additional development and fine-tuning. Subsequent runs can then be submitted to the cloud in an iterative manner.
+- Following the analysis of experiments, team members can return to the code repository for further development and fine-tuning. Subsequent runs can then be submitted to the cloud in an iterative manner.
- This iterative approach ensures consistent enhancement until the team is satisfied with the quality ready for production.
- Once the team is fully confident in the quality of the flow, it can be seamlessly transitioned into production via a UI deployment wizard as an online endpoint in Azure Machine Learning, in a robust cloud environment.
- This deployment on an online endpoint can be based on a run snapshot, allowing for stable and secure serving, further resource allocation and usage tracking, and log monitoring in the cloud.