articles/ai-studio/how-to/llmops-azure-devops-promptflow.md (2 additions, 11 deletions)
@@ -6,18 +6,16 @@ services: azure-ai-studio
 author: ritesh-modi
 ms.author: rimod
 ms.service: azure-ai-studio
-ms.subservice: prompt-flow
 ms.topic: how-to
 ms.reviewer: lagayhar
-ms.date: 23/07/2024
 ms.custom:
 - cli-v2
 - sdk-v2
 - ignite-2024
 - build-2024
 ---

-# LLMOps with prompt flow and Azure DevOps
+# Streamlining LLMOps with Prompt Flow and Azure DevOps: A Comprehensive Approach

 Large Language Operations, or LLMOps, is the cornerstone of efficient prompt engineering and LLM-infused application development and deployment. As the demand for LLM-infused applications continues to soar, organizations find themselves in need of a cohesive and streamlined process to manage their end-to-end lifecycle.
@@ -47,7 +45,7 @@ LLMOps with prompt flow is a "LLMOps template and guidance" to help you build LL
 - **Centralized Code Hosting**: This repo supports hosting code for multiple flows based on prompt flow, providing a single repository for all your flows. Think of this platform as a single repository where all your prompt flow code resides. It's like a library for your flows, making it easy to find, access, and collaborate on different projects.

 - **Lifecycle Management**: Each flow enjoys its own lifecycle, allowing for smooth transitions from local experimentation to production deployment.
-:::image type="content" source=".../media/prompt-flow/llmops/pipeline.png" alt-text="Screenshot of pipeline." lightbox = "../media/prompt-flow/llmops/pipeline.png":::
+:::image type="content" source="../media/prompt-flow/llmops/pipeline.png" alt-text="Screenshot of pipeline." lightbox = "../media/prompt-flow/llmops/pipeline.png":::

 - **Variant and Hyperparameter Experimentation**: Experiment with multiple variants and hyperparameters, evaluating flow variants with ease. Variants and hyperparameters are like ingredients in a recipe. This platform allows you to experiment with different combinations of variants across multiple nodes in a flow.
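For context on the **Variant and Hyperparameter Experimentation** item in the hunk above: prompt flow models variants as alternative node definitions inside flow.dag.yaml. The excerpt below is an illustrative sketch only; the node name, file names, and deployment name are invented, and the keys follow the open-source prompt flow schema rather than anything in this commit, so verify them against the template repo before relying on them.

    # flow.dag.yaml (excerpt), hypothetical node and file names
    nodes:
    - name: summarize_text
      use_variants: true
    node_variants:
      summarize_text:
        default_variant_id: variant_0
        variants:
          variant_0:
            node:
              type: llm
              source:
                type: code
                path: summarize_text.jinja2
              inputs:
                deployment_name: gpt-35-turbo
                temperature: "0.2"
                text: ${inputs.text}
              connection: aoai
              api: chat
          variant_1:
            node:
              type: llm
              source:
                type: code
                path: summarize_text__variant_1.jinja2
              inputs:
                deployment_name: gpt-35-turbo
                temperature: "0.8"
                text: ${inputs.text}
              connection: aoai
              api: chat

A specific variant can then be selected for a bulk run, for example with pf run create --flow . --data data.jsonl --variant '${summarize_text.variant_1}'; this is a hedged example of open-source prompt flow CLI usage, not a command taken from the template.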
@@ -63,7 +61,6 @@ LLMOps with prompt flow is a "LLMOps template and guidance" to help you build LL

 - **Comprehensive Reporting**: Generate detailed reports for each variant configuration, allowing you to make informed decisions. Provides detailed Metric collection, experiment, and variant bulk runs for all runs and experiments, enabling data-driven decisions in csv as well as HTML files.
 :::image type="content" source="../media/prompt-flow/llmops/metrics.png" alt-text="Screenshot of metrics report." lightbox = "../media/prompt-flow/llmops/metrics.png":::

 Other features for customization:
 - Offers **BYOF** (bring-your-own-flows). A **complete platform** for developing multiple use-cases related to LLM-infused applications.
@@ -134,9 +131,6 @@ The repository for this article is available at [LLMOps with Prompt flow templat

 From here on, you can learn **LLMOps with prompt flow** by following the end-to-end samples we provided, which help you build LLM-infused applications using prompt flow and Azure DevOps. Its primary objective is to provide assistance in the development of such applications, leveraging the capabilities of prompt flow and LLMOps.

-> [!TIP]
-> We recommend you understand how to integrate [LLMOps with prompt flow](how-to-integrate-with-llm-app-devops.md).
-
 ## Prerequisites

 - An Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the [Azure AI Studio](https://azure.microsoft.com/free/).
@@ -162,8 +156,6 @@ Prompt flow uses connections resource to connect to endpoints like Azure OpenAI,

 Connections can be created through **prompt flow portal UI** or using the **REST API**. Please follow the [guidelines](https://github.com/microsoft/llmops-promptflow-template/blob/main/docs/Azure_devops_how_to_setup.md#setup-connections-for-prompt-flow) to create connections for prompt flow.

-Click on the link to know more about [connections](./concept-connections.md).
-
 > [!NOTE]
 >
 > The sample flows use 'aoai' connection and connection named 'aoai' should be created to execute them.
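For reference on the 'aoai' connection called out in the note above: the article points to the portal UI and the REST API, but a connection can also be defined locally with the open-source prompt flow CLI. The sketch below uses placeholder values and is not part of this commit; verify the field names and creation steps against the linked setup guide.

    # connection.aoai.yaml (placeholder values only, not from this repo)
    name: aoai
    type: azure_open_ai
    api_base: "https://<your-resource-name>.openai.azure.com/"
    api_type: azure
    api_version: "<api-version>"
    api_key: "<your-api-key>"

    # Create the connection, overriding the secret on the command line:
    # pf connection create --file connection.aoai.yaml --set api_key=<your-key>

Passing the key with --set rather than committing it in the YAML keeps the file safe to store alongside the flow code.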
articles/ai-studio/how-to/llmops-github-promptflow.md (1 addition, 10 deletions)
@@ -6,18 +6,16 @@ services: azure-ai-studio
 author: ritesh-modi
 ms.author: rimod
 ms.service: azure-ai-studio
-ms.subservice: prompt-flow
 ms.topic: how-to
 ms.reviewer: lagayhar
-ms.date: 23/07/2024
 ms.custom:
 - cli-v2
 - sdk-v2
 - ignite-2024
 - build-2024
 ---

-# LLMOps with prompt flow and GitHub
+# Elevating LLMOps with Prompt Flow and GitHub: A Unified Strategy for AI Workflows

 Large Language Operations, or LLMOps, is the cornerstone of efficient prompt engineering and LLM-infused application development and deployment. As the demand for LLM-infused applications continues to soar, organizations find themselves in need of a cohesive and streamlined process to manage their end-to-end lifecycle.
@@ -62,7 +60,6 @@ LLMOps with prompt flow is a "LLMOps template and guidance" to help you build LL

 - **Comprehensive Reporting**: Generate detailed reports for each **variant configuration**, allowing you to make informed decisions. Provides detailed Metric collection, experiment, and variant bulk runs for all runs and experiments, enabling data-driven decisions in csv as well as HTML files.
 :::image type="content" source="../media/prompt-flow/llmops/metrics.png" alt-text="Screenshot of metrics report." lightbox = "../media/prompt-flow/llmops/metrics.png":::

 Other features for customization:
 - Offers **BYOF** (bring-your-own-flows). A **complete platform** for developing multiple use-cases related to LLM-infused applications.
@@ -133,10 +130,6 @@ The repository for this article is available at [LLMOps with Prompt flow templat

 From here on, you can learn **LLMOps with prompt flow** by following the end-to-end samples we provided, which help you build LLM-infused applications using prompt flow and GitHub. Its primary objective is to provide assistance in the development of such applications, using the capabilities of prompt flow and LLMOps.

-> [!TIP]
-> We recommend you understand how to integrate [LLMOps with prompt flow](how-to-integrate-with-llm-app-devops.md).
-
-
 ## Prerequisites

 - An Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the [Azure AI Studio](https://azure.microsoft.com/free/).
@@ -161,7 +154,6 @@ Prompt Flow uses connections resource to connect to endpoints like Azure OpenAI,

 Connections can be created through **prompt flow portal UI** or using the **REST API**. Follow the [guidelines](https://github.com/microsoft/llmops-promptflow-template/blob/main/docs/Azure_devops_how_to_setup.md#setup-connections-for-prompt-flow) to create connections for prompt flow.

-Select on the link to know more about [connections](./concept-connections.md).
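As in the Azure DevOps article above, the sample flows in this template use a connection named 'aoai'. If that connection was created locally with the open-source prompt flow CLI, which is an assumption here rather than something this diff prescribes, it can be confirmed before wiring up the GitHub workflows:

    # Illustrative checks using the open-source prompt flow CLI
    pf connection list                  # lists locally registered connections
    pf connection show --name aoai      # shows details of the 'aoai' connection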