---
ms.topic: how-to
author: lgayhardt
ms.author: lagayhar
ms.reviewer: chenlujiao
ms.date: 10/22/2024
---
# Integrate LangChain in prompt flows
The [LangChain](https://python.langchain.com) Python library is a framework for developing applications powered by large language models (LLMs), agents, and dependency tools. You can use LangChain in Azure Machine Learning prompt flows. This article shows you how to supercharge your LangChain development with prompt flow.
The integration of LangChain with prompt flow is a powerful combination that can help you build and test your custom language models with ease. You can use LangChain modules to initially build the flow, and then use the prompt flow process to scale experiments for bulk testing, evaluation, and eventual deployment. For example, you can conduct large-scale experiments based on larger datasets.
If you already have a local prompt flow based on LangChain code, you can use streamlined prompt flow integration to easily convert it into an Azure Machine Learning prompt flow for further experimentation. Or, if you prefer to use LangChain SDK classes and functions directly, you can easily build flows that use Python nodes containing your custom LangChain code.
## Prerequisites
- A local LangChain flow that's properly tested and ready for deployment.
- A compute session that can run the prompt flow by adding packages listed in the *requirements.txt* file, including `langchain`. For more information, see [Manage prompt flow compute session](how-to-manage-compute-session.md).
## Convert LangChain code into prompt flows
The rest of the article describes how to convert your local LangChain code to a runnable Azure Machine Learning prompt flow.
### Convert credentials to a prompt flow connection
Your LangChain code might [define environment variables](https://python.langchain.com/docs/integrations/platforms/microsoft) to store credentials, such as the AzureOpenAI API key necessary for invoking AzureOpenAI models. For example, the following image shows environment variables being set for the OpenAI API type, key, base, and version.
:::image type="content" source="./media/how-to-integrate-with-langchain/langchain-env-variables.png" alt-text="Screenshot of Azure OpenAI example in LangChain.":::
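For reference, local LangChain code along these lines might set those environment variables before creating the model. This is a minimal sketch; the values shown are placeholders, and the API version string is an assumed example:

```python
import os

# Placeholder values: substitute your own Azure OpenAI details.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_KEY"] = "<your-azure-openai-api-key>"
os.environ["OPENAI_API_BASE"] = "https://<your-resource>.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2023-05-15"
```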
When you run an Azure Machine Learning prompt flow in the cloud, it's better not to expose credentials as environment variables. To securely store and manage credentials separately from your code, you should convert the environmental variables into a prompt flow connection.
To create a connection that securely stores credentials such as your LLM API key or other required keys, follow these instructions:
1. On the **Prompt flow** page in your Azure Machine Learning workspace, select the **Connections** tab, and then select **Create**.
1. Select a connection type from the dropdown list. For this example, select **Custom**.
   :::image type="content" source="./media/how-to-integrate-with-langchain/custom-connection-2.png" alt-text="Screenshot of adding custom connection key-value pairs.":::
1. To store an encrypted value for a key, select the **is secret** checkbox next to one or more key-value pairs. At least one value must be set as secret for the connection creation to succeed.
1. Select **Save**.
The custom connection can replace the keys, credentials, and corresponding environment variables explicitly defined in your LangChain code. To use the custom connection in the flow, see [Configure connection](#configure-connection).
### Convert LangChain code to a runnable flow
To create a flow, select **Create** on the **Prompt flow** page in Azure Machine Learning studio, and choose a flow type. On the flow authoring page, start your compute session before you author the flow. Select tool types at the top of the pane to insert corresponding nodes into the flow. For detailed flow authoring instructions, see [Develop prompt flow](how-to-develop-a-standard-flow.md).
All your LangChain code can directly run in Python nodes in your flow, as long as your compute session contains the `langchain` package dependency.
There are two ways to convert your LangChain code into a flow:
- For a simple conversion process, you can initialize and invoke the LLM model within a Python node by using the integrated LangChain LLM library.
- For better experiment management, you can convert the LLM model to use Azure Machine Learning Python LLM tools in the flow.
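As a minimal sketch of the first approach, a Python node can initialize and invoke the model directly through the LangChain Azure OpenAI integration. This sketch assumes the `langchain-openai` package is listed in *requirements.txt*; the connection key names (`api_base`, `api_key`), the deployment name, and the API version are hypothetical placeholders to replace with your own values:

```python
from langchain_openai import AzureChatOpenAI
from promptflow import tool
from promptflow.connections import CustomConnection


@tool
def chat(question: str, conn: CustomConnection) -> str:
    # Hypothetical key names: use the keys you defined in your custom
    # connection, plus your own deployment name and API version.
    llm = AzureChatOpenAI(
        azure_endpoint=conn.api_base,
        api_key=conn.api_key,
        api_version="2024-02-01",
        azure_deployment="gpt-35-turbo",
    )
    # Invoke the model and return the text of its reply.
    return llm.invoke(question).content
```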
The type of flow to implement depends on your use case.
| Flow type | Implementation | Use case |
|-------|--------|--------|
| A flow that includes both prompt nodes and Python nodes | Extract your prompt template into a prompt node, and combine the remaining code in single or multiple Python nodes or tools. | Easy prompt tuning by running flow variants to choose the optimal prompt based on evaluation results. |
| A flow that includes Python nodes only | Create a new flow with Python nodes only. All code runs in Python nodes, including prompt definitions. | Faster batch testing based on larger scale datasets. |
The following example shows a flow that uses both prompt nodes and Python nodes:
:::image type="content" source="./media/how-to-integrate-with-langchain/flow-node-a-1.png" alt-text="Screenshot of flows highlighting the prompt button and system template. " lightbox = "./media/how-to-integrate-with-langchain/flow-node-a-1.png":::
:::image type="content" source="./media/how-to-integrate-with-langchain/flow-node-a-2.png" alt-text="Screenshot of system template showing variant one and zero with the finish tuning button highlighted. " lightbox = "./media/how-to-integrate-with-langchain/flow-node-a-2.png":::
The following example shows a flow that uses Python nodes only:
:::image type="content" source="./media/how-to-integrate-with-langchain/flow-node-b.png" alt-text="Screenshot of flows showing the LangChain code node and graph. " lightbox = "./media/how-to-integrate-with-langchain/flow-node-b.png":::
### Configure connection
After you structure your flow and move your code to specific tool nodes, you need to replace your original environment variables with the corresponding keys from your connection. To use the custom connection you created, follow these steps:
1. In your Python code, import the custom connection library by entering `from promptflow.connections import CustomConnection`.

   > [!NOTE]
   > For an Azure OpenAI connection, use `from promptflow.connections import AzureOpenAIConnection`.

1. In your tool function, define an input parameter of the type `CustomConnection`.
   :::image type="content" source="./media/how-to-integrate-with-langchain/custom-connection-python-node-1.png" alt-text="Screenshot of doc search chain node highlighting the custom connection. " lightbox = "./media/how-to-integrate-with-langchain/custom-connection-python-node-1.png":::
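Putting these steps together, a tool function can read credentials from the connection object wherever the original LangChain code read environment variables. This is a sketch, not the definitive pattern; the key names `api_base` and `api_key` are hypothetical and must match the key-value pairs you defined in your custom connection:

```python
from promptflow import tool
from promptflow.connections import CustomConnection


@tool
def my_tool(question: str, conn: CustomConnection) -> str:
    # Read configuration and secrets from the connection rather than
    # from os.environ. Key names below are hypothetical examples.
    api_base = conn.api_base  # non-secret key-value pair
    api_key = conn.api_key    # key-value pair marked "is secret"
    # ... initialize and run your LangChain chain with these values ...
    return question  # placeholder: return your chain's output instead
```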
If you have LangChain code that consumes an AzureOpenAI model, import the library …
:::image type="content" source="./media/how-to-integrate-with-langchain/code-consume-aoai.png" alt-text="Screenshot of LangChain code in prompt flow. " lightbox = "./media/how-to-integrate-with-langchain/code-consume-aoai.png":::
### Configure input and output
Before you run the flow, configure the node inputs and outputs and the overall flow inputs and outputs. This step is crucial to ensure that all the required data passes properly through the flow and produces desired results. For more information, see [Flow inputs and outputs](how-to-develop-flow.md#flow-input-and-output).