articles/ai-studio/how-to/develop/langchain.md
In this tutorial, you learn how to use the package `langchain-azure-ai` to build applications with LangChain.
To run this tutorial, you need:

* An [Azure subscription](https://azure.microsoft.com).
* An Azure AI project as explained at [Create a project in Azure AI Foundry portal](../create-projects.md).
* A model deployment supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference). In this example, we use a `Mistral-Large-2407` deployment in the [Azure AI model inference](../../../ai-foundry/model-inference/overview.md) service.
* Python 3.9 or later installed, including pip.
* LangChain installed. You can do it with:
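The install command itself isn't shown here; a typical one, assuming the `langchain-azure-ai` integration package named above, is:

```bash
pip install -U langchain langchain-azure-ai
```

Consider running this inside a virtual environment to keep the tutorial's dependencies isolated.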
> For Azure OpenAI models, configure the client as indicated at [Using Azure OpenAI models](#using-azure-openai-models).
If your endpoint serves more than one model, as with the [Azure AI model inference service](../../ai-services/model-inference.md) or [GitHub Models](https://github.com/marketplace/models), you have to indicate the `model_name` parameter:
```python
import os
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
```
Let's first use the model directly. `ChatModels` are instances of LangChain `Runnable`, which means they expose a standard interface for interacting with them. To simply call the model, we can pass in a list of messages to the `invoke` method.