articles/machine-learning/prompt-flow/tools-reference/open-model-llm-tool.md (5 additions, 7 deletions)
@@ -31,7 +31,7 @@ This prompt flow tool supports two different LLM API types:
 1. Choose a model from the Azure Machine Learning Model Catalog and get it deployed.
 2. Connect to the model deployment.
 3. Configure the open model llm tool settings.
-4.[Prepare the prompt](./prompt-tool.md#write-a-prompt).
+4. [Prepare the prompt](https://microsoft.github.io/promptflow/reference/tools-reference/prompt-tool.html#how-to-write-prompt).
 5. Run the flow.
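The five steps above end with prompt flow calling the deployed endpoint on your behalf. As a rough, hypothetical illustration of the request the tool ultimately sends to an Online Inferencing Endpoint (the payload shape, key names, and URI here are assumptions for illustration, not the tool's documented contract):

```python
import json
import urllib.request

def build_scoring_request(scoring_uri: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a Completion-style scoring request.

    The body layout and key names are illustrative guesses; the Open Model
    LLM tool constructs the real request for you.
    """
    body = json.dumps({"input_data": {"input_string": [prompt]}}).encode("utf-8")
    request = urllib.request.Request(scoring_uri, data=body, method="POST")
    request.add_header("Content-Type", "application/json")
    # Online endpoints authenticate with the endpoint's key as a bearer token.
    request.add_header("Authorization", f"Bearer {api_key}")
    return request
```

Sending the request with `urllib.request.urlopen` would return the model's completion; the tool wraps this exchange and exposes only the parameters documented later in this article.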
 ## Prerequisites: Model deployment
@@ -44,7 +44,7 @@ To learn more, see [Deploy foundation models to endpoints for inferencing](../..
 ## Prerequisites: Connect to the model

-In order for prompt flow to use your deployed model, you need to connect to it. There are several ways to connect.
+In order for prompt flow to use your deployed model, you need to connect to it. There are two ways to connect.

 ### Endpoint connections
@@ -58,11 +58,9 @@ Once your flow is associated to an Azure Machine Learning or Azure AI Studio wor
 The Open Model LLM tool uses the CustomConnection. Prompt flow supports two types of connections:

-**Workspace connections** - Connections that are stored as secrets on an Azure Machine Learning workspace. While these connections can be used, in many places, the are commonly created and maintained in the Studio UI.
+**Workspace connections** - Connections that are stored as secrets on an Azure Machine Learning workspace. While these connections can be used in many places, they are commonly created and maintained in the Studio UI. To learn how to create a custom connection in Studio UI, see [how to create a custom connection](./python-tool.md#create-a-custom-connection).

-**Local connections** - Connections that are stored locally on your machine. These connections aren't available in the Studio UX, but can be used with the VS Code extension.
-
-To learn how to create a workspace or local Custom Connection, see [Create a connection](https://microsoft.github.io/promptflow/how-to-guides/manage-connections.html#create-a-connection).
+**Local connections** - Connections that are stored locally on your machine. These connections aren't available in the Studio UX, but can be used with the VS Code extension. To learn how to create a local Custom Connection, see [how to create a local connection](https://microsoft.github.io/promptflow/how-to-guides/manage-connections.html#create-a-connection).
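Whichever kind of CustomConnection you create, its contents split into non-secret configs and encrypted secrets. A minimal sketch of that shape in plain Python (the key names `endpoint_url` and `api_key` are illustrative placeholders, not the tool's documented required keys):

```python
def build_custom_connection(name: str, endpoint_url: str, api_key: str) -> dict:
    """Model the two buckets of a CustomConnection as plain dicts.

    Prompt flow stores 'secrets' encrypted (in the workspace or locally),
    while 'configs' holds non-sensitive metadata.
    """
    return {
        "name": name,
        "type": "custom",
        "configs": {"endpoint_url": endpoint_url},  # safe to show in the UI
        "secrets": {"api_key": api_key},            # kept out of flow files
    }

connection = build_custom_connection(
    "my-model-connection",                    # hypothetical connection name
    "https://my-endpoint.example.com/score",  # hypothetical scoring URI
    "<api-key>",
)
```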
 The required keys to set are:
@@ -82,7 +80,7 @@ The Open Model LLM tool has many parameters, some of which are required. See the
 | Name | Type | Description | Required |
 |------|------|-------------|----------|
 | api | string | The API mode that depends on the model used and the scenario selected. *Supported values: (Completion \| Chat)* | Yes |
-| endpoint_name | string | Name of an Online Inferencing Endpoint with a supported model deployed on it. Takes priority over connection. |No|
+| endpoint_name | string | Name of an Online Inferencing Endpoint with a supported model deployed on it. Takes priority over connection. | Yes |
 | temperature | float | The randomness of the generated text. Default is 1. | No |
 | max_new_tokens | integer | The maximum number of tokens to generate in the completion. Default is 500. | No |
 | top_p | float | The probability of using the top choice from the generated tokens. Default is 1. | No |
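The optional sampling parameters in the table each have a default. A small sketch of merging caller overrides with those defaults (the helper itself is hypothetical, but the default values come from the table above):

```python
# Defaults from the parameter table: temperature 1, max_new_tokens 500, top_p 1.
SAMPLING_DEFAULTS = {"temperature": 1.0, "max_new_tokens": 500, "top_p": 1.0}

def sampling_params(**overrides):
    """Merge user overrides into the documented defaults, rejecting unknown names."""
    unknown = set(overrides) - set(SAMPLING_DEFAULTS)
    if unknown:
        raise KeyError(f"unknown parameter(s): {sorted(unknown)}")
    return {**SAMPLING_DEFAULTS, **overrides}
```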