Commit 2ada160

Merge pull request #268086 from ChenJieting/jieting/update-open-model-llm-tool-doc
small updates
2 parents 4376eb9 + aa26692 commit 2ada160


articles/machine-learning/prompt-flow/tools-reference/open-model-llm-tool.md

Lines changed: 5 additions & 7 deletions
@@ -31,7 +31,7 @@ This prompt flow tool supports two different LLM API types:
 1. Choose a model from the Azure Machine Learning Model Catalog and get it deployed.
 2. Connect to the model deployment.
 3. Configure the open model llm tool settings.
-4. [Prepare the prompt](./prompt-tool.md#write-a-prompt).
+4. [Prepare the prompt](https://microsoft.github.io/promptflow/reference/tools-reference/prompt-tool.html#how-to-write-prompt).
 5. Run the flow.

 ## Prerequisites: Model deployment
@@ -44,7 +44,7 @@ To learn more, see [Deploy foundation models to endpoints for inferencing](../..

 ## Prerequisites: Connect to the model

-In order for prompt flow to use your deployed model, you need to connect to it. There are several ways to connect.
+In order for prompt flow to use your deployed model, you need to connect to it. There are two ways to connect.

 ### Endpoint connections
@@ -58,11 +58,9 @@ Once your flow is associated to an Azure Machine Learning or Azure AI Studio wor

 The Open Model LLM tool uses the CustomConnection. Prompt flow supports two types of connections:

-- **Workspace connections** - Connections that are stored as secrets on an Azure Machine Learning workspace. While these connections can be used in many places, they are commonly created and maintained in the Studio UI.
+- **Workspace connections** - Connections that are stored as secrets on an Azure Machine Learning workspace. While these connections can be used in many places, they are commonly created and maintained in the Studio UI. To learn how to create a custom connection in Studio UI, see [how to create a custom connection](./python-tool.md#create-a-custom-connection).

-- **Local connections** - Connections that are stored locally on your machine. These connections aren't available in the Studio UX, but can be used with the VS Code extension.
-
-To learn how to create a workspace or local Custom Connection, see [Create a connection](https://microsoft.github.io/promptflow/how-to-guides/manage-connections.html#create-a-connection).
+- **Local connections** - Connections that are stored locally on your machine. These connections aren't available in the Studio UX, but can be used with the VS Code extension. To learn how to create a local Custom Connection, see [how to create a local connection](https://microsoft.github.io/promptflow/how-to-guides/manage-connections.html#create-a-connection).

 The required keys to set are:
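As context for the hunk above: a local CustomConnection for prompt flow is typically defined in YAML and registered with `pf connection create -f <file>`. The following is a hypothetical sketch only; the connection name and the keys under `configs`/`secrets` are placeholders, not the required keys this tool actually expects (those are listed elsewhere in the doc):

```yaml
# Hypothetical local custom connection definition; register with:
#   pf connection create -f custom_connection.yaml
# The keys below are placeholders -- substitute the required keys
# documented for the Open Model LLM tool.
name: my_open_model_connection   # placeholder name
type: custom
configs:
  example_config_key: <value>          # placeholder, non-secret setting
secrets:
  example_api_key: <to-be-replaced>    # placeholder, stored as a secret
```

Values under `secrets` are stored encrypted, while `configs` entries are plain settings; keeping endpoint keys under `secrets` is the reason the tool uses a CustomConnection rather than inline parameters.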
@@ -82,7 +80,7 @@ The Open Model LLM tool has many parameters, some of which are required. See the
 | Name | Type | Description | Required |
 |------|------|-------------|----------|
 | api | string | The API mode that depends on the model used and the scenario selected. *Supported values: (Completion \| Chat)* | Yes |
-| endpoint_name | string | Name of an Online Inferencing Endpoint with a supported model deployed on it. Takes priority over connection. | No |
+| endpoint_name | string | Name of an Online Inferencing Endpoint with a supported model deployed on it. Takes priority over connection. | Yes |
 | temperature | float | The randomness of the generated text. Default is 1. | No |
 | max_new_tokens | integer | The maximum number of tokens to generate in the completion. Default is 500. | No |
 | top_p | float | The probability of using the top choice from the generated tokens. Default is 1. | No |
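Taken together, the table rows in this hunk imply a small parameter contract: `api` and (after this commit) `endpoint_name` are required, and the sampling parameters carry documented defaults. The helper below is a hypothetical sketch of that contract, not part of the promptflow SDK; the function name and validation logic are assumptions drawn only from the table:

```python
# Hypothetical sketch of the Open Model LLM tool's parameter contract,
# using the defaults documented in the table (temperature=1,
# max_new_tokens=500, top_p=1). Not the promptflow implementation.
def build_params(api, endpoint_name, temperature=1.0, max_new_tokens=500, top_p=1.0):
    if api not in ("Completion", "Chat"):
        raise ValueError("api must be 'Completion' or 'Chat'")
    if not endpoint_name:
        raise ValueError("endpoint_name is required")  # required per the updated table
    return {
        "api": api,
        "endpoint_name": endpoint_name,
        "temperature": temperature,
        "max_new_tokens": max_new_tokens,
        "top_p": top_p,
    }

params = build_params("Chat", "my-llama-endpoint")
print(params["max_new_tokens"])  # 500
```

Flipping `endpoint_name` from optional to required (the `No` → `Yes` change above) is what this sketch encodes with the explicit `ValueError`.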
