Commit 78a267e

Merge pull request #476 from Adolfi/patch-1
Updated AI Toolkit for Visual Studio Code with instructions.
2 parents: 713eac3 + eb5c224

File tree

1 file changed: +13 −1 lines changed


articles/ai-foundry/foundry-local/concepts/foundry-local-architecture.md

Lines changed: 13 additions & 1 deletion
@@ -122,11 +122,23 @@ Foundry Local supports integration with various SDKs in most languages, such as
 The AI Toolkit for Visual Studio Code provides a user-friendly interface for developers to interact with Foundry Local. It allows users to run models, manage the local cache, and visualize results directly within the IDE.
 
-- **Features**:
+**Features**:
 - Model management: Download, load, and run models from within the IDE.
 - Interactive console: Send requests and view responses in real time.
 - Visualization tools: Graphical representation of model performance and results.
 
+**Prerequisites:**
+- You have installed [Foundry Local](../get-started.md) and have a model service running.
+- You have installed the [AI Toolkit for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=ms-windows-ai-studio.windows-ai-studio) extension.
+
+**Connect Foundry Local model to AI Toolkit:**
+1. **Add model in AI Toolkit**: Open AI Toolkit from the activity bar of Visual Studio Code. In the 'My Models' panel, click the 'Add model for remote interface' button and then select 'Add a custom model' from the dropdown menu.
+2. **Enter the chat-compatible endpoint URL**: Enter `http://localhost:PORT/v1/chat/completions`, where PORT is replaced with the port number of your Foundry Local service endpoint. You can see the port of your locally running service with the CLI command `foundry service status`. Foundry Local dynamically assigns a port, so it might not always be the same.
+3. **Provide model name**: Enter the exact model name you wish to use from Foundry Local, for example `phi-3.5-mini`. You can list all previously downloaded and locally cached models with the CLI command `foundry cache list`, or use `foundry model list` to see all models available for local use. You'll also be asked to enter a display name, which is only for your own local use; to avoid confusion, it's recommended to enter the same name as the exact model name.
+4. **Authentication**: If your local setup doesn't require authentication *(the default for a Foundry Local setup)*, leave the authentication headers field blank and press Enter.
+
+After completing these steps, your Foundry Local model appears in the 'My Models' list in AI Toolkit and is ready to use: right-click the model and select 'Load in Playground'.
 
 ## Next Steps
 
 - [Get started with Foundry Local](../get-started.md)
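The endpoint configured in step 2 of the added instructions is an OpenAI-style chat-completions API, so it can be sanity-checked outside the IDE. Below is a minimal Python sketch of the same request the AI Toolkit sends. The port `5273` and the cached model `phi-3.5-mini` are assumptions for illustration; confirm your actual values with `foundry service status` and `foundry cache list`.

```python
import json
from urllib import request

# Assumptions: replace PORT with the value reported by `foundry service status`
# (Foundry Local assigns it dynamically) and MODEL with a locally cached model.
PORT = 5273
MODEL = "phi-3.5-mini"
URL = f"http://localhost:{PORT}/v1/chat/completions"

# Request body in the OpenAI chat-completions shape used by the endpoint.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}

req = request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to send against a running Foundry Local service; the reply text
# is under choices[0].message.content in the JSON response.
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])

print(req.full_url)
```

If the request fails to connect, the service is likely running on a different port; re-check it with `foundry service status` before adjusting the AI Toolkit endpoint.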
