Commit c2cbeeb

Update ai/mcp-server.md

dimodi and yordan-mitev authored
Co-authored-by: Yordan <[email protected]>
1 parent 26cb075 commit c2cbeeb

File tree

1 file changed: +1 -1 lines changed

ai/mcp-server.md

Lines changed: 1 addition & 1 deletion
@@ -171,7 +171,7 @@ The following list describes how your prompts may look like:
 
 @[template](/_contentTemplates/common/ai-coding-assistant.md#number-of-requests)
 
-## Connect with Local AI Model
+## Connect to Local AI Model
 
 You can use the Telerik Blazor MCP server with local large language models (LLM). For example, run your local model through [Ollama](https://ollama.com) and use a third-party package such as [MCP-LLM Bridge](https://github.com/patruff/ollama-mcp-bridge) to connect the model to the Telerik MCP server. This will allow you to use the Telerik AI Coding Assistant without a cloud-based AI model.
 
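The paragraph changed in this diff points to [MCP-LLM Bridge](https://github.com/patruff/ollama-mcp-bridge) for wiring an Ollama-hosted model to an MCP server. As a rough, non-authoritative sketch, such bridges are typically driven by a JSON configuration that names the MCP server launch command and the local model endpoint. Every field name below, and the package name used for the Telerik MCP server, is an illustrative assumption, not taken from either project's documentation:

```jsonc
// Hypothetical bridge_config.json -- all field names and the
// @progress/telerik-blazor-mcp package name are illustrative assumptions;
// check the bridge project's README for the real schema.
{
  "mcpServers": {
    "telerik-blazor": {
      "command": "npx",
      "args": ["-y", "@progress/telerik-blazor-mcp"]
    }
  },
  "llm": {
    // Ollama's default local API endpoint.
    "baseUrl": "http://localhost:11434",
    "model": "llama3.1"
  }
}
```

With a configuration along these lines, the bridge process would launch the Telerik MCP server and route tool calls from the locally served model to it, keeping the whole workflow off cloud-based AI models as the paragraph describes.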
