
Commit 26cb075

docs(AI): Connect Telerik MCP server to local LLM
1 parent: feebb9a

1 file changed (+4, -0)

ai/mcp-server.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -171,6 +171,10 @@ The following list describes how your prompts may look like:
 
 @[template](/_contentTemplates/common/ai-coding-assistant.md#number-of-requests)
 
+## Connect with Local AI Model
+
+You can use the Telerik Blazor MCP server with local large language models (LLMs). For example, run your local model through [Ollama](https://ollama.com) and use a third-party package such as [MCP-LLM Bridge](https://github.com/patruff/ollama-mcp-bridge) to connect the model to the Telerik MCP server. This allows you to use the Telerik AI Coding Assistant without a cloud-based AI model.
+
 ## See Also
 
 * [Telerik Blazor extension for GitHub Copilot](slug:ai-copilot-extension)
```
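The added paragraph describes the bridge setup only at a high level. As a rough illustration (not part of the commit), a bridge configuration in the `mcpServers` JSON style common to MCP clients might look like the sketch below. The package name `@progress/telerik-blazor-mcp`, the `TELERIK_LICENSE_PATH` variable, the Ollama model name, and the field layout are all assumptions; check the MCP-LLM Bridge README for the exact schema it expects.

```jsonc
{
  // Local model served by Ollama (default Ollama endpoint; model name is illustrative).
  "llm": {
    "provider": "ollama",
    "model": "llama3.1",
    "baseUrl": "http://localhost:11434"
  },
  // MCP servers the bridge should launch and expose to the local model.
  // Package name and env variable are assumptions, not confirmed by this commit.
  "mcpServers": {
    "telerik-blazor": {
      "command": "npx",
      "args": ["-y", "@progress/telerik-blazor-mcp"],
      "env": {
        "TELERIK_LICENSE_PATH": "C:\\Users\\you\\telerik-license.txt"
      }
    }
  }
}
```

Once the bridge launches the server and connects it to the local model, prompts to that model can trigger Telerik MCP tool calls the same way they would with a cloud-hosted assistant.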
