Merged
4 changes: 4 additions & 0 deletions ai/mcp-server.md
@@ -171,6 +171,10 @@ The following list describes what your prompts may look like:

@[template](/_contentTemplates/common/ai-coding-assistant.md#number-of-requests)

## Connect with Local AI Model

You can use the Telerik Blazor MCP server with local large language models (LLMs). For example, run your local model through [Ollama](https://ollama.com) and use a third-party package such as [MCP-LLM Bridge](https://github.com/patruff/ollama-mcp-bridge) to connect the model to the Telerik MCP server. This approach lets you use the Telerik AI Coding Assistant without a cloud-based AI model.
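As a rough illustration, many MCP bridges and clients read a JSON configuration that declares the MCP servers to launch. The sketch below is hypothetical: the exact key names depend on the bridge you use, and the npm package name and license variable shown here are assumptions you should verify against the Telerik MCP server documentation.

```json
{
  "mcpServers": {
    "telerik-blazor-assistant": {
      "command": "npx",
      "args": ["-y", "@progress/telerik-blazor-mcp"],
      "env": {
        "TELERIK_LICENSE_PATH": "/path/to/telerik-license.txt"
      }
    }
  }
}
```

With Ollama running locally (for example, after `ollama pull llama3.1` and `ollama serve`), point the bridge at this configuration so that tool calls from the local model are routed to the Telerik MCP server.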

## See Also

* [Telerik Blazor extension for GitHub Copilot](slug:ai-copilot-extension)