Conversation

@matteoserva
Contributor

The Copilot Chat extension checks whether the Ollama version is at least "0.6.4" before loading the models.
This patch adds a new /api/version endpoint that returns the llama.cpp build number.
This allows loading llama.cpp models in Copilot.
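For illustration, a minimal sketch of the kind of minimum-version gate described above (the function names and exact comparison are assumptions, not the extension's actual code):

```python
def parse_version(v: str) -> tuple[int, ...]:
    # Split a dotted version string like "0.6.4" into an integer tuple.
    return tuple(int(part) for part in v.split("."))

def meets_minimum(reported: str, minimum: str = "0.6.4") -> bool:
    # Tuple comparison is element-wise, so (0, 6, 4) >= (0, 6, 4) holds
    # and (0, 5, 9) >= (0, 6, 4) does not.
    return parse_version(reported) >= parse_version(minimum)
```

Under this assumption, the server only needs /api/version to return a dotted version string that compares at or above the threshold for the client to proceed.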

@ggerganov
Member

This is a well-known bug in VS Code, and AFAIK they are still working on fixing it: #15177

@ggerganov ggerganov closed this Oct 23, 2025
@matteoserva matteoserva deleted the ollama_version branch October 23, 2025 08:55
@matteoserva matteoserva restored the ollama_version branch October 23, 2025 09:02