1 parent 528783c commit eef895c
README.md
@@ -17,7 +17,7 @@ LLM inference in C/C++
 
 ## Hot topics
 
-- [Hot PRs](https://github.com/ggml-org/llama.cpp/pulls?q=is%3Apr+label%3Ahot+)
+- Hot PRs: [All](https://github.com/ggml-org/llama.cpp/pulls?q=is%3Apr+label%3Ahot+) / [Open](https://github.com/ggml-org/llama.cpp/pulls?q=is%3Apr+label%3Ahot+is%3Aopen)
 - Multimodal support arrived in `llama-server`: [#12898](https://github.com/ggml-org/llama.cpp/pull/12898) | [documentation](./docs/multimodal.md)
 - VS Code extension for FIM completions: https://github.com/ggml-org/llama.vscode
 - Vim/Neovim plugin for FIM completions: https://github.com/ggml-org/llama.vim
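The changed README line links to GitHub's issue search with the qualifiers `is:pr label:hot` (plus `is:open` for the second link), URL-encoded into the `q` query parameter. A minimal sketch of how such a URL can be assembled with Python's standard library; `hot_pr_url` is a hypothetical helper, not part of the repository:

```python
from urllib.parse import urlencode

def hot_pr_url(open_only: bool = False) -> str:
    # Hypothetical helper: build the GitHub search URL used in the README link.
    # Spaces between qualifiers become '+', ':' becomes '%3A' via urlencode.
    qualifiers = "is:pr label:hot" + (" is:open" if open_only else "")
    base = "https://github.com/ggml-org/llama.cpp/pulls"
    return base + "?" + urlencode({"q": qualifiers})

print(hot_pr_url())
print(hot_pr_url(open_only=True))
```

`urlencode` uses `quote_plus` by default, which is why the qualifiers in the README's link appear as `is%3Apr+label%3Ahot` rather than with literal spaces and colons.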