Chat with AI with project context

Required servers

  • Chat server
  • Embeddings server
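
Both are typically local llama.cpp llama-server instances. The lines below are only a minimal sketch: the model files and ports are placeholders, not defaults of the extension, and the llama.vscode endpoint settings must point to whichever ports you actually use.

    # Chat server (any instruct/chat model; the file name here is just an example)
    llama-server -m qwen2.5-coder-7b-instruct-q8_0.gguf --port 8080

    # Embeddings server (any embedding model; the file name here is just an example)
    llama-server -m nomic-embed-text-v1.5.Q8_0.gguf --port 8090 --embedding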

How to use it

This is a conversation with the local AI. It uses information from the project, so it is slower than Chat with AI, but it can answer questions related to the project.

  • Press Ctrl+Shift+; inside an editor (or select Chat with AI with project context from the llama.vscode menu)
  • Enter your question
  • llama-vscode collects relevant context information from the project and sends it to the AI together with your question
  • Project context information is sent to the AI only if the question is entered with Ctrl+Shift+;. If the question is written directly in the chat window, no new context information is sent to the AI.
  • If the AI answers too slowly, close the VS Code chat window and open a new one with Ctrl+Shift+;
  • Press Esc if you want to return from the chat to the editor

It is possible to configure the rag_* settings to adjust the RAG search to your models and hardware resources.
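
As a rough illustration only: these are regular VS Code settings, so they can also be set in settings.json. The key names below are hypothetical placeholders with an assumed "llama-vscode." prefix; the actual rag_* keys and their defaults are listed in the extension's settings UI.

    // settings.json (User or Workspace) - placeholder keys, shown only to illustrate the idea
    {
        "llama-vscode.rag_max_chunks": 30,         // fewer retrieved chunks -> faster, less context
        "llama-vscode.rag_chunk_max_chars": 2000   // smaller chunks -> lighter embedding load
    }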
