A plugin for the LLM tool. When installed, running `llm webchat` starts a local web server. Visiting its address in your browser lets you browse the conversations in your LLM database and chat with supported language models.

The appearance and functionality of llm-webchat are highly customizable: styles, frontend behavior, and even the backend logic. For more details, see the high-level architecture docs or the extension docs.
Install the plugin and start the server:

```shell
llm install llm-webchat
llm webchat
```

LLM Webchat recognizes the following environment variables (all of which are optional):

- `LLM_WEBCHAT_CONVERSATION_IDS`: a comma-separated whitelist of conversation IDs.
- `LLM_WEBCHAT_JAVASCRIPT_PLUGINS`: a comma-separated list of `.js` files containing frontend plugins (see the extension docs for more details).
- `LLM_WEBCHAT_STATIC_PATHS`: a comma-separated list of additional file paths that should be served from `/plugins`.
- `LLM_WEBCHAT_HOST`: the host address the server binds to. Defaults to `127.0.0.1`.
- `LLM_WEBCHAT_PORT`: the port the server listens on. Defaults to `8000`.
- `LLM_WEBCHAT_TOOL_CHAIN_LIMIT`: the maximum number of chained tool responses allowed per message. Defaults to `20`. Set to `0` for unlimited.
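For example, to restrict the UI to specific conversations and serve on a non-default port (the conversation IDs below are placeholders, not real values):

```shell
# Placeholder conversation IDs -- substitute IDs from your own llm database
export LLM_WEBCHAT_CONVERSATION_IDS="01abc,01def"
export LLM_WEBCHAT_PORT=8080
llm webchat
```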
Development requires:

- A reasonably recent version of Python with uv
- A reasonably recent version of Node
Build the frontend (output goes to `src/llm_webchat/static/`):

```shell
cd frontend
npm install
npm run build
```

Run the backend (serves the built frontend at `/`):

```shell
uv run llm-webchat
```

For frontend development with hot reload (proxies `/api` requests to the backend):
```shell
cd frontend
npm run dev
```

Run the backend tests:

```shell
uv run pytest
```

Run the frontend tests:

```shell
cd frontend
npm test
```
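If the frontend dev server happens to be Vite (an assumption, not confirmed here), the `/api` proxy used during hot-reload development would typically be declared in `vite.config.ts` along these lines:

```typescript
// vite.config.ts -- hypothetical sketch; the project's actual config may differ
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    proxy: {
      // Forward /api requests to the llm-webchat backend (default 127.0.0.1:8000)
      "/api": "http://127.0.0.1:8000",
    },
  },
});
```

With a proxy like this, the browser talks only to the dev server, which transparently forwards API calls to the running backend.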
