
LLM Webchat


A plugin for the LLM tool. When installed, running llm webchat starts a local web server. Visiting its address in a browser lets you browse the conversations in your llm database and chat with supported language models.

The appearance and functionality of llm-webchat are highly customizable: styles, frontend behavior, and even backend logic.

For more details, see the high-level architecture docs or the extension docs.

Quickstart

llm install llm-webchat
llm webchat

Screenshot

Screenshot of LLM Webchat in a browser

Configuration

LLM Webchat recognizes the following environment variables (all of which are optional):

  • LLM_WEBCHAT_CONVERSATION_IDS: a comma-separated whitelist of conversation IDs; when set, only these conversations are shown.
  • LLM_WEBCHAT_JAVASCRIPT_PLUGINS: a comma-separated list of .js files containing frontend plugins (see the extension docs for more details).
  • LLM_WEBCHAT_STATIC_PATHS: a comma-separated list of additional file paths that should be served from /plugins.
  • LLM_WEBCHAT_HOST: the host address the server binds to. Defaults to 127.0.0.1.
  • LLM_WEBCHAT_PORT: the port the server listens on. Defaults to 8000.
  • LLM_WEBCHAT_TOOL_CHAIN_LIMIT: maximum number of chained tool responses allowed per message. Defaults to 20. Set to 0 for unlimited.
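Several of these variables hold comma-separated lists. As a rough sketch of how such a value might be split into individual entries (this is an illustration, not the actual llm-webchat implementation — the function name parse_csv_env is hypothetical):

```python
import os


def parse_csv_env(name: str) -> list[str]:
    """Split a comma-separated environment variable into a list,
    trimming whitespace and dropping empty entries."""
    raw = os.environ.get(name, "")
    return [item.strip() for item in raw.split(",") if item.strip()]


# The conversation IDs below are placeholders, not real IDs.
os.environ["LLM_WEBCHAT_CONVERSATION_IDS"] = "conv-a, conv-b"
print(parse_csv_env("LLM_WEBCHAT_CONVERSATION_IDS"))  # ['conv-a', 'conv-b']
```

An unset variable simply yields an empty list, which matches the "all optional" behavior described above.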

Development

Prerequisites

  • A reasonably recent version of Python with uv
  • A reasonably recent version of Node

Building and running

Build the frontend (output goes to src/llm_webchat/static/):

cd frontend
npm install
npm run build

Run the backend (serves the built frontend at /):

uv run llm-webchat

For frontend development with hot reload (proxies /api requests to the backend):

cd frontend
npm run dev

Running tests

uv run pytest
cd frontend
npm test