A simple web-based interface for chatting with local LLMs using Ollama.
- Clone the repository.
- Install dependencies: `pip install -r requirements.txt`
- Run Ollama locally: `ollama run llama3`
- Start the server: `python app.py`
- The app assumes Ollama is available at `http://localhost:11434`.
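
The app's own source isn't shown here, but a minimal call to Ollama's HTTP API at that address looks roughly like this sketch. The function names (`build_payload`, `generate`) are illustrative, not part of this repository; the endpoint and request shape follow Ollama's standard `/api/generate` API:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama endpoint assumed above


def build_payload(prompt, model="llama3"):
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt, model="llama3"):
    """Send a prompt to a locally running Ollama server and return the response text."""
    body = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the full completion in the "response" field.
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires `ollama run llama3` to be active locally.
    print(generate("Why is the sky blue?"))
```

If the server is reachable, running the snippet prints a completion from the model named in the payload.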