A simple chatbot interface for self-hosted Large Language Models.
- Frontend: Vue.js
- Backend: A self-hosted Ollama instance
- Tunneling: Cloudflare Tunnel
You can run this frontend against your own local Ollama server.
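If you don't have a local Ollama instance yet, getting one running looks roughly like this. This is only a sketch, assuming Ollama is installed from https://ollama.com; `llama3` is just an example model name, so pull whichever model you prefer:

```bash
# Start the Ollama server; by default it listens on http://localhost:11434
ollama serve

# In a second terminal, download a model to chat with (llama3 is an example)
ollama pull llama3
```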
- Clone the repository:

  ```bash
  git clone https://github.com/Odinman9847/ai-chat.git
  cd ai-chat
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Configure your backend URL: modify the file `.env` in the project root and add your Ollama server's URL (a quick way to verify this URL is shown after the list):

  ```bash
  # .env
  VITE_API_URL=http://localhost:11434
  ```

- Run the development server:

  ```bash
  npm run dev
  ```
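Before starting the dev server, it can help to confirm that the URL in `.env` points at a reachable Ollama instance. A minimal check using Ollama's standard REST API (`llama3` is again just an example model name):

```bash
# Returns the server version if Ollama is reachable at this URL
curl http://localhost:11434/api/version

# One-off generation request; substitute the model you actually pulled
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Say hello.", "stream": false}'
```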
The public version of this site connects to a server running on my personal computer. It will be offline most of the time.
For a reliable experience, please follow the instructions above to run the chatbot against your own backend.
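For reference, exposing a local server the way this deployment does can be done with a Cloudflare quick tunnel. This is only a sketch, assuming `cloudflared` is installed; 5173 is Vite's default dev-server port and may differ in your setup:

```bash
# Prints an ephemeral public trycloudflare.com URL forwarding to the local app
cloudflared tunnel --url http://localhost:5173
```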
