A modern, multi-model AI chat application built with Next.js, React, and Tailwind CSS. This project provides a beautiful chat interface for interacting with local Ollama models, supporting multiple conversations, markdown rendering, and model selection.
- Chat with local Ollama models (Llama 3, Gemma, Mistral, Code Llama, Phi 3, etc.)
- Multiple conversations with dynamic topic naming
- Markdown and code rendering in responses
- Responsive, modern UI with dark mode support
- Model selection per conversation
Clone the repository:

```bash
git clone git@github.com:utshabeb/ullama.git
cd ullama
```

Install the dependencies:

```bash
yarn install
# or
npm install
```

Make sure you have Ollama installed and running locally. You can start a model with:

```bash
ollama run llama3
```

Then start the development server:

```bash
yarn dev
# or
npm run dev
```

Open http://localhost:3000 in your browser to use the app.
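If the app can't reach a model, first confirm that the Ollama server itself is up. Ollama's HTTP API listens on port 11434 by default, and its `/api/tags` endpoint lists the locally installed models. A quick check might look like this (a standalone snippet for illustration, not part of this repo):

```ts
// Hypothetical connectivity check, not part of the repo. GET /api/tags on
// Ollama's default port returns the locally installed models.
async function checkOllama(): Promise<void> {
  const res = await fetch('http://localhost:11434/api/tags');
  if (!res.ok) throw new Error(`Ollama not reachable (HTTP ${res.status})`);
  const { models } = (await res.json()) as { models: { name: string }[] };
  console.log('Installed models:', models.map((m) => m.name).join(', '));
}

checkOllama().catch(console.error);
```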
The default model for new conversations is set in `src/app/page.tsx`:

```ts
const [lastChosenModel, setLastChosenModel] = useState('llama3.2:latest');
```

To change the default, update the string (e.g., to 'mistral', 'gemma', etc.). You can also edit the `<select>` options in the sidebar to add or remove available models.
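For illustration, the sidebar selector might look roughly like this (a hypothetical sketch; the actual markup and state handling in `src/app/page.tsx` may differ):

```tsx
// Hypothetical sketch of the model selector. Any model listed here must
// already be pulled in Ollama (e.g. `ollama pull mistral`) to work.
'use client';
import { useState } from 'react';

export function ModelSelect() {
  const [lastChosenModel, setLastChosenModel] = useState('llama3.2:latest');
  return (
    <select
      value={lastChosenModel}
      onChange={(e) => setLastChosenModel(e.target.value)}
    >
      <option value="llama3.2:latest">Llama 3.2</option>
      <option value="mistral">Mistral</option>
      <option value="gemma">Gemma</option>
    </select>
  );
}
```

Adding or removing an `<option>` is all that's needed to change the models offered in the UI.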
- `src/app/page.tsx` – Main chat UI and logic
- `src/app/layout.tsx` – App layout and metadata
- `src/app/api/ollama/route.ts` – API route for communicating with Ollama (sketched below)
- `public/` – Static assets
- `tailwind.config.ts` – Tailwind CSS configuration
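The API route is the bridge between the UI and the local Ollama server. As a rough illustration of what such a handler can look like (a sketch, not this repo's actual code; it assumes Ollama's default port 11434 and its documented `/api/chat` endpoint):

```ts
// src/app/api/ollama/route.ts: illustrative sketch only; the real handler
// in this repo may differ (e.g. it may stream tokens instead).
import { NextResponse } from 'next/server';

export async function POST(req: Request) {
  const { model, messages } = await req.json();

  // Forward the chat request to the local Ollama server.
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, messages, stream: false }),
  });

  if (!res.ok) {
    return NextResponse.json(
      { error: 'Ollama request failed' },
      { status: res.status },
    );
  }

  // With stream: false, Ollama returns a single JSON object whose
  // message.content field holds the assistant's reply.
  return NextResponse.json(await res.json());
}
```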
Licensed under the MIT License.