A modern, real-time streaming chat application built with a FastAPI (Python) backend and a React TypeScript frontend.
oneline-chat/
├── backend/              # FastAPI Python backend
│   ├── src/              # Source code
│   ├── tests/            # Test files
│   └── requirements.txt
├── ui/                   # React TypeScript frontend
│   ├── src/              # Source code
│   └── package.json
├── docker-compose.yml
└── README.md
- Real-time streaming chat via Server-Sent Events (SSE) (a minimal endpoint sketch follows the feature list)
- AI integration with OpenAI/Ollama support
- PostgreSQL database with SQLModel ORM
- Configurable settings via environment variables
- Comprehensive logging and error handling
- Full test coverage with pytest
- Modern chat interface with real-time streaming
- Dark/light mode support
- Responsive design with Tailwind CSS
- Chat settings (model selection, temperature, etc.)
- Axios HTTP client with interceptors
- Configurable API endpoints via environment variables
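On the backend, the SSE-based streaming listed above boils down to a FastAPI endpoint that yields text/event-stream chunks. Below is a minimal sketch of the idea, assuming a hypothetical ChatRequest schema and a stand-in token generator; the real handler in backend/src will differ.

```python
# Minimal sketch of an SSE streaming endpoint in FastAPI.
# Illustrative only: the request schema and the AI call are stand-ins.
import asyncio
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()


class ChatRequest(BaseModel):
    # Hypothetical request fields for illustration.
    message: str
    model: str = "deepseek-r1:8b"


async def generate_tokens(prompt: str):
    # Stand-in for a call to OpenAI/Ollama; yields tokens one by one.
    for token in ["Hello", ",", " world", "!"]:
        await asyncio.sleep(0.05)
        yield token


@app.post("/api/v1/chat/stream")
async def chat_stream(request: ChatRequest):
    async def event_source():
        async for token in generate_tokens(request.message):
            # Each SSE event is a "data: ..." line followed by a blank line.
            yield f"data: {json.dumps({'content': token})}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_source(), media_type="text/event-stream")
```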
- Node.js 18+
- Python 3.11+
- PostgreSQL 14+
- npm/yarn
cd backend
pip install -r requirements.txt
python -m oneline_chat.main
cd ui
npm install
npm run dev
Backend (.env in /backend):
DB_HOST=localhost
DB_PORT=5432
DB_USER=postgres
DB_PASSWORD=your_password
DB_NAME=oneline_chat_app
AI_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434/v1
OLLAMA_MODEL=deepseek-r1:8b
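The backend loads these variables through Pydantic Settings. Here is a rough sketch of such a settings class, assuming field names that mirror the variables above; the actual class in backend/src may be organized differently.

```python
# Rough sketch of loading the variables above with pydantic-settings.
# The actual settings class in backend/src may differ.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    db_host: str = "localhost"
    db_port: int = 5432
    db_user: str = "postgres"
    db_password: str = ""
    db_name: str = "oneline_chat_app"

    ai_provider: str = "ollama"
    ollama_base_url: str = "http://localhost:11434/v1"
    ollama_model: str = "deepseek-r1:8b"

    @property
    def database_url(self) -> str:
        # Connection string assembled for SQLModel/SQLAlchemy.
        return (
            f"postgresql://{self.db_user}:{self.db_password}"
            f"@{self.db_host}:{self.db_port}/{self.db_name}"
        )


settings = Settings()
```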
Frontend (.env in /ui):
VITE_API_BASE_URL=http://localhost:8000
VITE_API_TIMEOUT=30000
docker-compose up -d
This will start:
- PostgreSQL database
- FastAPI backend
- React frontend
- Nginx reverse proxy
Once the backend is running, visit:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- POST /api/v1/chat/stream - Streaming chat endpoint
- POST /api/v1/chat/completions - Non-streaming chat
- GET /api/v1/chat/history/{chat_id} - Chat history
- GET /api/v1/models - Available models
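For a quick smoke test of the streaming endpoint, a small Python client like the one below can be used. The request body and event payload fields are assumptions; check the Swagger UI at /docs for the actual schema.

```python
# Example client for the streaming endpoint using httpx.
# Request body and event payload fields are assumptions; see /docs.
import httpx

payload = {"message": "Hello!", "model": "deepseek-r1:8b"}

with httpx.stream(
    "POST",
    "http://localhost:8000/api/v1/chat/stream",
    json=payload,
    timeout=30.0,
) as response:
    response.raise_for_status()
    for line in response.iter_lines():
        # SSE events arrive as "data: ..." lines separated by blank lines.
        if line.startswith("data: "):
            data = line[len("data: "):]
            if data == "[DONE]":
                break
            print(data)
```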
cd backend
pip install -r requirements.txt
pytest # Run tests
python -m oneline_chat.main # Start server
cd ui
npm install
npm run dev # Start dev server
npm run build # Build for production
npm run lint # Run linting
- FastAPI - Modern Python web framework
- SQLModel - SQLAlchemy + Pydantic for database ORM
- PostgreSQL - Primary database
- Pydantic Settings - Configuration management
- Uvicorn - ASGI server
- React 18 - UI framework
- TypeScript - Type safety
- Vite - Build tool and dev server
- Tailwind CSS - Utility-first styling
- Axios - HTTP client with interceptors
- REST API for standard operations
- Server-Sent Events (SSE) for real-time streaming
- JSON for data exchange
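As a sketch of how SQLModel ties the backend pieces together, a persisted chat message could look roughly like this; the table name, fields, and connection string are illustrative, not the project's actual schema.

```python
# Simplified sketch of a SQLModel table for persisted chat messages.
# Table and field names are illustrative, not the project's actual schema.
from datetime import datetime, timezone
from typing import Optional

from sqlmodel import Field, Session, SQLModel, create_engine, select


class ChatMessage(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    chat_id: str = Field(index=True)
    role: str  # "user" or "assistant"
    content: str
    created_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))


# Connection string mirrors the backend .env variables shown earlier.
engine = create_engine(
    "postgresql://postgres:your_password@localhost:5432/oneline_chat_app"
)
SQLModel.metadata.create_all(engine)

with Session(engine) as session:
    session.add(ChatMessage(chat_id="demo", role="user", content="Hello!"))
    session.commit()
    messages = session.exec(
        select(ChatMessage).where(ChatMessage.chat_id == "demo")
    ).all()
```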
cd backend
pytest
pytest --cov # With coverage
cd ui
npm test
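A backend API test with pytest and FastAPI's TestClient might look roughly like the following; the application import path and expected response shape are assumptions, and the project's real tests live in backend/tests.

```python
# Sketch of a backend API test with pytest and FastAPI's TestClient.
# Import path and expected response shape are assumptions.
from fastapi.testclient import TestClient

from oneline_chat.main import app  # assumed application import path

client = TestClient(app)


def test_list_models_returns_ok():
    response = client.get("/api/v1/models")
    assert response.status_code == 200
    assert response.json()  # expect a non-empty list of available models
```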
# Backend
cd backend
pip install -r requirements.txt
python -m oneline_chat.main
# Frontend
cd ui
npm install
npm run build
npm run preview
docker-compose -f docker-compose.prod.yml up -d
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Single: Single AI agent responds
- Multiple: Multiple AI agents can participate
- DeepSeek R1 8B (default)
- GPT-3.5 Turbo
- GPT-4
- Claude 3 Haiku/Sonnet
- Temperature: 0.0-2.0 (creativity level)
- Max Tokens: 50-4000 (response length)
- Save to DB: Toggle conversation persistence
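These settings typically travel in the body of the chat requests. The example below is an illustrative, non-authoritative non-streaming request; the exact field names are defined by the backend schema shown in the Swagger UI.

```python
# Illustrative non-streaming chat request carrying the settings above.
# Field names are assumptions; the authoritative schema is in the Swagger UI.
import requests

payload = {
    "message": "Summarize Server-Sent Events in one sentence.",
    "model": "deepseek-r1:8b",
    "temperature": 0.7,   # 0.0-2.0, higher means more varied output
    "max_tokens": 512,    # 50-4000, caps the response length
    "save_to_db": True,   # toggle conversation persistence
}

response = requests.post(
    "http://localhost:8000/api/v1/chat/completions",
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```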