A modern full-stack application for interacting with Groq AI models through a clean, responsive chat interface.
- Interactive Chat Interface: User-friendly chat UI similar to ChatGPT and Claude
- Multiple AI Models: Support for various Groq models, including LLaMA 3.3, DeepSeek R1, and Qwen
- Customizable Parameters: Adjust temperature and token length for different response styles
- Streaming Support: Option for real-time streaming responses
- Responsive Design: Works seamlessly on desktop and mobile devices
- Dark Mode Support: Automatic dark/light theme based on system preferences
This project consists of two main components:
- Backend API: FastAPI server that proxies requests to the Groq API
- Frontend UI: React + TypeScript application with a modern chat interface
- Node.js (v16+)
- Python (v3.8+)
- Groq API key (obtain from Groq's website)
- Clone the repository:

  ```bash
  git clone https://github.com/Abhay-Kanwasi/Groq-powered-Chatbot.git
  cd Groq-powered-Chatbot/backend
  ```

- Create a virtual environment and install dependencies:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  pip install -r requirements.txt
  ```
- Create a `.env` file with your Groq API key:

  ```
  GROQ_API_KEY=your_groq_api_key_here
  ```

- Start the backend server:

  ```bash
  uvicorn app:app --reload
  ```
The FastAPI backend will run on http://localhost:8000.
- Navigate to the frontend directory:

  ```bash
  cd ../frontend
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Start the development server:

  ```bash
  npm run dev
  ```
The React frontend will run on http://localhost:3000.
```
groq-chatbot/
├── backend/                # FastAPI backend
│   ├── app.py              # Main FastAPI application
│   ├── requirements.txt    # Python dependencies
│   └── .env                # Environment variables
│
├── frontend/               # React frontend
│   ├── src/
│   │   ├── api/            # API integration
│   │   ├── components/     # React components
│   │   ├── store/          # Zustand state management
│   │   ├── types.ts        # TypeScript types
│   │   ├── App.tsx         # Main application component
│   │   └── main.tsx        # Entry point
│   ├── package.json        # Node.js dependencies
│   ├── vite.config.ts      # Vite configuration
│   └── tsconfig.json       # TypeScript configuration
│
└── README.md               # Project documentation
```
| Endpoint | Method | Description |
|---|---|---|
| `/chat` | POST | Send a chat message and get an AI response |
| `/models` | GET | Get a list of available AI models |
| `/health` | GET | Health check endpoint |
The frontend communicates with the backend using Axios for HTTP requests and React Query for data fetching/caching.
- FastAPI: High-performance web framework for building APIs
- httpx: Asynchronous HTTP client for Python
- python-dotenv: Environment variable management
- React: UI library for building user interfaces
- TypeScript: Typed JavaScript for better developer experience
- Vite: Next-generation frontend tooling
- Zustand: Lightweight state management
- React Query: Data fetching and caching library
- Tailwind CSS: Utility-first CSS framework
- shadcn/ui: Reusable UI components
- Lucide React: Beautiful icons
- `GROQ_API_KEY`: Your Groq API key
- `GROQ_ORGANIZATION_ID`: Groq organization ID
- Available models are defined in the `MODELS` dictionary in `app.py`
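A sketch of what the `MODELS` dictionary in `app.py` might contain, paired with the shape `GET /models` could return; the display names and return shape are assumptions based on the models the UI exposes:

```python
from typing import Dict, List

# Map model IDs accepted by the Groq API to human-readable display names
# (IDs taken from the settings panel; display names are illustrative).
MODELS: Dict[str, str] = {
    "deepseek-r1-distill-llama-70b": "DeepSeek R1 Distill LLaMA 70B",
    "llama-3.3-70b-versatile": "LLaMA 3.3 70B Versatile",
    "qwen-qwq-32b": "Qwen QwQ 32B",
    "qwen-2.5-coder-32b": "Qwen 2.5 Coder 32B",
}


def list_models() -> List[dict]:
    # One entry per supported model, suitable as a /models response body.
    return [{"id": mid, "name": name} for mid, name in MODELS.items()]
```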
The chat settings can be adjusted via the settings panel:
- Model: Select which Groq model to use (available models: deepseek-r1-distill-llama-70b, llama-3.3-70b-versatile, qwen-qwq-32b, qwen-2.5-coder-32b)
- Temperature: Adjust from 0.0 (more deterministic) to 1.0 (more creative)
- Max Tokens: Set the maximum length of responses (100-2000)
- Stream: Toggle streaming mode for real-time responses
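These settings travel with each chat request. A stdlib-only sketch of calling the backend with them; the field names in the request body are assumptions mirroring the settings panel:

```python
import json
import urllib.request


def build_chat_body(message: str, model: str = "llama-3.3-70b-versatile",
                    temperature: float = 0.7, max_tokens: int = 500,
                    stream: bool = False) -> dict:
    # Mirror the settings panel: model, temperature (0.0-1.0),
    # max tokens (100-2000 in the UI), and the streaming toggle.
    return {
        "message": message,
        "model": model,
        "temperature": temperature,
        "max_tokens": max_tokens,
        "stream": stream,
    }


def send_chat(message: str, base_url: str = "http://localhost:8000",
              **settings) -> dict:
    # POST the request to the local backend and decode the JSON reply.
    data = json.dumps(build_chat_body(message, **settings)).encode()
    req = urllib.request.Request(
        f"{base_url}/chat", data=data,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```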
- Make changes to the backend or frontend code
- Backend changes will automatically reload with uvicorn's `--reload` flag
- Frontend changes will automatically reload with Vite's hot module replacement
To deploy the backend, install dependencies and run with a production ASGI server such as Gunicorn with Uvicorn workers:

```bash
cd backend
pip install -r requirements.txt
gunicorn app:app -w 4 -k uvicorn.workers.UvicornWorker
```

To deploy the frontend, build the production bundle:

```bash
cd frontend
npm run build
```

This creates a `dist` directory with production-ready static files that can be served by any static file server.
- The backend should validate all inputs
- In production, configure CORS properly to restrict access
- Don't expose your Groq API key in client-side code
- Consider adding rate limiting for the chat endpoint
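Input validation, the first point above, can live directly on the request model. A hedged sketch using Pydantic field constraints that mirror the UI's ranges; the field names and bounds are assumptions:

```python
from pydantic import BaseModel, Field


class ChatRequest(BaseModel):
    # Reject empty or oversized messages before they reach the Groq API.
    message: str = Field(min_length=1, max_length=4000)
    model: str = "llama-3.3-70b-versatile"
    # Bounds mirror the settings panel: temperature 0.0-1.0, tokens 100-2000.
    temperature: float = Field(default=0.7, ge=0.0, le=1.0)
    max_tokens: int = Field(default=500, ge=100, le=2000)
```

With this model as the endpoint's parameter, FastAPI rejects out-of-range values with a 422 automatically, so the handler never sees them.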
Contributions are welcome! Please feel free to submit a Pull Request.
- Groq for providing the AI API
- FastAPI for the backend framework
- React for the frontend library
- Tailwind CSS for styling