# Groq Chat App

A simple chat application demonstrating integration with the Groq LLM inference API via the `groq-sdk`. The app features a React + Vite frontend and a Fastify backend that streams chat completion responses and keeps session-based chat history.
## Features

- React + Vite client with chat UI and polished message bubbles
- Fastify server forwarding chat requests to Groq LLM models via `groq-sdk`
- Streaming assistant replies via Server-Sent Events (SSE) with incremental UI updates
- Session-based chat history persisted in `localStorage`
- CORS-enabled backend for local development
## Prerequisites

- Node.js v18+
- A Groq API key (available from the Groq Console)
- npm or yarn package manager
## Setup

### 1. Clone the repository

```bash
git clone https://github.com/yourusername/groq-chat-app.git
cd groq-chat-app
```
### 2. Configure your API key

Create a `.env` file in the `server` folder:

```
GROQ_API_KEY=your-groq-api-key
```

Replace `your-groq-api-key` with your actual API key.
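The server reads this key from the environment when it creates the Groq client. A minimal sketch of that step (assuming `dotenv` loads the file; the actual `server/index.js` may differ):

```js
// Sketch only: load server/.env (assumes the dotenv package is installed).
import "dotenv/config";
import Groq from "groq-sdk";

// groq-sdk also falls back to process.env.GROQ_API_KEY if no key is passed.
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });
```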
### 3. Install dependencies

```bash
cd server
npm install
cd ../client
npm install
```
### 4. Start the server

From the `server` folder:

```bash
npm run dev
```

The server will start on http://localhost:3001.
### 5. Start the client

From the `client` folder:

```bash
npm run dev
```

The client will start on http://localhost:5173.
## Usage

- Open the frontend URL in your browser.
- Type a message in the textarea and click Send.
- The assistant's replies stream into the chat bubbles in real time.
- Your conversation is saved per session in `localStorage`.
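Under the hood, the client keeps a session id in `localStorage` and reads the streamed reply incrementally. A rough sketch of that flow (not the actual `App.jsx`; the endpoint path `/chat` and the request shape are assumptions):

```js
// Sketch only: endpoint path and payload shape are assumptions.
// Reuse one session id per browser so chat history can be keyed to it.
let sessionId = localStorage.getItem("sessionId");
if (!sessionId) {
  sessionId = crypto.randomUUID();
  localStorage.setItem("sessionId", sessionId);
}

// POST the message, then read the SSE-style stream chunk by chunk,
// calling onToken for every "data: ..." payload as it arrives.
async function streamReply(message, onToken) {
  const res = await fetch("http://localhost:3001/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sessionId, message }),
  });
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (line.startsWith("data: ")) onToken(line.slice("data: ".length));
    }
  }
}
```

In the real app, each token would be appended to React state so the assistant's bubble updates as chunks arrive.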
## Project Structure

```
groq-chat-app/
│
├── server/              # Fastify backend
│   ├── index.js         # Main server file with streaming route
│   ├── package.json
│   └── .env             # Groq API key
│
├── client/              # Vite + React frontend
│   ├── src/
│   │   ├── App.jsx      # React app with streaming chat UI
│   │   └── App.css      # Chat UI styles
│   ├── package.json
│   └── vite.config.js
│
└── README.md            # This file
```
## Notes

- The streaming endpoint uses Server-Sent Events (SSE) via manual raw response writes (see the sketch after this list).
- CORS headers are set manually in the streaming route to allow requests from http://localhost:5173.
- Session management is done on the client using a UUID stored in `localStorage`.
- The UI supports word wrapping and preserves message formatting.
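For reference, a minimal sketch of such a streaming route (assumptions: the path is `/chat`, the model is `llama-3.1-8b-instant`, and the request body carries a `message` field; the real `server/index.js` may differ):

```js
import Fastify from "fastify";
import Groq from "groq-sdk";

const fastify = Fastify();
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

fastify.post("/chat", async (request, reply) => {
  // Take over the raw Node response so Fastify doesn't try to send its own.
  reply.hijack();
  reply.raw.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
    // Manual CORS header so the Vite dev server can call this route.
    "Access-Control-Allow-Origin": "http://localhost:5173",
  });

  // Ask Groq for a streaming completion and forward each token as an SSE event.
  const stream = await groq.chat.completions.create({
    model: "llama-3.1-8b-instant",
    messages: [{ role: "user", content: request.body.message }],
    stream: true,
  });
  for await (const chunk of stream) {
    const token = chunk.choices[0]?.delta?.content ?? "";
    if (token) reply.raw.write(`data: ${token}\n\n`);
  }
  reply.raw.end();
});

fastify.listen({ port: 3001 });
```

Writing SSE frames by hand like this avoids extra dependencies; the trade-off is that CORS and connection headers must be managed manually, which is why they appear in the route itself.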
## Troubleshooting

- **CORS errors:** Ensure the backend's CORS headers include your frontend origin.
- **Streaming issues:** Verify that your Node.js version supports streams and that your Groq API key is valid.
- **Slow responses:** Network latency or model load can affect streaming speed.
## License

MIT License
## Contact

For questions or feedback, please open an issue or contact [[email protected]].