Chat SDK is a free, open-source template built with Next.js and the AI SDK that helps you quickly build powerful chatbot applications.
Read Docs · Features · Model Providers · Running locally
- Next.js App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- AI SDK
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic chat and generative user interfaces
  - Supports xAI (default), OpenAI, Fireworks, and other model providers
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
  - Neon Serverless Postgres for saving chat history and user data
  - Vercel Blob for efficient file storage
- Auth.js
  - Simple and secure authentication
This template ships with OpenAI gpt-4o-mini as the default chat model. However, with the AI SDK, you can switch LLM providers to Anthropic, Cohere, and many more with just a few lines of code.
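Switching providers usually comes down to changing the model id in one place while the rest of the app keeps using a stable logical name. A minimal sketch of that idea (the non-OpenAI ids below are illustrative assumptions, not the template's actual configuration):

```typescript
// Hypothetical sketch: the app asks for "the chat model"; only this map
// changes when you swap providers. Ids other than gpt-4o-mini are assumed.
type Provider = 'openai' | 'anthropic' | 'cohere';

const chatModelFor: Record<Provider, string> = {
  openai: 'gpt-4o-mini',          // the template's default
  anthropic: 'claude-3-5-sonnet', // assumed example id
  cohere: 'command-r-plus',       // assumed example id
};

function chatModel(provider: Provider = 'openai'): string {
  return chatModelFor[provider];
}
```

With the AI SDK, the equivalent real change is swapping the provider package and model call (e.g. `openai(...)` for `anthropic(...)`) where the model is constructed.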
In the chat header you can enter your own OpenAI API key, which will be used for chat responses and artifact generation.
You will need to use the environment variables defined in `.env.example` to run Next.js AI Chatbot. It's recommended you use Vercel Environment Variables for this, but a `.env` file is all that is necessary.
> Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various AI and authentication provider accounts.
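For local development, a `.env` might look roughly like the following. Variable names other than `OPENAI_API_KEY` are assumptions based on the stack described above (Auth.js, Neon Postgres, Vercel Blob); check `.env.example` for the authoritative list:

```bash
AUTH_SECRET=****                  # assumed name; a random string for Auth.js
OPENAI_API_KEY=sk-...
POSTGRES_URL=postgres://user:pass@host/db   # assumed name (Neon Postgres)
BLOB_READ_WRITE_TOKEN=****                  # assumed name (Vercel Blob)
```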
1. Install Vercel CLI: `npm i -g vercel`
2. Link local instance with Vercel and GitHub accounts (creates `.vercel` directory): `vercel link`
3. Download your environment variables: `vercel env pull`
```bash
pnpm install
pnpm dev
```

Your app template should now be running on localhost:3000.
The webset artifact focuses on people and company profiles. It renders CSV data in a sleek, responsive table with search, per-column filters, sorting, column visibility, sticky headers, and CSV export. Recommended columns include: name, title, company, industry, website/company_url, linkedin_url, location, size, funding, description.
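The table's core operations (parse, search a column, sort) can be sketched as below. This is a simplified assumed shape, not the template's actual implementation, and the naive `split(',')` parser ignores quoted fields:

```typescript
// Minimal sketch of the webset table's data operations over CSV rows.
type Row = Record<string, string>;

// Parse header + data lines into keyed rows (no quoted-field handling).
function parseCsv(csv: string): Row[] {
  const [header, ...lines] = csv.trim().split('\n');
  const cols = header.split(',');
  return lines.map((line) => {
    const cells = line.split(',');
    return Object.fromEntries(cols.map((c, i) => [c, cells[i] ?? ''] as [string, string]));
  });
}

// Case-insensitive substring filter on a single column.
function filterByColumn(rows: Row[], col: string, query: string): Row[] {
  const q = query.toLowerCase();
  return rows.filter((r) => (r[col] ?? '').toLowerCase().includes(q));
}

// Stable, non-mutating lexicographic sort by a column.
function sortByColumn(rows: Row[], col: string): Row[] {
  return [...rows].sort((a, b) => (a[col] ?? '').localeCompare(b[col] ?? ''));
}
```

Per-column filters are just this filter applied once per active column; CSV export is the inverse of `parseCsv` over the visible columns.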
- Provider: `lib/ai/providers.ts` maps logical model ids to OpenAI models. `chat-model`, `artifact-model`, and `title-model` use `gpt-5`; `chat-model-reasoning` uses `gpt-5-reasoning`, with reasoning traces extracted via `<think>` tags.
- System prompts: `lib/ai/prompts.ts` selects prompts per model. Non-reasoning chat includes the Artifacts instructions so GPT-5 knows when and how to call tools.
- Tools/Artifacts: `app/(chat)/api/chat/route.ts` wires tools for weather, document create/update, suggestions, and Gmail. Artifact servers in `artifacts/*/server.ts` stream typed outputs (text/code/sheet/webset) to the UI.
- API key: Users set an OpenAI API key in the Settings overlay (stored as the `openai-api-key` cookie). If empty, the `OPENAI_API_KEY` env var is used.
- Model selection: The UI exposes a model selector (`components/model-selector.tsx`) constrained by entitlements. The selection is stored as the `chat-model` cookie and used by the chat API.
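The `<think>`-tag extraction mentioned above can be sketched as a small splitter; this is an assumed, simplified version, not the template's actual code:

```typescript
// Split a reasoning model's raw output into the <think>-tagged trace
// and the remaining user-visible answer (first tagged span only).
function splitThink(output: string): { reasoning: string; text: string } {
  const match = output.match(/<think>([\s\S]*?)<\/think>/);
  return {
    reasoning: match ? match[1].trim() : '',
    text: output.replace(/<think>[\s\S]*?<\/think>/, '').trim(),
  };
}
```

The UI can then stream `text` into the chat while rendering `reasoning` in a collapsible trace panel.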