A blank starter template combining Cedar-OS for the frontend AI interface with a Mastra backend for type-safe agent workflows.
- 🤖 AI Chat Integration: Built-in chat workflows powered by OpenAI through Mastra agents
- ⚡ Real-time Streaming: Server-sent events (SSE) for streaming AI responses
- 🎨 Beautiful UI: Cedar-OS components with 3D effects and modern design
- 🔧 Type-safe Workflows: Mastra-based backend with full TypeScript support
- 📡 Dual API Modes: Both streaming and non-streaming chat endpoints (see the streaming sketch after this list)
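As a rough illustration of the streaming mode, here is a minimal client-side sketch that consumes a streamed chat response with the Fetch API and a `ReadableStream` reader. The endpoint path `/chat/stream` and the request body are assumptions for illustration only; check the template's Mastra routes for the actual URLs and payloads.

```typescript
// Minimal sketch of consuming an SSE-style streamed chat response.
// NOTE: the endpoint path and request body below are assumptions, not the
// template's confirmed API; adjust them to match the actual Mastra routes.
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:4111/chat/stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each chunk may contain one or more SSE "data:" lines.
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}

streamChat("Hello from the starter template!").catch(console.error);
```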
The fastest way to get started:
npx cedar-os-cli plant-seed
Then select this template when prompted. This will set up the entire project structure and dependencies automatically.
This template demonstrates the Cedar chat interface.
For more details, see the Cedar Getting Started Guide.
- Node.js 18+
- OpenAI API key
- pnpm (recommended) or npm
- Clone and install dependencies:
git clone <repository-url>
cd cedar-blank
npm install
- Set up environment variables:
Create a `.env` file in the root directory:
OPENAI_API_KEY=your-openai-api-key-here
- Start the development servers:
npm run dev
This runs both the Next.js frontend and Mastra backend concurrently:
- Frontend: http://localhost:3000
- Backend API: http://localhost:4111
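Once both servers are running, you can sanity-check the backend directly from a script. The sketch below posts a message to a non-streaming chat endpoint on port 4111; the `/chat` route and the request/response shapes are assumptions for illustration, so consult the template's registered API routes for the real names.

```typescript
// Quick sanity check against the Mastra backend on port 4111.
// NOTE: the `/chat` route and the request/response shapes are assumptions
// used for illustration; consult the template's API routes for the actual
// endpoint names.
async function sendMessage(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:4111/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) throw new Error(`Backend returned ${res.status}`);
  const data = await res.json();
  return typeof data.content === "string" ? data.content : JSON.stringify(data);
}

sendMessage("Hello!").then(console.log).catch(console.error);
```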
- Simple Chat UI: See Cedar-OS components in action in a pre-configured chat interface
- Cedar-OS Components: Installed shadcn-style into your project so you can edit them locally (see the sketch after this list)
- Tailwind CSS, TypeScript, Next.js: The patterns you're used to in any Next.js project
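To show what shadcn-style installation means in practice, here is a minimal sketch of a Next.js page rendering a chat component imported from a local path. The import path, component name, and props are hypothetical placeholders; substitute whatever components this template actually installs.

```tsx
// app/page.tsx -- minimal sketch of rendering a locally installed component.
// The import path, component name, and props are hypothetical placeholders;
// substitute the components this template actually installs.
"use client";

import { ChatInterface } from "@/cedar/components/ChatInterface";

export default function Home() {
  return (
    <main className="flex min-h-screen items-center justify-center">
      {/* Because the component's source lives in your repo (shadcn style),
          you can edit it directly instead of overriding a library's styles. */}
      <ChatInterface title="Cedar starter chat" />
    </main>
  );
}
```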
MIT License - see LICENSE file for details.