Open-Fiesta is a web-based AI chat playground built with Next.js and TypeScript. It allows users to interact with and compare various large language models (LLMs) from different providers side-by-side. The application features a flexible interface where users can select up to five models to chat with simultaneously.
- Multi-Provider Support: Integrates with major AI providers like Google (Gemini), OpenRouter, and Ollama, which provide access to a wide range of open-source and proprietary models.
- Model Comparison View: The core feature is the ability to send a single prompt to multiple selected models and view their responses in a clean, organized grid layout.
- Persistent Chat Threads: Chat history is saved locally in the browser, allowing users to resume previous conversations.
- Customizable Experience: Users can manage their API keys, select their preferred models, and customize the user interface.
- Web Search and Image Attachments: Supports web search capabilities and image attachments for certain models like Gemini.
- Dockerized Environment: Comes with pre-configured Docker setups for both development and production, simplifying deployment.
- Framework: Next.js 14 (with App Router)
- Language: TypeScript
- Styling: Tailwind CSS
- API Handling: Next.js API routes are used to proxy requests to the different AI provider APIs. This allows for secure handling of API keys and normalization of responses.
- State Management: Primarily uses React hooks (`useState`, `useMemo`) and `useLocalStorage` for persistent state.
- Deployment: Configured for standalone Next.js output, suitable for containerized deployments.
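Since each provider returns a differently shaped payload, the API routes normalize responses before they reach the UI. A minimal sketch of that step, assuming simplified Gemini- and OpenRouter-style payloads (field names follow those providers' public APIs, not this repository's actual code):

```typescript
// A common reply shape the UI can render regardless of provider.
type NormalizedReply = { provider: string; text: string };

// OpenRouter follows the OpenAI chat-completions shape:
// choices[0].message.content holds the reply text.
function fromOpenRouter(payload: {
  choices: { message: { content: string } }[];
}): NormalizedReply {
  return {
    provider: "openrouter",
    text: payload.choices[0]?.message.content ?? "",
  };
}

// Gemini nests text inside candidates -> content -> parts.
function fromGemini(payload: {
  candidates: { content: { parts: { text: string }[] } }[];
}): NormalizedReply {
  const parts = payload.candidates[0]?.content.parts ?? [];
  return { provider: "gemini", text: parts.map((p) => p.text).join("") };
}
```

Keeping this mapping on the server (in the API routes) means the comparison grid only ever deals with one response shape.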
- Node.js and npm
- Docker (optional, for containerized workflows)
- Install Dependencies:

  ```bash
  npm install
  ```
- Configure Environment: Copy the example environment file and add your API keys:

  ```bash
  cp .env.example .env
  ```

  Edit `.env` to add your `GEMINI_API_KEY` and/or `OPENROUTER_API_KEY`.

- Run the Development Server:

  ```bash
  npm run dev
  ```

  The application will be available at `http://localhost:3000`.
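After copying `.env.example`, the file might look like the following (the key names come from the step above; the values are placeholders):

```bash
# .env — API keys read by the Next.js API routes
GEMINI_API_KEY=your-gemini-key-here
OPENROUTER_API_KEY=your-openrouter-key-here
```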
- Build the Application:

  ```bash
  npm run build
  ```

- Start the Production Server:

  ```bash
  npm run start
  ```
- Build Production Image:

  ```bash
  npm run docker:build
  ```

- Run Production Container:

  ```bash
  npm run docker:run
  ```

- Run in Development Mode with Docker Compose:

  ```bash
  npm run docker:dev
  ```
- Linting: The project uses ESLint for code quality. Run the linter with:

  ```bash
  npm run lint
  ```
- Component-Based Architecture: The UI is built with reusable React components located in the `components/` directory.
- API Routes: Server-side logic for communicating with AI providers is handled in the `app/api/` directory. Each provider has its own route for handling requests and normalizing responses.
- Model Definitions: The available AI models are defined in `lib/models.ts`. To add a new model, update this file.
- Styling: Utility-first CSS with Tailwind CSS is the standard. Custom styles are defined in `app/globals.css`.
- State Management: For client-side state, prefer React hooks. For state that needs to persist across sessions, use the `useLocalStorage` hook found in `lib/useLocalStorage.ts`.
- Types: TypeScript types are used throughout the project. Global or shared types are defined in `lib/types.ts`.
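A model entry in `lib/models.ts` might look roughly like the following. This is a hypothetical sketch of the shape — the field names, model IDs, and helper here are illustrative assumptions, not the repository's actual definitions:

```typescript
// Hypothetical shape for an entry in lib/models.ts.
type ModelDef = {
  id: string; // identifier sent to the provider's API
  label: string; // name shown in the model picker
  provider: "gemini" | "openrouter" | "ollama";
  supportsImages?: boolean; // e.g. Gemini accepts image attachments
};

// Example registry entries (illustrative, not the project's real list).
const MODELS: ModelDef[] = [
  {
    id: "gemini-1.5-flash",
    label: "Gemini 1.5 Flash",
    provider: "gemini",
    supportsImages: true,
  },
  {
    id: "meta-llama/llama-3-8b-instruct",
    label: "Llama 3 8B",
    provider: "openrouter",
  },
];

// A provider route could use a helper like this to find the models it serves.
function modelsForProvider(provider: ModelDef["provider"]): ModelDef[] {
  return MODELS.filter((m) => m.provider === provider);
}
```

With a registry like this, adding a model is a one-line change, and the UI and API routes both read from the same source of truth.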