This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
- Development server: `npm run dev` - starts Next.js development server
- Build: `npm run build` - creates production build
- Production server: `npm start` - runs production server
- Linting: `npm run lint` - runs ESLint with Next.js configuration
For the best development experience with hot reload:
- Setup: `npm run setup`, then edit the `.env.local` file with your OpenAI API key
- Start development: `npm run dev:local` - starts PostgreSQL in Docker + Next.js locally with hot reload
- Your app: open http://localhost:3000 ✨
Local Development (Hot Reload):
- `npm run setup` - creates local environment file
- `npm run dev:local` - starts database + app with hot reload
- `npm run db:start` - start only the database
- `npm run db:stop` - stop the database
- `npm run db:seed` - seed database with embeddings
Full Docker (Production-like):
- `npm run docker:full` - full Docker setup + seeding
- `npm run docker:prod` - production deployment
- Hot Reload: ✅ Works perfectly with local development
- Database: Runs in Docker on port 5433
- Environment: Uses `.env.local` for local development
- Changes: Automatically reflected in browser
- Database initialization: Automatic setup of pgvector extension and documents table
- Data persistence: PostgreSQL data stored in Docker volume `postgres_data`
- Connection: Application connects to database via Docker internal network
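The port split above (5433 locally, Docker internal network in containers) usually surfaces as a connection-string difference. The sketch below illustrates that split; the exact environment variable names and defaults are assumptions, so check the app's database configuration for the real ones.

```typescript
// Sketch: assembling the PostgreSQL connection string from environment
// variables. Variable names and defaults here are assumptions for
// illustration, not necessarily what the app reads.
function buildDatabaseUrl(env: Record<string, string | undefined>): string {
  const user = env.POSTGRES_USER ?? "pat_user";
  const password = env.POSTGRES_PASSWORD ?? "";
  const db = env.POSTGRES_DB ?? "pat_db";
  // Inside Docker the app would reach the database over the internal
  // network (e.g. host "postgres", port 5432); local development talks
  // to the Docker-published port 5433 on localhost.
  const host = env.POSTGRES_HOST ?? "localhost";
  const port = env.POSTGRES_PORT ?? "5433";
  return `postgres://${user}:${password}@${host}:${port}/${db}`;
}

console.log(buildDatabaseUrl({ POSTGRES_PASSWORD: "secret" }));
// postgres://pat_user:secret@localhost:5433/pat_db
```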
The application requires vector embeddings for RAG functionality. You can seed the database with documents:
- Automatic seeding with data download:

  ```bash
  docker-compose --profile seeding up seeder
  ```

  This will download data from the pat-data repository and create embeddings.

- Manual data seeding:
  - Place `.txt` files in a `./data` directory
  - Set `DOWNLOAD_DATA=false` in your environment
  - Run: `docker-compose --profile seeding up seeder`

- Seeding configuration:
  - `DOWNLOAD_DATA=true` - downloads data from the GitHub repository
  - `PAT_DATA_REPO` - repository URL for source documents
  - `CHUNK_SIZE=2500` - text chunk size for embeddings
  - `OVERLAP_SENTENCES=2` - sentence overlap between chunks

- Resetting data:

  ```bash
  docker-compose down -v && docker-compose up -d
  docker-compose --profile seeding up seeder
  ```
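To make the `CHUNK_SIZE` and `OVERLAP_SENTENCES` knobs concrete, here is a minimal chunker sketch. The seeder's actual algorithm may differ (this naive version splits sentences on `.`, `!`, `?` and measures chunks in characters); it only illustrates how the two settings interact.

```typescript
// Sketch of chunking with sentence overlap, as implied by CHUNK_SIZE and
// OVERLAP_SENTENCES. Not the seeder's actual implementation.
function chunkText(text: string, chunkSize = 2500, overlapSentences = 2): string[] {
  // Naive sentence split: runs ending in ., ! or ? plus trailing space.
  const sentences = text.match(/[^.!?]+[.!?]+(\s+|$)|[^.!?]+$/g) ?? [];
  const chunks: string[] = [];
  let current: string[] = [];
  let length = 0;
  for (const sentence of sentences) {
    if (length + sentence.length > chunkSize && current.length > 0) {
      chunks.push(current.join("").trim());
      // Carry the last N sentences into the next chunk for context.
      current = current.slice(-overlapSentences);
      length = current.reduce((n, s) => n + s.length, 0);
    }
    current.push(sentence);
    length += sentence.length;
  }
  if (current.length > 0) chunks.push(current.join("").trim());
  return chunks;
}
```

With a tiny chunk size and a one-sentence overlap, `chunkText("A. B. C. D.", 6, 1)` yields overlapping chunks where each chunk repeats the previous chunk's last sentence.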
Pat is a Next.js-based chatbot focused on philosophical discussions about cognitive science. The application uses a Retrieval Augmented Generation (RAG) architecture with PostgreSQL vector storage.
Frontend (app/)
- `page.tsx` - Main chat interface using Vercel AI SDK's `useChat` hook
- `components/` - Modular UI components (chatmessage, clearbutton, printbutton, sendbutton)
- Uses Tailwind CSS for styling with custom color scheme and typography
Backend API (app/api/)
- `message/route.ts` - Main chat endpoint that processes user messages
- `model-config.ts` - OpenAI integration with RAG functionality
- `model-prompts.ts` - System prompts and message processing
The system performs semantic search on each user message:
- Creates embeddings using OpenAI's `text-embedding-3-large`
- Queries PostgreSQL with pgvector for similar content using cosine distance (`<=>`)
- Injects top 3 matching excerpts into system prompt
- Streams responses using GPT-4o model
- User input → embedding generation → vector similarity search → context retrieval
- System prompt + retrieved context + conversation history → OpenAI API
- Streaming response back to frontend via Vercel AI SDK
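The similarity step in the flow above relies on pgvector's `<=>` operator, which computes cosine distance (1 minus cosine similarity), so the retrieval query orders by it ascending and keeps the closest 3 rows. The function below is a plain re-implementation for illustration only, not the app's code.

```typescript
// Cosine distance, mirroring pgvector's <=> operator:
// distance = 1 - (a · b) / (|a| * |b|). Lower = more similar.
// The retrieval query would look roughly like:
//   SELECT content FROM documents ORDER BY embedding <=> $1 LIMIT 3;
function cosineDistance(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return 1 - dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineDistance([1, 0], [1, 0])); // 0 (same direction)
console.log(cosineDistance([1, 0], [0, 1])); // 1 (orthogonal)
```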
- Vercel AI SDK (`ai`) - streaming chat interface and OpenAI integration
- OpenAI - embeddings and chat completions
- PostgreSQL + pgvector - vector similarity search
- LangChain - additional AI tooling (community package)
- Supabase - likely used for database hosting
The local/ directory contains scripts for:
- `create-embeddings-v2.js` / `create-embeddings.mjs` - document embedding creation
- `test-pg-vectorstore.mjs` - vector store testing
- Uses proxy agent for OpenAI requests
- Environment variables expected: `OPENAI_API_KEY`, PostgreSQL connection params
- Local storage for chat persistence
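Chat persistence via local storage typically amounts to serializing the message array under a fixed key. The sketch below shows one way this could look; the storage key, message shape, and helper names are assumptions, not the app's actual implementation (in the browser, `window.localStorage` satisfies the `KVStore` interface used here).

```typescript
// Hypothetical localStorage persistence for chat messages.
// Key name and message type are illustrative assumptions.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

type ChatMessage = { id: string; role: "user" | "assistant"; content: string };

const STORAGE_KEY = "pat-chat-history";

function saveMessages(store: KVStore, messages: ChatMessage[]): void {
  store.setItem(STORAGE_KEY, JSON.stringify(messages));
}

function loadMessages(store: KVStore): ChatMessage[] {
  const raw = store.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ChatMessage[]) : [];
}
```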
The easiest way to deploy to production:
```bash
./deploy.sh
```

This automated script handles everything:
- ✅ Validates system requirements (Docker, Docker Compose)
- ✅ Sets up environment configuration
- ✅ Builds and deploys all services
- ✅ Provides nginx configuration guidance
- ✅ Optional database seeding
- ✅ Shows deployment status and useful commands
If you prefer manual control:
```bash
# Copy environment template
cp .env.template .env

# Edit with your values
nano .env   # Set OPENAI_API_KEY and POSTGRES_PASSWORD

# Deploy application and database
docker-compose -f docker-compose.production.yml up -d --build

# Optional: Seed database with documents
docker-compose -f docker-compose.production.yml --profile seeding up seeder

# Copy nginx configuration
sudo cp nginx.conf /etc/nginx/sites-available/pat

# Edit domain name
sudo nano /etc/nginx/sites-available/pat   # Replace 'your-domain.com'

# Enable site
sudo ln -s /etc/nginx/sites-available/pat /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
```

[Internet] → [System Nginx:80/443] → [Docker App:3000] → [Docker PostgreSQL:5432]
Components:
- Next.js Application: Dockerized, optimized build, localhost-only access
- PostgreSQL Database: pgvector extension, persistent storage, no external access
- System Nginx: Reverse proxy, SSL termination, public access
Create .env file with these required values:
```bash
# REQUIRED
OPENAI_API_KEY=your_openai_api_key_here
POSTGRES_PASSWORD=your_secure_password_here

# OPTIONAL
POSTGRES_USER=pat_user
POSTGRES_DB=pat_db
DOWNLOAD_DATA=true
PAT_DATA_REPO=https://github.com/Vassar-Cognitive-Science/pat-data.git
```

```bash
# View status
docker-compose -f docker-compose.production.yml ps

# View logs
docker-compose -f docker-compose.production.yml logs -f

# Stop deployment
docker-compose -f docker-compose.production.yml down

# Update deployment
git pull && docker-compose -f docker-compose.production.yml up -d --build

# Backup database
docker-compose -f docker-compose.production.yml exec postgres \
  pg_dump -U pat_user pat_db > backup.sql
```

Common Issues:
- Port conflicts: App runs on localhost:3000, ensure no conflicts
- Environment variables: Check `.env` file has required values set
- Database connection: PostgreSQL starts before app, check health status
- Nginx configuration: Verify domain name and proxy settings
Health Checks:
```bash
# Test app directly
curl http://127.0.0.1:3000

# Check container health
docker-compose -f docker-compose.production.yml ps

# View detailed logs
docker-compose -f docker-compose.production.yml logs postgres
docker-compose -f docker-compose.production.yml logs app
```