Backend implementation for AI-powered dream analysis with Inngest workflows, permanent sandbox IDs, and comprehensive pattern recognition.
Trigger: User hits the "Save" button after a journal entry.
Process:
- Create permanent sandbox ID
- OpenAI agent analyzes single dream
- Generate personalized response (funny remarks, jokes, recommendations)
- Store analysis results in database
Trigger: Button click on the patterns page.
Process:
- Create permanent sandbox ID
- OpenAI agent analyzes ALL journal entries
- Find patterns across dreams
- Generate comprehensive insights and recommendations
- Store pattern analysis in database
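The two triggers above can be sketched as small event builders. The event names (`dream.created`, `pattern.analysis.requested`) match the Inngest jobs later in this plan; the payload field names are illustrative assumptions, not a fixed API.

```typescript
// Sketch of the events that kick off each workflow. Payload shapes are assumptions.
import { randomUUID } from "node:crypto";

// The sandbox ID is minted once at save time and persisted with the dream,
// so it stays permanent across retries and re-analysis.
export function buildDreamCreatedEvent(userId: string, dreamContent: string) {
  return { name: "dream.created", data: { userId, dreamContent, sandboxId: randomUUID() } };
}

export function buildPatternAnalysisEvent(userId: string) {
  return { name: "pattern.analysis.requested", data: { userId, sandboxId: randomUUID() } };
}
```

In the route handlers these payloads would be passed to `inngest.send(...)` to start the background work.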
Setup Inngest Development Server
- Install Inngest CLI
- Configure inngest.json
- Set up permanent sandbox environment
- Test webhook connections
Database Schema Updates
- Add `sandbox_id` field to dreams table
- Add `analysis_type` field (individual vs. pattern)
- Add `processing_status` field
- Create indexes for performance
API Endpoint Enhancement
- Update `/api/dreams` POST to trigger Inngest workflow
- Add sandbox ID generation
- Implement immediate response + background processing
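The immediate-response pattern can be sketched framework-free; `sendEvent` is injected here as a stand-in for `inngest.send`, and the names and shapes are assumptions for illustration.

```typescript
// Sketch of "respond immediately, analyze in background". sendEvent is injected
// so the handler stays testable; in the app it would be inngest.send.
import { randomUUID } from "node:crypto";

type SendEvent = (e: { name: string; data: Record<string, unknown> }) => Promise<void>;

export async function handleDreamPost(
  body: { userId: string; content: string },
  sendEvent: SendEvent,
) {
  const sandboxId = randomUUID(); // permanent sandbox ID, generated at save time
  await sendEvent({
    name: "dream.created",
    data: { userId: body.userId, dreamContent: body.content, sandboxId },
  });
  // 202 Accepted: the dream is saved and queued; analysis finishes in the background.
  return { status: 202, sandboxId };
}
```

Returning 202 with the sandbox ID lets the frontend poll for analysis status without blocking the save.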
Inngest Job: Single Dream Analysis
- Create `dream-individual-analysis` job
- Implement OpenAI integration with humor prompts
- Add error handling and retries
- Store results in `dream_analyses` table
AI Prompt Engineering
- Create base system prompt for individual analysis
- Add humor detection and response generation
- Include recommendation engine based on dream content
- Test with various dream types (funny, scary, emotional)
Frontend Pattern Trigger
- Add "Generate New AI Analysis" button to patterns page
- Implement API call to trigger pattern analysis
- Show processing status and progress
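The progress display reduces to polling a status endpoint until the job settles. A minimal sketch, with `fetchStatus` injected so the logic stays testable (in the app it would GET the status endpoint; names and statuses are assumptions):

```typescript
// Poll an injected status fetcher until the job completes, fails, or times out.
export async function pollUntilDone(
  fetchStatus: () => Promise<{ status: string }>,
  { intervalMs = 1000, maxAttempts = 120 } = {},
): Promise<string> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { status } = await fetchStatus();
    if (status === "completed" || status === "failed") return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return "timed_out";
}
```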
API Endpoint: Pattern Analysis
- Create `/api/patterns/analyze` POST endpoint
- Fetch all user dreams from database
- Trigger Inngest pattern analysis job
- Return job ID for status tracking
Inngest Job: Pattern Analysis
- Create `dream-pattern-analysis` job
- Fetch all user dreams efficiently
- Implement batch processing for large datasets
- Generate comprehensive pattern insights
- Store in `user_patterns` table
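Batch processing for large dream sets mostly reduces to chunking the fetched rows, with each chunk then handled in its own `step.run` so a failure only retries that batch. A minimal chunking helper (batch size is illustrative):

```typescript
// Split a large dream list into fixed-size batches for per-batch processing.
export function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}
```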
Performance Improvements
- Add database indexes for dream queries
- Implement pagination for large dream collections
- Add caching for frequently accessed data
- Optimize SQL queries for pattern analysis
Data Integrity
- Add foreign key constraints
- Implement soft deletes
- Add audit logging
- Create backup procedures
Advanced AI Features
- Implement user-specific prompts based on history
- Add emotional intelligence to responses
- Create joke database for humor enhancement
- Implement learning from user feedback
Response Quality
- A/B testing for different prompt styles
- User rating system for AI responses
- Continuous prompt improvement
- Quality assurance checks
Robust Error Handling
- Implement retry mechanisms
- Add fallback responses
- Handle OpenAI rate limits
- Database connection error handling
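Inngest steps get retries from the platform itself, so an explicit backoff helper is mainly for calls made outside a step, e.g. handling OpenAI 429 rate limits. A sketch with illustrative defaults:

```typescript
// Retry with exponential backoff for transient failures such as rate limits.
export async function withRetry<T>(
  fn: () => Promise<T>,
  { retries = 3, baseDelayMs = 500 } = {},
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === retries) break;
      // Backoff doubles each attempt: 500ms, 1s, 2s, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```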
Monitoring & Logging
- Add comprehensive logging
- Implement health checks
- Monitor processing times
- Alert system for failures
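Processing-time monitoring can start as simple as a timing wrapper around each unit of work; a real setup would ship the duration to a metrics backend rather than the console (names here are illustrative):

```typescript
// Wrap an async unit of work and log how long it took.
export async function withTiming<T>(label: string, fn: () => Promise<T>): Promise<T> {
  const start = Date.now();
  try {
    return await fn();
  } finally {
    console.log(`[metrics] ${label} took ${Date.now() - start}ms`);
  }
}
```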
```typescript
// File: src/lib/inngest-jobs/dream-analysis.ts
import { inngest } from "@/inngest/client"; // client path is an assumption; point at your shared Inngest client

export const dreamAnalysisJob = inngest.createFunction(
  { id: "dream-individual-analysis" },
  { event: "dream.created" },
  async ({ event, step }) => {
    // Implementation details
  }
);
```

```typescript
// File: src/lib/inngest-jobs/pattern-analysis.ts
import { inngest } from "@/inngest/client"; // client path is an assumption; point at your shared Inngest client

export const patternAnalysisJob = inngest.createFunction(
  { id: "dream-pattern-analysis" },
  { event: "pattern.analysis.requested" },
  async ({ event, step }) => {
    // Implementation details
  }
);
```

```sql
ALTER TABLE dreams ADD COLUMN sandbox_id UUID;
ALTER TABLE dreams ADD COLUMN analysis_type VARCHAR(50); -- 'individual' or 'pattern'
ALTER TABLE dreams ADD COLUMN processing_status VARCHAR(50) DEFAULT 'pending';
CREATE INDEX idx_dreams_sandbox_id ON dreams(sandbox_id);
CREATE INDEX idx_dreams_processing_status ON dreams(processing_status);

CREATE TABLE analysis_jobs (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  user_id VARCHAR(255) NOT NULL,
  job_type VARCHAR(50) NOT NULL, -- 'individual' or 'pattern'
  sandbox_id UUID NOT NULL,
  status VARCHAR(50) DEFAULT 'running',
  created_at TIMESTAMP DEFAULT NOW(),
  completed_at TIMESTAMP,
  error_message TEXT
);
```

- POST `/api/dreams` - Save dream + trigger analysis
- GET `/api/dreams/[id]/analysis-status` - Check processing status
- POST `/api/patterns/analyze` - Trigger pattern analysis
- GET `/api/patterns/analysis-status/[jobId]` - Check status
```typescript
const INDIVIDUAL_ANALYSIS_PROMPT = `
You are a friendly AI dream analyst with a sense of humor.
Analyze this dream and provide:
1. Light-hearted observations or funny remarks if appropriate
2. Personalized recommendations based on dream content
3. Emotional insights
4. Symbolic interpretations

Dream: {dreamContent}
Context: {userContext}
`;
```

```typescript
const PATTERN_ANALYSIS_PROMPT = `
You are an expert dream pattern analyst.
Analyze all dreams and provide:
1. Recurring themes and their significance
2. Emotional patterns over time
3. Sleep quality correlations
4. Wellness recommendations
5. Behavioral insights

Dreams: {allDreams}
User Profile: {userProfile}
`;
```

- Test individual job functions
- Test AI prompt generation
- Test database operations
- Test error handling
- Test full workflow end-to-end
- Test Inngest job execution
- Test database transactions
- Test API endpoints
- Test with large dream datasets
- Test concurrent job processing
- Test OpenAI API rate limits
- Test database query performance
- Inngest dev server running
- Database migrations applied
- Environment variables configured
- OpenAI API key configured
- Inngest production setup
- Database backup procedures
- Error monitoring configured
- Performance monitoring setup
- Dreams analyzed within 30 seconds of saving
- Pattern analysis completes within 2 minutes
- 99% job success rate
- Meaningful and humorous AI responses
- Handle 100+ concurrent dream analyses
- Process 1000+ dreams for pattern analysis
- Sub-second API response times
- Efficient database queries
- Immediate: Set up Inngest development environment
- Week 1: Implement individual dream analysis workflow
- Week 2: Implement pattern analysis workflow
- Week 3: AI prompt optimization and testing
- Week 4: Performance optimization and monitoring
- All sandbox IDs must be permanent and persistent
- AI responses should be personalized and contextual
- Error handling must be comprehensive
- Performance must scale with user growth
- User experience should be seamless and engaging
Status: Phases 1-3 Complete - Ready for Database Migration and Testing
Last Updated: July 7, 2025
Before testing, you must run the database migration manually in Supabase.
See: DATABASE_MIGRATION_INSTRUCTIONS.md for detailed steps.
Individual Dream Analysis: When you save a dream, it will:
- Generate a permanent sandbox ID
- Trigger background AI analysis with Dr. DreamBot
- Provide humorous observations and recommendations
- Store results in the database
Pattern Analysis: When you click "New AI Analysis", it will:
- Analyze ALL your dreams for patterns
- Generate comprehensive wellness insights
- Provide personalized recommendations
- Track analysis progress
Inngest Integration: Both dev servers must be running:
- Next.js dev server: http://localhost:3001
- Inngest dev server: http://localhost:8288