# Voice Journal Environment Variables
# Copy this file to .env and fill in your actual values

# Sarvam AI API Key (for Speech-to-Text and Text-to-Speech)
# Get your key at: https://console.sarvam.ai/
SARVAM_API_KEY=your_sarvam_api_key_here

# Redis Cloud Connection URL
# Format: redis://default:<password>@<host>:<port>
# Get a free Redis Cloud instance at: https://redis.com/try-free/
REDIS_URL=redis://default:your_password@your-redis-host.cloud.redislabs.com:12345

# Agent Memory Server URL (local server for memory management)
MEMORY_SERVER_URL=http://localhost:8000

# OpenAI API Key (for semantic router embeddings)
# Get your key at: https://platform.openai.com/api-keys
OPENAI_API_KEY=sk-your_openai_api_key_here

# Google OAuth Client ID for browser sign-in
# Use the same value for frontend and backend Google login verification
GOOGLE_CLIENT_ID=your_google_oauth_client_id_here
NEXT_PUBLIC_GOOGLE_CLIENT_ID=your_google_oauth_client_id_here

# Frontend API URL (used by Docker/production frontend builds)
NEXT_PUBLIC_API_URL=http://localhost:8080

# Allowed frontend origins for FastAPI CORS (comma-separated)
CORS_ORIGINS=http://localhost:3000
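# Since CORS_ORIGINS is comma-separated, a deployed frontend can be allowed
# alongside local development; a sketch (the domain below is a placeholder):
# CORS_ORIGINS=http://localhost:3000,https://your-frontend.example.com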

# Ollama Configuration (for LLM responses)
# For App Runner + EC2, point this at the Ollama EC2 host in the same AWS region.
# Example: http://10.0.12.34:11434
# Optional: if Ollama is unavailable, the backend falls back to simpler text responses
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2

# Optional Google Calendar integration
# Place `credentials.json` in the project root, then authorize once to generate `token.json`.
# See README.md for details.