51 changes: 51 additions & 0 deletions .github/workflows/docker-build.yml
@@ -0,0 +1,51 @@
name: Docker Build and Test

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main, develop ]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Build Backend
        run: |
          cd Backend
          docker build -t inpactai-backend:test .
      - name: Build Frontend
        run: |
          cd Frontend
          docker build -t inpactai-frontend:test .
      - name: Start services
        run: |
          docker compose up -d
          sleep 30
Comment on lines +31 to +33
Copilot AI Dec 13, 2025

The workflow starts services with 'docker compose up -d' on line 32 but doesn't create or configure the required .env files first. The services will fail to start properly without valid environment configurations. Consider adding a step to create mock .env files with placeholder values, or document that this workflow requires repository secrets to be configured.

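A minimal sketch of such a step, assuming docker-compose.yml reads Backend/.env and that the placeholder values in the committed .env.example are enough for the containers to boot (real secrets would still be required for full functionality); if the frontend needs its own .env, a similar copy step would apply, but no Frontend/.env.example appears in this diff:

      # Hypothetical step — copies the example file added in this PR as a mock config.
      - name: Create mock .env files
        run: |
          cp Backend/.env.example Backend/.env
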
Comment on lines +32 to +34
Copilot AI Dec 13, 2025

The workflow waits 30 seconds after starting services (line 33) before checking health, but this is arbitrary and may not be sufficient for all environments. The docker-compose.yml includes health checks that indicate when services are ready. Consider using 'docker compose up -d --wait' instead, which waits for health checks to pass rather than using a fixed sleep duration.

Suggested change:
-          docker compose up -d
-          sleep 30
+          docker compose up -d --wait

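Note that --wait only waits for services that define healthchecks. The comment above states docker-compose.yml already includes them; for reference, such a block typically looks like the sketch below, where the test command, intervals, and service name are illustrative rather than the repository's actual values:

  backend:
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/"]
      interval: 10s
      timeout: 5s
      retries: 5
      start_period: 20s
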
      - name: Check backend health
        run: |
          curl -f http://localhost:8000/ || exit 1
      - name: Check frontend health
        run: |
          curl -f http://localhost:5173/ || exit 1
      - name: Show logs on failure
        if: failure()
        run: |
          docker compose logs
      - name: Cleanup
        if: always()
        run: |
          docker compose down -v
21 changes: 21 additions & 0 deletions Backend/.dockerignore
@@ -0,0 +1,21 @@
__pycache__
*.pyc
*.pyo
*.pyd
.Python
*.so
.env
.venv
env/
venv/
ENV/
.git
.gitignore
.pytest_cache
.coverage
htmlcov/
dist/
build/
*.egg-info/
.DS_Store
*.log
12 changes: 12 additions & 0 deletions Backend/.env.example
@@ -0,0 +1,12 @@
user=postgres
password=your_postgres_password
host=your_postgres_host
port=5432
dbname=postgres
GROQ_API_KEY=your_groq_api_key
SUPABASE_URL=your_supabase_url
SUPABASE_KEY=your_supabase_key
GEMINI_API_KEY=your_gemini_api_key
YOUTUBE_API_KEY=your_youtube_api_key
REDIS_HOST=redis
REDIS_PORT=6379
18 changes: 18 additions & 0 deletions Backend/Dockerfile
@@ -0,0 +1,18 @@
FROM python:3.10-slim

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends \
gcc \
libpq-dev \
curl \
&& rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
33 changes: 33 additions & 0 deletions Backend/Dockerfile.prod
@@ -0,0 +1,33 @@
FROM python:3.10-slim AS builder

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends \
gcc \
libpq-dev \
&& rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir --user -r requirements.txt

FROM python:3.10-slim

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends \
libpq5 \
&& rm -rf /var/lib/apt/lists/* \
&& groupadd -r appuser && useradd -r -g appuser appuser

COPY --from=builder /root/.local /root/.local
COPY . .

RUN chown -R appuser:appuser /app

USER appuser

ENV PATH=/root/.local/bin:$PATH

EXPOSE 8000

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
42 changes: 40 additions & 2 deletions Backend/app/main.py
@@ -1,5 +1,6 @@
from fastapi import FastAPI
from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from starlette.middleware.base import BaseHTTPMiddleware
from .db.db import engine
from .db.seed import seed_db
from .models import models, chat
@@ -9,13 +10,21 @@
from sqlalchemy.exc import SQLAlchemyError
import logging
import os
import time
from dotenv import load_dotenv
from contextlib import asynccontextmanager
from app.routes import ai

# Load environment variables
load_dotenv()

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)


# Async function to create database tables with exception handling
async def create_tables():
@@ -38,13 +47,42 @@ async def lifespan(app: FastAPI):
print("App is shutting down...")


# Custom middleware for logging and timing
class RequestMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next):
        start_time = time.time()

        logger.info(f"Incoming: {request.method} {request.url.path}")

        response = await call_next(request)

        process_time = time.time() - start_time
        response.headers["X-Process-Time"] = str(process_time)
        response.headers["X-Content-Type-Options"] = "nosniff"
        response.headers["X-Frame-Options"] = "DENY"
        response.headers["X-XSS-Protection"] = "1; mode=block"
Copilot AI Dec 13, 2025

The security headers added in lines 61-63 are good practice, but they're missing important security headers like "Strict-Transport-Security" for HTTPS enforcement and "Content-Security-Policy" for XSS protection. While the existing headers are beneficial, a more complete security header set would better protect the application.

Suggested change:
-        response.headers["X-XSS-Protection"] = "1; mode=block"
+        response.headers["X-XSS-Protection"] = "1; mode=block"
+        response.headers["Strict-Transport-Security"] = "max-age=63072000; includeSubDomains; preload"
+        response.headers["Content-Security-Policy"] = "default-src 'none';"


logger.info(f"Completed: {request.method} {request.url.path} - {response.status_code} ({process_time:.3f}s)")

return response

# Initialize FastAPI
app = FastAPI(lifespan=lifespan)

# Add custom middleware
app.add_middleware(RequestMiddleware)

# Add CORS middleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:5173"],
    allow_origins=[
        "http://localhost:5173",
        "http://localhost:5174",
        "http://localhost:5175",
        "http://localhost:5176",
        "http://frontend:5173",
        "http://127.0.0.1:5173"
Comment on lines +78 to +84
Copilot AI Dec 13, 2025

The CORS origins list includes redundant entries. "http://localhost:5173" appears on line 79 and "http://127.0.0.1:5173" on line 85, but these refer to the same host (localhost resolves to 127.0.0.1). Similarly, including multiple sequential ports (5173-5176) without clear documentation of why each is needed creates maintenance overhead. Consider documenting the purpose of each origin or consolidating to only necessary entries.

Suggested change:
-    allow_origins=[
-        "http://localhost:5173",
-        "http://localhost:5174",
-        "http://localhost:5175",
-        "http://localhost:5176",
-        "http://frontend:5173",
-        "http://127.0.0.1:5173"
+    # CORS origins for local development and Dockerized frontend.
+    # Only include necessary ports. If you need to add more, document the reason.
+    allow_origins=[
+        "http://localhost:5173",  # Vite default dev server
+        "http://localhost:5174",  # Alternate dev server (if used)
+        "http://frontend:5173",   # Docker Compose frontend service

    ],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
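
Following up on the CORS comment above: another way to keep the origin list maintainable is to read it from the environment. This is only a sketch — it assumes a comma-separated CORS_ORIGINS variable, which is not currently part of .env.example:

import os
from fastapi.middleware.cors import CORSMiddleware

# Hypothetical CORS_ORIGINS variable, e.g. "http://localhost:5173,http://frontend:5173".
# Falls back to the Vite dev server origin when unset.
cors_origins = [
    origin.strip()
    for origin in os.getenv("CORS_ORIGINS", "http://localhost:5173").split(",")
    if origin.strip()
]

# `app` is the FastAPI instance created earlier in main.py.
app.add_middleware(
    CORSMiddleware,
    allow_origins=cors_origins,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
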
24 changes: 18 additions & 6 deletions Backend/app/routes/post.py
@@ -18,25 +18,37 @@
import uuid
from datetime import datetime, timezone

# Load environment variables
load_dotenv()
url: str = os.getenv("SUPABASE_URL")
key: str = os.getenv("SUPABASE_KEY")
supabase: Client = create_client(url, key)

url: str = os.getenv("SUPABASE_URL", "")
key: str = os.getenv("SUPABASE_KEY", "")

if not url or not key or "your-" in url:
Copilot AI Dec 13, 2025

The error handling on line 26-27 uses an overly broad check for placeholder values by looking for "your-" in the URL. This could create false positives if a legitimate Supabase project URL happens to contain this substring. Consider checking against the exact example values from .env.example or using a more specific pattern.

Suggested change:
-if not url or not key or "your-" in url:
+if (
+    not url or not key or
+    url.strip() == "your-supabase-url" or
+    key.strip() == "your-supabase-key"
+):

print("⚠️ Supabase credentials not configured. Some features will be limited.")
Copilot AI Dec 13, 2025

Print statement may execute during import.

    supabase = None
else:
    try:
        supabase: Client = create_client(url, key)
    except Exception as e:
        print(f"❌ Supabase connection failed: {e}")
Copilot AI Dec 13, 2025

Print statement may execute during import.

        supabase = None
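
Both "Print statement may execute during import" comments point at the same pattern. One way to address it, assuming this module can use the standard logging setup that app/main.py already configures, is to route the warnings through a module-level logger — a sketch:

import logging
import os

from supabase import create_client, Client

logger = logging.getLogger(__name__)

url: str = os.getenv("SUPABASE_URL", "")
key: str = os.getenv("SUPABASE_KEY", "")

supabase: Client | None = None
if not url or not key:
    # Placeholder-value checks omitted for brevity; see the suggestion above.
    # The warning goes through the logging configuration instead of stdout at import time.
    logger.warning("Supabase credentials not configured. Some features will be limited.")
else:
    try:
        supabase = create_client(url, key)
    except Exception as exc:
        logger.error("Supabase connection failed: %s", exc)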

# Define Router
router = APIRouter()

# Helper Functions
def generate_uuid():
    return str(uuid.uuid4())

def current_timestamp():
    return datetime.now(timezone.utc).isoformat()

# ========== USER ROUTES ==========
def check_supabase():
    if not supabase:
        raise HTTPException(status_code=503, detail="Database service unavailable. Please configure Supabase credentials.")
Copilot AI Dec 13, 2025

The error message in the check_supabase function has inconsistent capitalization. "Database service unavailable. Please configure Supabase credentials." should be "Database service unavailable. Please configure Supabase credentials" (remove period before secondary sentence or maintain consistent sentence structure).

Suggested change:
-        raise HTTPException(status_code=503, detail="Database service unavailable. Please configure Supabase credentials.")
+        raise HTTPException(status_code=503, detail="Database service unavailable. Please configure Supabase credentials")

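Separately from the wording nit: the same guard could be expressed as a FastAPI dependency so that routes declare it instead of calling check_supabase() by hand. A sketch, reusing this module's existing supabase client, router, and the UserCreate/generate_uuid/current_timestamp helpers:

from fastapi import Depends, HTTPException
from supabase import Client

def get_supabase() -> Client:
    # Raises 503 when the client was never created (missing or invalid credentials).
    if not supabase:
        raise HTTPException(status_code=503, detail="Database service unavailable")
    return supabase

@router.post("/users/")
async def create_user(user: UserCreate, db: Client = Depends(get_supabase)):
    # db is guaranteed to be a configured Supabase client here.
    user_id = generate_uuid()
    t = current_timestamp()
    ...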

@router.post("/users/")
async def create_user(user: UserCreate):
    check_supabase()
    user_id = generate_uuid()
    t = current_timestamp()
