Merged
49 changes: 49 additions & 0 deletions .github/workflows/assets.yml
@@ -0,0 +1,49 @@
name: assets
on:
  pull_request: {}

jobs:
  validate-assets:
    runs-on: ubuntu-latest
    timeout-minutes: 20
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node 20
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
          cache-dependency-path: paform/frontend/package-lock.json

      - name: Install frontend deps
        run: |
          cd paform/frontend
          npm ci
      - name: Generate reference GLBs
        run: |
          cd paform
          python3 scripts/generate_reference_glbs.py
Comment on lines +24 to +27
Contributor

🧹 Nitpick | 🔵 Trivial

Consider pinning the Python version.

The workflow uses python3 without specifying a version, which relies on the Ubuntu runner's default Python installation (currently 3.12). For reproducibility and to prevent unexpected breakage, consider using actions/setup-python to pin a specific Python version.

Add a Python setup step before line 24:

+      - name: Setup Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.12'
+
       - name: Generate reference GLBs
📝 Committable suggestion


Suggested change
-      - name: Generate reference GLBs
-        run: |
-          cd paform
-          python3 scripts/generate_reference_glbs.py
+      - name: Setup Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.12'
+      - name: Generate reference GLBs
+        run: |
+          cd paform
+          python3 scripts/generate_reference_glbs.py
🤖 Prompt for AI Agents
In .github/workflows/assets.yml around lines 24 to 27, the job runs python3
without pinning a version; add an actions/setup-python step immediately before
the "Generate reference GLBs" step to install and pin a specific Python version
(e.g., 3.12) using actions/setup-python@v5 and set python-version to the chosen
version so the script runs on a stable, reproducible interpreter; keep the
existing run step but ensure the runner uses the pinned Python (you can keep
invoking python3 or python after setup).

      - name: Pack GLBs (LOD0/LOD1) with KTX2
        run: |
          cd paform
          bash scripts/pack_models.sh
      - name: Validate GLBs (fail on warnings)
        run: |
          cd paform
          python3 scripts/glb_validate.py frontend/public/models/*.glb --fail-on-warning
      - name: Generate manifest.json
        run: |
          cd paform
          python3 scripts/gen_glb_manifest.py > frontend/public/models/manifest.json
      - name: Upload manifest
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: assets-artifacts
          path: paform/frontend/public/models/manifest.json
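The validate step runs scripts/glb_validate.py with --fail-on-warning; that script is not part of this diff, but the core of any GLB check is the fixed 12-byte binary header defined by the glTF 2.0 spec. A minimal stdlib sketch along those lines (hypothetical helper, not the project's validator):

```python
import struct

GLB_MAGIC = 0x46546C67  # ASCII "glTF" as a little-endian uint32

def check_glb_header(data: bytes) -> bool:
    """Return True when the buffer starts with a well-formed GLB header."""
    if len(data) < 12:
        return False
    magic, version, length = struct.unpack_from("<III", data, 0)
    # Header layout: magic "glTF", container version 2, total file length.
    return magic == GLB_MAGIC and version == 2 and length == len(data)

# A header-only buffer (no chunks) is enough to exercise the check.
minimal = struct.pack("<III", GLB_MAGIC, 2, 12)
```

A real validator (like the KhronosGroup glTF-Validator that such scripts typically wrap) goes much further, checking chunk alignment, JSON schema, and accessor bounds.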
101 changes: 101 additions & 0 deletions .github/workflows/perf-light.yml
@@ -0,0 +1,101 @@
name: perf-light

on:
  pull_request: {}

jobs:
  k6:
    runs-on: ubuntu-latest
    timeout-minutes: 15
    concurrency:
      group: perf-${{ github.ref }}
      cancel-in-progress: true
    steps:
      - uses: actions/checkout@v4

      - name: Install k6
        run: |
          sudo apt-get update
          sudo apt-get install -y k6
      - name: Start stack
        run: |
          cd paform
          docker compose --env-file .env.development -f docker-compose.dev.yml up -d --build
Comment on lines +23 to +24
Contributor

⚠️ Potential issue | 🔴 Critical

Remove the failing cd paform hops

After checkout the runner is already inside the repository root (~/work/paform/paform). There is no nested paform/ directory, so cd paform exits with cd: paform: No such file or directory, and because steps run with set -e, the workflow aborts before Docker Compose even runs. Drop the cd paform here (and in the later steps that repeat it) or point to the actual path that contains docker-compose.dev.yml.

🤖 Prompt for AI Agents
In .github/workflows/perf-light.yml around lines 23-24, the step runs `cd
paform` but the runner is already at the repository root
(`~/work/paform/paform`) so the extra `cd paform` fails; remove the `cd paform`
command (and any other duplicate `cd paform` occurrences in subsequent steps) or
update those steps to use the correct path that actually contains
docker-compose.dev.yml so the docker compose command runs from the existing
working directory.

      - name: Wait for API
        run: |
          cd paform
          for i in {1..60}; do curl -sf http://localhost:8000/healthcheck && break || sleep 2; done
      - name: Wait for Frontend
        run: |
          for i in {1..60}; do curl -sf http://localhost:3000/models/manifest.json && break || sleep 2; done
      - name: Seed backend for perf
        env:
          BASE_URL: http://localhost:8000
        run: |
          cd paform
          python - <<'PY'
          import json
          import os
          import urllib.error
          import urllib.request

          BASE_URL = os.environ.get("BASE_URL", "http://localhost:8000")

          def post(path: str, payload: dict) -> dict:
              req = urllib.request.Request(
                  f"{BASE_URL}{path}",
                  data=json.dumps(payload).encode("utf-8"),
                  headers={"Content-Type": "application/json"},
              )
              try:
                  with urllib.request.urlopen(req, timeout=10) as resp:
                      return json.loads(resp.read().decode("utf-8"))
              except urllib.error.HTTPError as exc:
                  detail = exc.read().decode("utf-8", "ignore")
                  raise SystemExit(f"Seed request failed ({exc.code}): {detail}")

          material = post(
              "/api/materials/",
              {"name": "Walnut", "texture_url": None, "cost_per_sq_ft": 12.5},
          )
          material_id = material.get("id")
          if not material_id:
              raise SystemExit("Material creation failed; missing id")

          post(
              "/api/modules/",
              {
                  "name": "Base600",
                  "width": 600.0,
                  "height": 720.0,
                  "depth": 580.0,
                  "base_price": 100.0,
                  "material_id": material_id,
              },
          )
          PY

      - name: Run k6 light profile
        env:
          BASE_URL: http://localhost:8000
          FRONTEND_BASE_URL: http://localhost:3000
        run: |
          cd paform
          k6 run --summary-export k6-summary.json tests/perf/k6-quote-cnc.js
      - name: Upload k6 summary
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: k6-summary
          path: paform/k6-summary.json

      - name: Shutdown stack
        if: always()
        run: |
          cd paform
          docker compose --env-file .env.development -f docker-compose.dev.yml down
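The two wait steps retry curl up to 60 times with a 2-second sleep between attempts. The same backoff logic, sketched in Python with the probe injected so it can be exercised without a live stack (names are illustrative, not from the PR):

```python
import time

def wait_for(probe, attempts: int = 60, delay: float = 2.0) -> bool:
    """Retry `probe` until it returns truthy; mirrors the shell loop
    `for i in {1..60}; do curl -sf URL && break || sleep 2; done`."""
    for _ in range(attempts):
        if probe():
            return True
        time.sleep(delay)
    return False

# Example: a probe that succeeds on its third call.
calls = {"n": 0}
def flaky_probe() -> bool:
    calls["n"] += 1
    return calls["n"] >= 3
```

Worst case this budget is roughly two minutes per service, which fits comfortably inside the job's 15-minute timeout.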
31 changes: 31 additions & 0 deletions backend/alembic/versions/cfe1d8e4e001_add_sync_events.py
@@ -0,0 +1,31 @@
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = "cfe1d8e4e001_add_sync_events"
down_revision = "0001"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        "sync_events",
        sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
        sa.Column("event_id", sa.String(length=255), nullable=True),
        sa.Column("source", sa.String(length=100), nullable=False),
        sa.Column("body_sha256", sa.String(length=64), nullable=False),
        sa.Column(
            "received_at",
            sa.TIMESTAMP(timezone=True),
            server_default=sa.text("CURRENT_TIMESTAMP"),
            nullable=False,
        ),
        sa.UniqueConstraint("source", "event_id", name="uq_sync_events_src_event"),
        sa.UniqueConstraint("source", "body_sha256", name="uq_sync_events_src_body"),
        sa.UniqueConstraint("body_sha256", name="uq_sync_events_body"),
    )


def downgrade() -> None:
    op.drop_table("sync_events")
13 changes: 4 additions & 9 deletions backend/api/db.py
@@ -1,5 +1,4 @@
"""Database configuration and session management using SQLAlchemy 2.x."""

from __future__ import annotations

import os
@@ -21,7 +20,6 @@ def _normalize_postgres_url(url: str) -> str:
    Accepts legacy prefixes and ensures the SQLAlchemy URL includes the
    ``+psycopg`` dialect when using PostgreSQL.
    """

    if url.startswith("postgres://"):
        return url.replace("postgres://", "postgresql+psycopg://", 1)
    if url.startswith("postgresql://") and "+psycopg" not in url:
@@ -31,17 +29,14 @@ def _normalize_postgres_url(url: str) -> str:

def get_engine_url() -> str:
    """Resolve the database URL from env or settings with sane defaults."""

    settings = Settings()
    url = os.getenv("DATABASE_URL", settings.database_url)
    if url.startswith("sqlite"):
        # SQLite works as-is
        return url
    return _normalize_postgres_url(url)


ENGINE_URL = get_engine_url()

engine = create_engine(ENGINE_URL, pool_pre_ping=True, future=True)

SessionLocal = sessionmaker(
@@ -51,14 +46,14 @@ def get_engine_url() -> str:
    future=True,
)

# Ensure models are registered on Base.metadata for create_all/drop_all
import api.models  # noqa: E402,F401


def get_db() -> Generator:
    """FastAPI dependency to provide a session per request."""

    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


    db.close()
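For reference, the two visible branches of `_normalize_postgres_url` rewrite legacy prefixes as below; the fall-through return for URLs that already carry the dialect sits outside this hunk and is an assumption here:

```python
def normalize_postgres_url(url: str) -> str:
    """Standalone mirror of _normalize_postgres_url (fall-through assumed)."""
    if url.startswith("postgres://"):
        # Legacy Heroku-style scheme -> SQLAlchemy URL with psycopg dialect.
        return url.replace("postgres://", "postgresql+psycopg://", 1)
    if url.startswith("postgresql://") and "+psycopg" not in url:
        return url.replace("postgresql://", "postgresql+psycopg://", 1)
    return url  # already normalized, or a non-Postgres URL

# e.g. "postgres://u:p@db/app" becomes "postgresql+psycopg://u:p@db/app"
```

The count argument of 1 in `str.replace` matters: only the scheme prefix is rewritten even if the substring reappears later in the URL.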
39 changes: 36 additions & 3 deletions backend/api/main.py
Original file line number Diff line number Diff line change
@@ -3,7 +3,9 @@
import logging
from typing import Dict

from fastapi import FastAPI
from fastapi import FastAPI, Response, Request
from fastapi.exceptions import RequestValidationError
from fastapi.responses import JSONResponse
from starlette.middleware.cors import CORSMiddleware

from api.config import Settings
Expand All @@ -13,6 +15,7 @@
from api.routes_quote import router as quote_router
from api.routes_cnc import router as cnc_router
from api.routes_sync import router as sync_router
from prometheus_client import CONTENT_TYPE_LATEST, generate_latest

P1: Declare prometheus_client dependency for new /metrics endpoint

The API now imports CONTENT_TYPE_LATEST and generate_latest from prometheus_client, but the backend's pyproject.toml does not list prometheus_client as a requirement. Deployments without the package installed will crash at import time with ModuleNotFoundError. Add the dependency to the backend project configuration.


# Configure logging
logging.basicConfig(
@@ -30,6 +33,15 @@
    version="0.1.0",
)

from api.db import Base, engine
from api.config import Settings as _Settings

# Ensure tables exist in SQLite (dev/test); migrations still handle Postgres
if _Settings().database_url.startswith("sqlite"):
    @app.on_event("startup")
    async def _ensure_sqlite_tables() -> None:
        Base.metadata.create_all(bind=engine)

# Add CORS middleware
app.add_middleware(
    CORSMiddleware,
@@ -48,6 +60,22 @@
app.include_router(sync_router)


@app.exception_handler(RequestValidationError)
async def handle_validation_error(request: Request, exc: RequestValidationError) -> JSONResponse:
    # Unified error envelope for malformed JSON / validation errors
    return JSONResponse(
        status_code=422,
        content={
            "ok": False,
            "error": {
                "code": "BAD_REQUEST",
                "message": "invalid request",
                "details": exc.errors(),
            },
        },
    )


@app.get("/")
async def root() -> Dict[str, str]:
    """Root endpoint of the API.
@@ -62,8 +90,7 @@ async def root() -> Dict[str, str]:

@app.get("/healthcheck")
async def healthcheck() -> Dict[str, str]:
    """Health check endpoint.

    This endpoint can be used to verify that the API is running and responsive.

    Returns
@@ -72,3 +99,9 @@ async def healthcheck() -> Dict[str, str]:
        A dictionary indicating the health status of the API.
    """
    return {"status": "healthy"}


@app.get("/metrics")
async def metrics() -> Response:
    # Expose Prometheus metrics including default process/python collectors
    return Response(generate_latest(), media_type=CONTENT_TYPE_LATEST)
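The validation handler above always returns the same envelope shape regardless of which field failed. A dependency-free sketch of that shape (the helper name is ours, not the PR's):

```python
def error_envelope(details: list) -> dict:
    """Build the unified 422 body produced by handle_validation_error."""
    return {
        "ok": False,
        "error": {
            "code": "BAD_REQUEST",
            "message": "invalid request",
            "details": details,  # FastAPI's exc.errors() payload goes here
        },
    }

envelope = error_envelope([{"loc": ["body", "name"], "msg": "field required"}])
```

Keeping the top-level `ok`/`error` keys stable means frontend callers can branch on one field instead of parsing FastAPI's raw validation output.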
21 changes: 21 additions & 0 deletions backend/api/metrics.py
@@ -0,0 +1,21 @@
from __future__ import annotations
from prometheus_client import Counter

# Hygraph sync counters
sync_success_total = Counter(
"sync_success_total",
"Successful Hygraph sync operations",
labelnames=("type",),
)

sync_failure_total = Counter(
"sync_failure_total",
"Failed Hygraph sync operations",
labelnames=("type",),
)

sync_records_upserted_total = Counter(
"sync_records_upserted_total",
"Records upserted during Hygraph sync",
labelnames=("type",),
)
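Call sites are outside this diff, but the intended pattern for labelled Prometheus counters is `sync_success_total.labels(type="material").inc()`. A stdlib stand-in showing the labelled-counter semantics (illustrative only, not how prometheus_client is implemented):

```python
from collections import Counter

class LabeledCounter:
    """Tiny stand-in for prometheus_client.Counter with label support."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.values: Counter = Counter()  # maps label tuples to counts

    def labels(self, **labels):
        key = tuple(sorted(labels.items()))
        parent = self

        class _Child:
            def inc(self, amount: float = 1.0) -> None:
                parent.values[key] += amount

        return _Child()

sync_success = LabeledCounter("sync_success_total")
sync_success.labels(type="material").inc()
sync_success.labels(type="material").inc()
sync_success.labels(type="module").inc()
```

Each distinct label value becomes its own time series, which is why low-cardinality labels like a sync `type` are a good fit here.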
20 changes: 18 additions & 2 deletions backend/api/models.py
@@ -3,7 +3,6 @@
H3: Hardened data models for materials and modules supporting assembly logic,
placement constraints, connection points, and external IDs for BIM/Revit.
"""

from __future__ import annotations

import uuid
@@ -18,13 +17,13 @@
    Index,
    String,
    func,
    UniqueConstraint,
)
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.orm import Mapped, mapped_column, relationship

from api.db import Base


# Cross-dialect JSON -> JSONB for PostgreSQL, JSON otherwise
JSON_COMPAT = JSON().with_variant(JSONB, "postgresql")

@@ -115,3 +114,20 @@ class Module(Base):
Index("ix_modules_material_name", Module.material_id, Module.name)


# Sync events for webhook deduplication
class SyncEvent(Base):
    __tablename__ = "sync_events"

    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
    event_id: Mapped[str | None] = mapped_column(String(255), nullable=True)
    source: Mapped[str] = mapped_column(String(100), nullable=False)
    body_sha256: Mapped[str] = mapped_column(String(64), nullable=False)
    received_at: Mapped[datetime] = mapped_column(
        TIMESTAMP(timezone=True), server_default=func.now(), nullable=False
    )

    __table_args__ = (
        UniqueConstraint("source", "event_id", name="uq_sync_events_src_event"),
        UniqueConstraint("source", "body_sha256", name="uq_sync_events_src_body"),
        UniqueConstraint("body_sha256", name="uq_sync_events_body"),
    )
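The trio of unique constraints is what makes webhook ingestion idempotent: a replayed delivery (same source and event_id) or an identical body (same sha256) hits a constraint instead of creating a second row. A sqlite3 sketch of the handler-side pattern (the table mirrors the model; the handler code itself is not in this diff):

```python
from __future__ import annotations

import hashlib
import sqlite3

# In-memory stand-in for sync_events and its three unique constraints.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE sync_events (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        event_id TEXT,
        source TEXT NOT NULL,
        body_sha256 TEXT NOT NULL,
        UNIQUE (source, event_id),
        UNIQUE (source, body_sha256),
        UNIQUE (body_sha256)
    )"""
)

def record_event(source: str, event_id: str | None, body: bytes) -> bool:
    """Insert the event; return False when any constraint flags a replay."""
    digest = hashlib.sha256(body).hexdigest()
    try:
        with conn:  # commits on success, rolls back on error
            conn.execute(
                "INSERT INTO sync_events (event_id, source, body_sha256) "
                "VALUES (?, ?, ?)",
                (event_id, source, digest),
            )
        return True
    except sqlite3.IntegrityError:
        return False

first = record_event("hygraph", "evt-1", b'{"model": "Material"}')
replay = record_event("hygraph", "evt-1", b'{"model": "Material"}')
```

Note the standalone `UNIQUE (body_sha256)` is the strictest of the three: even a different event_id from the same source is rejected if the payload bytes are identical.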