Build once, then let production fix itself. Samba listens to error signals (e.g., Sentry alerts), finds the faulty line, proposes a minimal fix with GPT‑5, commits on a short‑lived branch, and opens a PR—end to end.
– Track: Autonomous Reactive Agents
– Event: AI Coding Agents Hackathon @ YC Office
– Test App: Next.js demo repo yc-hackathon-social
- In production, users hit an error. Sentry captures it and sends an `event_alert` webhook to Samba. (Sentry setup is external to this repo and required.)
- Samba parses the stack + metadata, resolves `owner/repo/branch`, locates the implicated file/line, and fetches it from GitHub.
- GPT‑5 proposes the smallest safe edit; Samba commits it on a short‑lived branch and opens a PR. No auto‑deploy.
- You review and merge the PR. The fix ships via your normal CI/CD.
We’ve used tools like Cursor BugBot and CodeRabbit PR Reviewer; they’re great, but none provided a seamless signal-to-fix loop that runs without a developer in the middle. Our insight: the next step is a composable, event-driven agent workflow that closes the loop from error to PR with minimal human intervention.
Core is a Mastra workflow with a fixed, auditable orchestration path from signal to PR; agent/tool executions are LLM-driven and may vary in output:
- Parse input
- Resolve repo and base branch
- Locate implicated file
- Read file content
- Propose fix (GPT‑5)
- Commit fix to feature branch
- Open PR
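The seven steps above form a fixed, linear pipeline. A minimal sketch of that orchestration shape (illustrative TypeScript only; the step bodies are stubs, and this is not Mastra's actual API):

```typescript
// Illustrative sketch of the fixed orchestration path; the real workflow
// lives in src/mastra/workflows/fix-from-stacktrace-workflow-v2.ts.
type Ctx = Record<string, unknown>;
type Step = (ctx: Ctx) => Promise<Ctx>;

// Hypothetical step stubs; in Samba each wraps a GitHub/LLM tool call.
const steps: Array<[string, Step]> = [
  ["parse-input", async (ctx) => ({ ...ctx, parsed: true })],
  ["resolve-repo", async (ctx) => ({ ...ctx, repo: "org/service" })],
  ["locate-file", async (ctx) => ({ ...ctx, path: "src/app/page.tsx" })],
  ["read-file", async (ctx) => ({ ...ctx, content: "..." })],
  ["propose-fix", async (ctx) => ({ ...ctx, fixed: "..." })],
  ["commit-fix", async (ctx) => ({ ...ctx, branch: "fix/123" })],
  ["open-pr", async (ctx) => ({ ...ctx, prUrl: "https://github.com/..." })],
];

// Run the steps in order, threading a shared context through each one and
// recording which step ran last so a failed run is easy to replay.
export async function runWorkflow(input: Ctx): Promise<Ctx> {
  let ctx = input;
  for (const [name, step] of steps) {
    ctx = { ...(await step(ctx)), lastStep: name };
  }
  return ctx;
}
```

Keeping the orchestration fixed while only the step *outputs* vary is what makes the path auditable.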
Key file: `src/mastra/workflows/fix-from-stacktrace-workflow-v2.ts`
High-level steps implemented there:
- parse-input: Extracts repo URL, optional ref, explicit file path, and stack frames
- resolve-repo: Hits GitHub API to resolve owner/repo and default branch (or provided ref)
- locate-file: Tries direct path; otherwise GitHub Code Search by filename
- read-file: Reads decoded file via GitHub Contents API
- propose-fix: Calls the `propose-fix-gpt5` tool to generate the smallest safe change
- commit-fix: Creates a short-lived branch and commits the updated content
- open-pr: Opens a PR against the default branch
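The parse-input step has to pull a file path and line number out of text like `TypeError at src/app/page.tsx:42`. A hypothetical helper showing that kind of extraction (names and regex are illustrative, not the actual implementation):

```typescript
// Hypothetical sketch of the extraction parse-input performs: pull a
// "<path>:<line>" location out of an error message or stack frame.
interface Frame {
  file: string;
  line: number;
}

export function parseFrame(message: string): Frame | null {
  // Matches "... at <path>:<line>", e.g. "TypeError at src/app/page.tsx:42"
  const m = /\bat\s+([\w./-]+):(\d+)/.exec(message);
  return m ? { file: m[1], line: Number(m[2]) } : null;
}
```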
All agents use `openai('gpt-5-nano')` and `LibSQLStore` memory.
- Discovery (`src/mastra/agents/discovery-agent.ts`): parses the stack, resolves the repo, locates the file; tools: parse stack, repo by URL, code search, get file/decoded
- Execution (`src/mastra/agents/execution-agent.ts`): generates the minimal fix and commits/updates the file on a feature branch; tools: commit-or-update file, get decoded file
- Finalize (`src/mastra/agents/finalize-agent.ts`): opens the PR to the default branch; tool: create PR
- GitHub (`src/mastra/agents/github-agent.ts`): end-to-end assistant (parse, search, patch, PR); rich GitHub + parsing toolset
- `propose-fix-gpt5` (`src/mastra/tools/propose-fix-gpt5-tool.ts`)
  - Calls OpenAI GPT‑5 to propose a minimal edit; outputs the full updated file content
  - Honors line context and language hints, and un-fences markdown if the model returns it
- GitHub helpers: search, contents (decoded), commit/update, branch/PR operations
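Un-fencing matters because the model sometimes wraps the whole file in a markdown code fence, which must not be committed verbatim. A minimal sketch of that cleanup (a hypothetical helper, not the tool's exact code):

```typescript
// Sketch of un-fencing: if the model returns the file wrapped in a
// markdown code fence (``` plus an optional language tag), strip the
// fence and keep only the raw file content; otherwise pass through.
export function unfence(text: string): string {
  const m = /^```[^\n]*\n([\s\S]*?)\n?```\s*$/.exec(text.trim());
  return m ? m[1] : text;
}
```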
Samba exposes a webhook: `POST /webhook`
- Verifies the HMAC signature if `WEBHOOK_SECRET` is set
- Detects Sentry `event_alert` payloads and extracts stack/file hints
- Maps the Sentry project → repo via `SENTRY_PROJECT_REPO_MAP` (or falls back to `SENTRY_DEFAULT_REPO_URL`)
- Builds a prompt and triggers `fix-from-stacktrace-v2`
See `src/mastra/index.ts` for route wiring. On successful detection, the workflow starts either natively or via the manual runner `startFixFromStacktraceV2`.
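The HMAC check can be sketched as follows, assuming the signature is an HMAC‑SHA256 hex digest of the raw request body (the header name and exact scheme depend on your Sentry webhook configuration):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Sketch of the signature check the /webhook route performs when
// WEBHOOK_SECRET is set: recompute HMAC-SHA256 over the raw body and
// compare it to the received hex digest in constant time.
export function verifySignature(rawBody: string, signature: string, secret: string): boolean {
  const expected = createHmac("sha256", secret).update(rawBody, "utf8").digest("hex");
  const a = Buffer.from(expected, "utf8");
  const b = Buffer.from(signature, "utf8");
  // timingSafeEqual throws on length mismatch, so check length first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Verify against the *raw* body bytes, not a re-serialized JSON object, or the digests will not match.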
- Flow: users hit an error → Sentry posts webhook → Samba turns signal into a PR (no auto‑deploy).
- Data received: stack, title/message, breadcrumbs, exception values, project metadata.
- Checklist: (1) Enable Sentry in prod, (2) Add a Webhook action pointing to `https://<your-samba-host>/webhook`, (3) Set a matching `WEBHOOK_SECRET`, (4) Configure `SENTRY_PROJECT_REPO_MAP` or `SENTRY_DEFAULT_REPO_URL`.
- Boundaries: Sentry setup lives in your app; Samba consumes webhooks and opens PRs; tokens are never logged.
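The project → repo mapping with its fallback can be sketched like this (a hypothetical helper; entry shapes follow the `SENTRY_PROJECT_REPO_MAP` example elsewhere in this README):

```typescript
// Sketch of the Sentry project → repo resolution. Map entries may be
// structured ({ owner, repo, branch }) or a plain repo URL; unknown
// projects fall back to the "default" entry or SENTRY_DEFAULT_REPO_URL.
interface RepoRef {
  owner: string;
  repo: string;
  branch?: string;
}
type MapEntry = RepoRef | string;

export function resolveRepo(
  project: string,
  map: Record<string, MapEntry>,
  defaultUrl?: string,
): RepoRef | null {
  const entry = map[project] ?? map["default"] ?? defaultUrl;
  if (!entry) return null;
  if (typeof entry !== "string") return entry;
  // Parse "https://github.com/<owner>/<repo>" style URLs.
  const m = /github\.com\/([^/]+)\/([^/]+?)(?:\.git)?\/?$/.exec(entry);
  return m ? { owner: m[1], repo: m[2] } : null;
}
```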
Minimal demo: Next.js repo yc-hackathon-social with an intentional simple bug to showcase the automated fix loop.
Prerequisites: Node ≥ 20.9, GitHub token (repo scope), OpenAI API key (GPT‑5 access)
Run:
```bash
npm install
npm run dev
# or
npm run start
```
The webhook is served at `/webhook`.
Env vars:
- `OPENAI_API_KEY` or `OPENAI_API_KEY_GPT5` (model via `GPT5_MODEL`, default `gpt-5`)
- `GITHUB_TOKEN` or `GITHUB_PERSONAL_ACCESS_TOKEN`
- `WEBHOOK_SECRET` (optional)
- `SENTRY_PROJECT_REPO_MAP` (JSON) or `SENTRY_DEFAULT_REPO_URL`
  - Example:
    ```json
    { "my-sentry-project": { "owner": "org", "repo": "service", "branch": "main" }, "default": "https://github.com/org/service" }
    ```
- Export env vars; start Samba:
  ```bash
  npm run dev
  ```
- Trigger:
  ```bash
  curl -X POST "http://localhost:PORT/webhook" \
    -H 'Content-Type: application/json' \
    -H 'Sentry-Hook-Resource: event_alert' \
    -d '{
      "resource": "event_alert",
      "data": { "event": { "message": "TypeError at src/app/page.tsx:42" } },
      "project": "demo"
    }'
  ```
  Replace `PORT` accordingly. If `WEBHOOK_SECRET` is set, include the matching signature header.
- Webhook arrives → `parse-input` derives repo/path/line from the Sentry event and optional mapping
- GitHub API fetches the file at the inferred path and ref
- GPT‑5 proposes a minimal safe edit
- GitHub Contents API commits the change on a new `fix/<timestamp>` branch
- A PR is opened to the default branch with context
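The `fix/<timestamp>` naming can be sketched like this (a hypothetical helper, not the exact code; timestamps keep branches unique, sortable, and idempotent to recreate):

```typescript
// Sketch of short-lived branch naming: a compact UTC timestamp under a
// fix/ prefix, e.g. "fix/20240511T093042".
export function fixBranchName(now: Date = new Date()): string {
  // "2024-05-11T09:30:42.000Z" -> "20240511T093042"
  const stamp = now.toISOString().replace(/[-:]/g, "").slice(0, 15);
  return `fix/${stamp}`;
}
```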
- Minimal diffs: bias towards the smallest viable fix; preserve formatting and imports
- Guard rails: never log tokens; prefer additive changes and safe checks over deletions
- Deterministic steps: every API call is explicit and traced; fallback paths are conservative
- Fixed orchestration; LLM/tool outputs and external APIs can vary.
- Mitigations: smallest‑safe‑change prompts, strict API checks, conservative fallbacks, idempotent branches, no auto‑merge/deploy.
- Observability: `runId` + step logs (no secrets) for replay/debugging.
- Repro: pin `GPT5_MODEL`; identical inputs may still vary slightly.
- Tool call limits can apply depending on model/provider quotas
- Multi-file fixes are out of scope for the MVP; planned as iterative PRs
- Broader signal sources (CloudWatch, Datadog) can be added via additional webhook adapters
- Built with Mastra for agent orchestration and tooling
- GitHub and OpenAI APIs for code operations and LLM fixes
- Demo app: Next.js repo yc-hackathon-social
ISC