
Conversation

@frankbria
Owner

@frankbria frankbria commented Jan 14, 2026

Summary

  • Adds complete GitHub PR integration (Sprint 11) enabling PR creation, listing, merging, and closing via GitHub API
  • New /api/projects/{id}/prs/ endpoints with full CRUD operations
  • WebSocket broadcasts for real-time PR status updates (pr_created, pr_merged, pr_closed)

Changes

New Files (6)

  • codeframe/git/github_integration.py - Async GitHub API client using httpx
  • codeframe/persistence/repositories/pr_repository.py - Database operations for PRs
  • codeframe/ui/routers/prs.py - FastAPI endpoints for PR management
  • tests/unit/test_github_integration.py - 17 unit tests
  • tests/unit/test_pr_repository.py - 14 unit tests
  • tests/unit/test_pr_router.py - 14 unit tests

Modified Files (7)

  • Schema: Added pull_requests table with indexes
  • Config: Added GITHUB_TOKEN and GITHUB_REPO settings
  • Server: Registered PR router
  • WebSocket: Added PR event broadcasts

API Endpoints

Endpoint Method Description
/api/projects/{id}/prs POST Create PR via GitHub API
/api/projects/{id}/prs GET List PRs (optional status filter)
/api/projects/{id}/prs/{number} GET Get PR details
/api/projects/{id}/prs/{number}/merge POST Merge PR
/api/projects/{id}/prs/{number}/close POST Close PR
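
For illustration, a minimal client-side sketch of the create and merge calls. The request fields (branch, title, body, base; method) and response fields (pr_number, pr_url, status) follow the review discussion below; the base URL and bearer-token header are assumptions about the local setup, not part of this PR.

# Hypothetical usage sketch; adjust auth and base URL to the actual deployment.
import asyncio
import httpx

BASE_URL = "http://localhost:8000"   # assumption: local dev server
TOKEN = "..."                        # assumption: platform auth token

async def create_and_merge_pr(project_id: int) -> None:
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with httpx.AsyncClient(base_url=BASE_URL, headers=headers) as client:
        # Create a PR from a feature branch into main
        resp = await client.post(
            f"/api/projects/{project_id}/prs",
            json={
                "branch": "feature/github-pr-integration",
                "title": "Add GitHub PR integration",
                "body": "Sprint 11 implementation",
                "base": "main",
            },
        )
        resp.raise_for_status()
        pr_number = resp.json()["pr_number"]

        # Squash-merge the PR that was just created
        resp = await client.post(
            f"/api/projects/{project_id}/prs/{pr_number}/merge",
            json={"method": "squash"},
        )
        resp.raise_for_status()

asyncio.run(create_and_merge_pr(1))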

Test plan

  • 45 new unit tests added and passing
  • 459+ existing tests verified with no regressions
  • Ruff linting passes
  • Manual testing with real GitHub token (optional)

Refs: #272

Summary by CodeRabbit

  • New Features

    • GitHub integration to create, list, retrieve, merge, and close pull requests from the platform.
    • API endpoints for project-level PR management and added server routing.
    • Real-time PR event notifications via WebSocket (created, merged, closed).
    • Persistent PR tracking and status management in the database.
    • New configuration options to enable GitHub integration (token and repo).
  • Tests

    • Extensive unit tests covering the GitHub client, PR repository, and API router.

✏️ Tip: You can customize this high-level summary in your review settings.

Sprint 11 implementation - adds complete GitHub PR integration:

Database:
- Added pull_requests table with status tracking (draft/open/merged/closed)
- Added indexes for project, issue, and branch lookups

Backend:
- PRRepository: Database operations for PR CRUD
- GitHubIntegration: Async httpx client for GitHub REST API
- PR Router: POST/GET/MERGE/CLOSE endpoints at /api/projects/{id}/prs
- WebSocket broadcasts for pr_created, pr_merged, pr_closed events

Configuration:
- GITHUB_TOKEN and GITHUB_REPO environment variables
- Updated .env.example with GitHub integration docs

Tests (45 new tests):
- PRRepository: 14 unit tests for database operations
- GitHubIntegration: 17 unit tests with mocked API
- PR Router: 14 API endpoint tests

This enables GitHub-based PR workflows when GITHUB_TOKEN is configured,
while maintaining backward compatibility for local-only operations.
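
The WebSocket broadcasts listed above (pr_created, pr_merged, pr_closed) can be consumed from a project-scoped socket. A hedged client sketch follows; the endpoint path and any payload fields beyond the event type and pr_number are assumptions, not part of this PR.

# Illustrative listener; the ws path and payload shape are assumed.
import asyncio
import json
import websockets

async def watch_pr_events(project_id: int) -> None:
    url = f"ws://localhost:8000/ws/projects/{project_id}"  # hypothetical path
    async with websockets.connect(url) as ws:
        async for raw in ws:
            event = json.loads(raw)
            if event.get("type") in {"pr_created", "pr_merged", "pr_closed"}:
                print(event["type"], event.get("pr_number"))

asyncio.run(watch_pr_events(1))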

Refs: #272
@coderabbitai
Contributor

coderabbitai bot commented Jan 14, 2026

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Adds end-to-end GitHub pull request support: config vars, an async GitHub client, DB schema and PR repository, API router endpoints, WebSocket broadcast helpers, and unit tests covering the PR lifecycle.

Changes

Cohort / File(s) Summary
Configuration
.env.example, codeframe/core/config.py
Added GITHUB_TOKEN and GITHUB_REPO examples and optional github_token / github_repo fields on GlobalConfig.
GitHub API Integration
codeframe/git/github_integration.py
New async GitHubIntegration client with typed PR operations, GitHubAPIError, PRDetails, MergeResult, centralized request handling and logging.
Database Schema & Repos
codeframe/persistence/schema_manager.py, codeframe/persistence/repositories/pr_repository.py, codeframe/persistence/repositories/__init__.py, codeframe/persistence/database.py
Added pull_requests table and indexes; new PRRepository with CRUD and domain operations; wired PRRepository into Database and exports.
API Router & Server
codeframe/ui/routers/prs.py, codeframe/ui/server.py
New PR router exposing create/list/get/merge/close endpoints; mounted in server; includes access/config validation, DB persistence, GitHub sync, and broadcast calls.
WebSocket Broadcasting
codeframe/ui/websocket_broadcasts.py
Added broadcast_pr_created, broadcast_pr_merged, and broadcast_pr_closed helpers with message payloads and error logging.
Tests
tests/unit/test_github_integration.py, tests/unit/test_pr_repository.py, tests/unit/test_pr_router.py
Comprehensive unit tests for GitHubIntegration, PRRepository, and PR router behavior including success and error scenarios.
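
A minimal usage sketch of the new client, based on the method names, dataclasses, and flow described in this walkthrough; keyword-argument names and any return fields beyond number, merged, and sha are assumptions.

# Sketch only; see codeframe/git/github_integration.py for the real API.
import asyncio
from codeframe.git.github_integration import GitHubAPIError, GitHubIntegration

async def open_and_merge() -> None:
    gh = GitHubIntegration(token="ghp_...", repo="owner/repo")
    try:
        pr = await gh.create_pull_request(
            branch="feature/github-pr-integration",
            title="Add GitHub PR integration",
            body="Sprint 11 implementation",
            base="main",
        )
        result = await gh.merge_pull_request(pr.number, method="squash")
        if not result.merged:
            print(f"Merge refused for PR #{pr.number}")
    except GitHubAPIError as exc:
        print(f"GitHub API error: {exc}")
    finally:
        await gh.close()  # release the underlying httpx client

asyncio.run(open_and_merge())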

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant Router as PR Router
    participant GitHub as GitHubIntegration
    participant DB as Database
    participant Broadcast as WebSocket Manager

    User->>Router: POST /api/projects/{id}/prs (create PR)
    Router->>Router: Validate access & GitHub config
    Router->>GitHub: create_pull_request(branch, title, body, base)
    GitHub-->>GitHub: HTTP POST to GitHub API
    GitHub-->>Router: PRDetails
    Router->>DB: create_pr(...) then update_pr_github_data(...)
    DB-->>Router: PR ID / success
    Router->>Broadcast: broadcast_pr_created(...)
    Router-->>User: 201 CreatePRResponse
sequenceDiagram
    participant User
    participant Router as PR Router
    participant GitHub as GitHubIntegration
    participant DB as Database
    participant Broadcast as WebSocket Manager

    User->>Router: POST /api/projects/{id}/prs/{number}/merge
    Router->>Router: Validate access & config
    Router->>GitHub: merge_pull_request(pr_number, method)
    GitHub-->>GitHub: HTTP PUT to GitHub API
    GitHub-->>Router: MergeResult
    Router->>DB: update_pr_status(merged, merge_commit_sha, merged_at)
    DB-->>Router: Success
    Router->>Broadcast: broadcast_pr_merged(...)
    Router-->>User: 200 MergePRResponse
sequenceDiagram
    participant User
    participant Router as PR Router
    participant DB as Database

    User->>Router: GET /api/projects/{id}/prs?status=...
    Router->>DB: list_prs(project_id, status)
    DB-->>Router: List[PR records]
    Router-->>User: 200 PRListResponse

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes


Poem

🐰 I dug a burrow in CI's light,

I stitched branches, signed them right,
I hopped through tokens, found the repo,
Merged with a thump and sent a cheero,
Websockets cheered as PRs took flight. 🥕✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The title accurately summarizes the main change: adding GitHub PR integration for pull request management, which is the primary focus of this changeset.
Docstring Coverage ✅ Passed Docstring coverage is 98.91% which is sufficient. The required threshold is 80.00%.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


🧹 Recent nitpick comments
codeframe/ui/routers/prs.py (2)

85-89: Add type annotation for prs field.

The prs field lacks a type annotation, which weakens OpenAPI schema generation and type safety. Consider using a more specific type.

♻️ Suggested improvement
+from typing import Any, Literal, Optional
+
 class PRListResponse(BaseModel):
     """Response containing list of PRs."""

-    prs: list
+    prs: list[dict[str, Any]]
     total: int

Alternatively, define a PRSummary model for even better type safety and documentation.


254-279: Consider adding a response model for OpenAPI documentation.

The endpoint returns the PR dict directly without a response_model, which means the OpenAPI schema won't fully document the response structure. Consider defining a PRDetailResponse model for consistency with other endpoints.


📜 Recent review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f594e04 and d93547c.

📒 Files selected for processing (1)
  • codeframe/ui/routers/prs.py
🧰 Additional context used
📓 Path-based instructions (1)
codeframe/**/*.py

📄 CodeRabbit inference engine (CLAUDE.md)

codeframe/**/*.py: Use Python 3.11+ for backend development
Use FastAPI framework for backend API development
Use SQLite with async support (aiosqlite) for database operations
Use tiktoken for token counting
Run linting with ruff check .
Import and use domain-specific repositories from codeframe/persistence/repositories/ for data access instead of direct database calls

Files:

  • codeframe/ui/routers/prs.py
🧬 Code graph analysis (1)
codeframe/ui/routers/prs.py (7)
codeframe/core/config.py (3)
  • GlobalConfig (84-195)
  • load_environment (198-211)
  • get (294-306)
codeframe/git/github_integration.py (5)
  • GitHubIntegration (56-329)
  • GitHubAPIError (17-29)
  • create_pull_request (186-221)
  • close (327-329)
  • get_pull_request (223-242)
codeframe/ui/dependencies.py (1)
  • get_db (14-29)
codeframe/ui/websocket_broadcasts.py (3)
  • broadcast_pr_created (1021-1057)
  • broadcast_pr_merged (1060-1087)
  • broadcast_pr_closed (1090-1119)
codeframe/auth/dependencies.py (1)
  • get_current_user (17-121)
codeframe/cli/api_client.py (1)
  • post (230-240)
codeframe/persistence/repositories/pr_repository.py (5)
  • create_pr (19-59)
  • update_pr_github_data (128-160)
  • list_prs (95-126)
  • get_pr_by_number (76-93)
  • update_pr_status (162-219)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
  • GitHub Check: Frontend Unit Tests
  • GitHub Check: Backend Unit Tests
  • GitHub Check: opencode-review
  • GitHub Check: E2E Smoke Tests (Chromium)
  • GitHub Check: Macroscope - Correctness Check
🔇 Additional comments (6)
codeframe/ui/routers/prs.py (6)

1-33: LGTM!

Imports are well-organized, and the router follows RESTful conventions with proper tagging for OpenAPI documentation.


35-38: LGTM!

The helper appropriately reloads environment to ensure fresh configuration. Creating a new GlobalConfig on each call is acceptable for the API call frequency expected here.


231-251: LGTM!

Clean implementation that properly uses the PR repository for data access and supports optional status filtering.


282-357: LGTM!

The merge endpoint correctly implements conditional DB updates and broadcasts only when result.merged is True. The try/except/finally pattern ensures proper cleanup of the GitHub client. This addresses previous feedback about unconditional status updates.


360-422: LGTM!

The close endpoint correctly implements conditional DB updates and broadcasts only when the GitHub API returns closed=True. This properly addresses previous feedback about unconditional status updates. Good consistency with the merge endpoint pattern.


114-134: Code follows repository pattern correctly—no changes needed.

The Database class acts as a facade that delegates get_project() and user_has_project_access() to the ProjectRepository via self.projects, which satisfies the coding guideline to use domain-specific repositories for data access. The async function with synchronous database calls is the correct pattern for FastAPI endpoint handlers, where database operations are fast and non-blocking.

✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.


Comment @coderabbitai help to get the list of available commands and usage tips.

@macroscopeapp
Copy link

macroscopeapp bot commented Jan 14, 2026

Add GitHub pull request management to API and expose routes under /api/projects/{project_id}/prs

Introduce GitHubIntegration with structured models and error handling, persist PRs in a new pull_requests table with indexes, and add FastAPI endpoints for create, list, get, merge, and close using validated config and websocket broadcasts. Core config now reads GITHUB_TOKEN and GITHUB_REPO.

📍Where to Start

Start with the create_pull_request handler in prs.py, then review GitHubIntegration._make_request in github_integration.py and SchemaManager.create_schema in schema_manager.py.


Macroscope summarized d93547c.


# Handle empty responses (204 No Content)
if response.status_code == 204:
    return None


Response handling is brittle: 204 returns None, JSON decode isn’t guarded, and callers assume types. Suggest centralising parsing to return {} on 204, guard response.json(), validate expected types, and raise GitHubAPIError on errors.

Suggested change
-    return None
+    return {}

🚀 Want me to fix this? Reply ex: "fix it for me".
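
A sketch of the centralised parsing this comment asks for. The GitHubAPIError constructor call is an assumption (the real class stores a status code, message, and optional details); this is illustrative, not the PR's code.

# Hypothetical helper; returns {} on 204, guards JSON decoding, raises on errors.
from typing import Any

import httpx

from codeframe.git.github_integration import GitHubAPIError

def parse_github_response(response: httpx.Response) -> Any:
    if response.status_code == 204:
        return {}  # empty payload instead of None, as suggested above
    if response.status_code >= 400:
        raise GitHubAPIError(f"GitHub API error {response.status_code}")  # args assumed
    try:
        return response.json()
    except ValueError as exc:  # json.JSONDecodeError subclasses ValueError
        raise GitHubAPIError(f"Non-JSON response body: {exc}")  # args assumed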

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
codeframe/persistence/database.py (1)

212-219: Bug: pull_requests repository missing from async connection propagation.

The _update_repository_async_connections() method updates all other repositories but omits self.pull_requests. This will cause PRRepository to retain a stale or None async connection after reconnection or lazy initialization, potentially breaking any async PR operations.

🐛 Proposed fix
     def _update_repository_async_connections(self) -> None:
         """Update async connections in all repositories."""
         for repo in [self.projects, self.issues, self.tasks, self.agents, self.blockers,
                      self.memories, self.context_items, self.checkpoints, self.git_branches,
                      self.test_results, self.lint_results, self.code_reviews, self.quality_gates,
-                     self.token_usage, self.correction_attempts, self.activities, self.audit_logs]:
+                     self.token_usage, self.correction_attempts, self.activities, self.audit_logs,
+                     self.pull_requests]:
             if repo:
                 repo._async_conn = self._async_conn
🤖 Fix all issues with AI agents
In `@codeframe/persistence/repositories/pr_repository.py`:
- Around line 176-209: The code uses datetime.now() which is timezone-naive;
change it to a UTC-aware timestamp (e.g., datetime.now(timezone.utc)) and ensure
imports include timezone from datetime; also normalize merged_at when passed in
(if merged_at is naive, attach timezone.utc or convert to UTC) before calling
merged_at.isoformat(), so all stored timestamps (now and merged_at) are
timezone-aware ISO strings in the update branches that call
merged_at.isoformat() and now.isoformat().

In `@codeframe/ui/routers/prs.py`:
- Around line 322-327: The code calls db.pull_requests.update_pr_status(...)
unconditionally after attempting a GitHub merge, which can mark PRs as "merged"
even when result.merged is False; change the flow in the merge handling (where
result and result.sha are available) to check result.merged first and only call
db.pull_requests.update_pr_status(pr_id=pr["id"], status="merged",
merge_commit_sha=result.sha) when result.merged is True, otherwise update or log
a failed state (e.g., status="merge_failed" or leave unchanged) and include
error/response info for debugging.
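
A short sketch of the timezone handling described in the first instruction above, written as a standalone helper for clarity; the repository's actual method layout may differ.

# Illustration of the UTC normalization; not the PR's code.
from datetime import datetime, timezone

def to_utc_iso(value: datetime | None = None) -> str:
    """Return a timezone-aware ISO string, defaulting to the current UTC time."""
    if value is None:
        value = datetime.now(timezone.utc)
    elif value.tzinfo is None:
        value = value.replace(tzinfo=timezone.utc)  # assume naive values are UTC
    return value.isoformat()

# e.g. merged_at would be passed through to_utc_iso() before being stored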
🧹 Nitpick comments (11)
codeframe/core/config.py (1)

118-120: LGTM! Configuration fields follow existing patterns.

The new fields integrate well with the existing configuration structure. The github_repo format comment is helpful.

Consider adding a field_validator for github_repo to enforce the "owner/repo" format when a value is provided, similar to how log_level is validated. This would provide early feedback if misconfigured.

💡 Optional: Add format validation for github_repo
@field_validator("github_repo")
@classmethod
def validate_github_repo(cls, v: Optional[str]) -> Optional[str]:
    """Validate github_repo follows owner/repo format."""
    if v is None:
        return v
    if "/" not in v or v.count("/") != 1:
        raise ValueError(f"GITHUB_REPO must be in 'owner/repo' format, got: {v}")
    owner, repo = v.split("/")
    if not owner or not repo:
        raise ValueError(f"GITHUB_REPO must be in 'owner/repo' format, got: {v}")
    return v
codeframe/persistence/schema_manager.py (1)

634-657: LGTM! Well-designed pull_requests table schema.

The table design is solid with appropriate foreign key relationships, status constraints, and nullable fields for data populated after GitHub API responses.

One consideration: A unique constraint on (project_id, pr_number) where pr_number IS NOT NULL could prevent duplicate PR records for the same GitHub PR. This would require a partial unique index since pr_number starts as NULL.

💡 Optional: Add partial unique index for pr_number

Add this in _create_indexes:

cursor.execute(
    "CREATE UNIQUE INDEX IF NOT EXISTS idx_pull_requests_unique_pr_number "
    "ON pull_requests(project_id, pr_number) WHERE pr_number IS NOT NULL"
)
codeframe/persistence/repositories/pr_repository.py (1)

48-60: Consider using BaseRepository helper methods for consistency.

The repository directly accesses self.conn.cursor() instead of using BaseRepository's _execute, _fetchone, _fetchall, and _commit helpers. The base helpers provide:

  • Thread safety via sync_lock
  • Consistent error handling
  • Row-to-dict conversion via _row_to_dict

This applies to all methods in this file. Using the base helpers would ensure consistent behavior with other repositories.

♻️ Example refactor for create_pr
     def create_pr(
         self,
         project_id: int,
         issue_id: Optional[int],
         branch_name: str,
         title: str,
         body: str,
         base_branch: str,
         head_branch: str,
         status: str = "open",
     ) -> int:
-        cursor = self.conn.cursor()
-        cursor.execute(
+        self._execute(
             """
             INSERT INTO pull_requests (
                 project_id, issue_id, branch_name, title, body,
                 base_branch, head_branch, status
             )
             VALUES (?, ?, ?, ?, ?, ?, ?, ?)
             """,
             (project_id, issue_id, branch_name, title, body, base_branch, head_branch, status),
         )
-        self.conn.commit()
-        return cursor.lastrowid
+        self._commit()
+        return self._get_last_insert_id()
tests/unit/test_github_integration.py (1)

64-73: Consider adding cleanup in fixture.

The github fixture creates a real httpx.AsyncClient but doesn't close it after tests. While tests mock _make_request, the client connection pool may leak.

♻️ Suggested fixture with cleanup
     @pytest.fixture
-    def github(self):
+    async def github(self):
         """Create GitHubIntegration instance."""
-        return GitHubIntegration(
+        gh = GitHubIntegration(
             token="ghp_test_token_12345",
             repo="owner/test-repo",
         )
+        yield gh
+        await gh.close()

Note: This would require marking tests as @pytest.mark.asyncio or using pytest-asyncio auto mode.

tests/unit/test_pr_router.py (1)

105-138: Consider verifying gh.close() is called.

The test mocks GitHubIntegration but doesn't verify that close() is called in the finally block. This would ensure resource cleanup is tested.

♻️ Optional assertion to verify cleanup
             assert response.status_code == 201
             data = response.json()
             assert data["pr_number"] == 42
             assert data["pr_url"] == "https://github.com/owner/repo/pull/42"
             assert data["status"] == "open"
+
+            # Verify cleanup
+            mock_gh_instance.close.assert_called_once()
codeframe/git/github_integration.py (1)

243-265: Add pagination support to list_pull_requests for repositories with many PRs.

GitHub's PR list endpoint returns paginated results (default 30 items per page, max 100). The current implementation makes a single API call and returns only the first page. For repositories with more than 30 open PRs, this results in incomplete data. Consider adding per_page parameter support to the method signature or document this limitation.
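
A hedged sketch of page-following using GitHub's documented page and per_page query parameters; attribute and method names mirror the review's description of GitHubIntegration, but this is not the PR's implementation.

# Sketch of exhaustive pagination; assumes self.repo holds the "owner/repo" string
# and _make_request accepts a pre-built query string, as the current call does.
async def list_all_pull_requests(self, state: str = "open") -> list[dict]:
    prs: list[dict] = []
    page = 1
    while True:
        data = await self._make_request(
            method="GET",
            endpoint=f"/repos/{self.repo}/pulls?state={state}&per_page=100&page={page}",
        )
        if not data:
            break
        prs.extend(data)
        if len(data) < 100:  # a short page means the last page was reached
            break
        page += 1
    return prs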

tests/unit/test_pr_repository.py (1)

192-224: Consider adding assertion for pr_id1 to improve test clarity.

The pr_id1 variable is created but not used in assertions. While the test logic is correct (it verifies filtering works), adding an explicit assertion using pr_id1 would make the test intent clearer.

💡 Optional: Add assertion using pr_id1
         # List only open PRs
         open_prs = repo.list_prs(project_id, status="open")
         assert len(open_prs) == 1
         assert open_prs[0]["title"] == "Open PR"
+        assert open_prs[0]["id"] == pr_id1

         # List only merged PRs
         merged_prs = repo.list_prs(project_id, status="merged")
         assert len(merged_prs) == 1
         assert merged_prs[0]["title"] == "Merged PR"
+        assert merged_prs[0]["id"] == pr_id2
codeframe/ui/routers/prs.py (4)

35-38: Consider caching GlobalConfig to avoid repeated file reads.

get_global_config() is called in every endpoint that uses GitHub integration. Each call to load_environment() and GlobalConfig() reads from disk. While this ensures fresh config, it could be inefficient under load.

💡 Consider using FastAPI dependency with caching
+from functools import lru_cache
+
+@lru_cache
 def get_global_config() -> GlobalConfig:
     """Get global configuration with GitHub settings."""
     load_environment()
     return GlobalConfig()

Or use FastAPI's Depends pattern with a startup event to load config once.


64-68: Add validation for merge method.

The method field accepts any string, but GitHub only supports "merge", "squash", or "rebase". Invalid values will cause a 422 from GitHub.

💡 Add Literal type constraint
+from typing import Literal, Optional
+
 class MergePRRequest(BaseModel):
     """Request to merge a pull request."""

-    method: str = Field("squash", description="Merge method (squash, merge, rebase)")
+    method: Literal["merge", "squash", "rebase"] = Field(
+        "squash", description="Merge method (squash, merge, rebase)"
+    )

83-88: Consider adding type annotation for prs field.

Using bare list loses type information in the OpenAPI schema. Consider list[dict] or defining a PRDetailResponse model for better API documentation.

💡 Example with dict type
 class PRListResponse(BaseModel):
     """Response containing list of PRs."""

-    prs: list
+    prs: list[dict]
     total: int

386-404: Conditional broadcast based on close result would be safer.

The closed variable from gh.close_pull_request() indicates success, but the database update and broadcast happen unconditionally. Consider making these conditional.

💡 Make updates conditional on success
     try:
         closed = await gh.close_pull_request(pr_number)

-        # Update database
-        db.pull_requests.update_pr_status(
-            pr_id=pr["id"],
-            status="closed",
-        )
-
-        # Broadcast PR closed event
-        await broadcast_pr_closed(
-            manager=manager,
-            project_id=project_id,
-            pr_number=pr_number,
-        )
+        if closed:
+            # Update database
+            db.pull_requests.update_pr_status(
+                pr_id=pr["id"],
+                status="closed",
+            )
+
+            # Broadcast PR closed event
+            await broadcast_pr_closed(
+                manager=manager,
+                project_id=project_id,
+                pr_number=pr_number,
+            )

         logger.info(f"Closed PR #{pr_number} for project {project_id}")

         return ClosePRResponse(closed=closed)
📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bc32826 and 2b4023b.

📒 Files selected for processing (13)
  • .env.example
  • codeframe/core/config.py
  • codeframe/git/github_integration.py
  • codeframe/persistence/database.py
  • codeframe/persistence/repositories/__init__.py
  • codeframe/persistence/repositories/pr_repository.py
  • codeframe/persistence/schema_manager.py
  • codeframe/ui/routers/prs.py
  • codeframe/ui/server.py
  • codeframe/ui/websocket_broadcasts.py
  • tests/unit/test_github_integration.py
  • tests/unit/test_pr_repository.py
  • tests/unit/test_pr_router.py
🧰 Additional context used
📓 Path-based instructions (2)
codeframe/**/*.py

📄 CodeRabbit inference engine (CLAUDE.md)

codeframe/**/*.py: Use Python 3.11+ for backend development
Use FastAPI framework for backend API development
Use SQLite with async support (aiosqlite) for database operations
Use tiktoken for token counting
Run linting with ruff check .
Import and use domain-specific repositories from codeframe/persistence/repositories/ for data access instead of direct database calls

Files:

  • codeframe/persistence/repositories/__init__.py
  • codeframe/ui/routers/prs.py
  • codeframe/persistence/schema_manager.py
  • codeframe/ui/websocket_broadcasts.py
  • codeframe/ui/server.py
  • codeframe/git/github_integration.py
  • codeframe/core/config.py
  • codeframe/persistence/repositories/pr_repository.py
  • codeframe/persistence/database.py
codeframe/persistence/**/*.py

📄 CodeRabbit inference engine (CLAUDE.md)

codeframe/persistence/**/*.py: Use Repository pattern for data access with domain-specific repositories in codeframe/persistence/repositories/
Use pre-production flattened v1.0 database schema with direct table creation (no migration system)
Maintain backward compatibility with existing imports and method signatures in Database repository migration

Files:

  • codeframe/persistence/repositories/__init__.py
  • codeframe/persistence/schema_manager.py
  • codeframe/persistence/repositories/pr_repository.py
  • codeframe/persistence/database.py
🧠 Learnings (3)
📚 Learning: 2026-01-11T23:33:31.895Z
Learnt from: CR
Repo: frankbria/codeframe PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T23:33:31.895Z
Learning: Applies to codeframe/**/*.py : Import and use domain-specific repositories from codeframe/persistence/repositories/ for data access instead of direct database calls

Applied to files:

  • codeframe/persistence/repositories/__init__.py
  • codeframe/persistence/repositories/pr_repository.py
  • codeframe/persistence/database.py
📚 Learning: 2026-01-11T23:33:31.895Z
Learnt from: CR
Repo: frankbria/codeframe PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T23:33:31.895Z
Learning: Applies to codeframe/persistence/**/*.py : Use Repository pattern for data access with domain-specific repositories in codeframe/persistence/repositories/

Applied to files:

  • codeframe/persistence/repositories/__init__.py
  • codeframe/persistence/repositories/pr_repository.py
  • codeframe/persistence/database.py
📚 Learning: 2026-01-11T23:33:31.895Z
Learnt from: CR
Repo: frankbria/codeframe PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T23:33:31.895Z
Learning: Applies to codeframe/persistence/**/*.py : Maintain backward compatibility with existing imports and method signatures in Database repository migration

Applied to files:

  • codeframe/persistence/repositories/__init__.py
  • codeframe/persistence/schema_manager.py
  • codeframe/persistence/repositories/pr_repository.py
  • codeframe/persistence/database.py
🧬 Code graph analysis (8)
tests/unit/test_pr_repository.py (2)
codeframe/persistence/repositories/pr_repository.py (7)
  • create_pr (19-60)
  • get_pr (62-77)
  • update_pr_github_data (136-160)
  • get_pr_by_number (79-98)
  • list_prs (100-134)
  • update_pr_status (162-209)
  • get_pr_for_branch (211-234)
codeframe/persistence/schema_manager.py (1)
  • create_schema (28-72)
codeframe/ui/routers/prs.py (6)
codeframe/core/config.py (3)
  • GlobalConfig (84-195)
  • load_environment (198-211)
  • get (294-306)
codeframe/git/github_integration.py (8)
  • GitHubIntegration (56-328)
  • GitHubAPIError (17-29)
  • create_pull_request (185-220)
  • close (326-328)
  • list_pull_requests (243-265)
  • get_pull_request (222-241)
  • merge_pull_request (267-299)
  • close_pull_request (301-324)
codeframe/persistence/database.py (2)
  • Database (52-786)
  • close (170-174)
codeframe/ui/dependencies.py (1)
  • get_db (14-29)
codeframe/ui/websocket_broadcasts.py (1)
  • broadcast_pr_created (1021-1057)
codeframe/persistence/repositories/pr_repository.py (4)
  • create_pr (19-60)
  • update_pr_github_data (136-160)
  • list_prs (100-134)
  • get_pr_by_number (79-98)
codeframe/ui/websocket_broadcasts.py (1)
codeframe/ui/shared.py (1)
  • broadcast (154-185)
codeframe/ui/server.py (1)
tests/unit/test_pr_router.py (1)
  • app (15-19)
codeframe/git/github_integration.py (2)
codeframe/core/config.py (1)
  • get (294-306)
codeframe/ui/routers/prs.py (5)
  • create_pull_request (141-225)
  • get_pull_request (252-276)
  • list_pull_requests (229-248)
  • merge_pull_request (280-350)
  • close_pull_request (354-411)
tests/unit/test_github_integration.py (1)
codeframe/git/github_integration.py (9)
  • GitHubIntegration (56-328)
  • PRDetails (33-44)
  • MergeResult (48-53)
  • GitHubAPIError (17-29)
  • create_pull_request (185-220)
  • get_pull_request (222-241)
  • list_pull_requests (243-265)
  • merge_pull_request (267-299)
  • close_pull_request (301-324)
codeframe/persistence/repositories/pr_repository.py (2)
codeframe/persistence/repositories/base.py (1)
  • BaseRepository (14-333)
tests/unit/test_pr_repository.py (2)
  • project_id (34-44)
  • issue_id (47-58)
codeframe/persistence/database.py (1)
codeframe/persistence/repositories/pr_repository.py (1)
  • PRRepository (16-248)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
  • GitHub Check: Backend Unit Tests
  • GitHub Check: Frontend Unit Tests
  • GitHub Check: E2E Smoke Tests (Chromium)
  • GitHub Check: opencode-review
  • GitHub Check: Macroscope - Correctness Check
🔇 Additional comments (33)
.env.example (1)

91-103: LGTM! Well-documented GitHub configuration section.

The configuration follows the existing file pattern with clear documentation. Good practice to keep the variables commented out by default, requiring explicit user action to enable GitHub integration.

codeframe/persistence/schema_manager.py (1)

805-814: LGTM! Indexes align with repository query patterns.

The indexes properly support:

  • idx_pull_requests_project: filtering PRs by project and status
  • idx_pull_requests_issue: lookups by associated issue
  • idx_pull_requests_branch: branch-based PR lookups
codeframe/persistence/repositories/pr_repository.py (3)

62-98: LGTM! Query methods are straightforward.

The get methods correctly handle the None case when no row is found.


100-134: LGTM! Properly parameterized queries.

The status filter uses parameterized queries, and the database constraint validates allowed status values.


236-248: LGTM! Delete method provides useful feedback.

Returning rowcount allows callers to verify whether a record was actually deleted.

codeframe/persistence/repositories/__init__.py (2)

25-26: LGTM! Import follows existing pattern.


46-47: LGTM! PRRepository properly exported in __all__.

This enables clean imports like from codeframe.persistence.repositories import PRRepository, consistent with the repository pattern used throughout the codebase. Based on learnings, this maintains backward compatibility with existing imports.

codeframe/persistence/database.py (1)

37-37: LGTM on PRRepository wiring.

The import, attribute declaration, and initialization follow the established pattern for other repositories.

Also applies to: 110-110, 156-156

codeframe/git/github_integration.py (5)

17-29: LGTM on exception class.

Clean exception design with status code, message, and optional details.


32-53: LGTM on data classes.

PRDetails and MergeResult are well-structured with appropriate optional fields.


65-91: LGTM on client initialization.

Good validation of repo format, appropriate timeout, and correct API headers including version pinning.


93-153: LGTM on request handling.

Proper error parsing, 204 handling, and mapping of network errors to GitHubAPIError.


326-328: LGTM on client cleanup.

Correctly uses aclose() for httpx async client.

tests/unit/test_github_integration.py (3)

85-136: LGTM on PR creation tests.

Good coverage of success path and API error propagation with proper mock verification.


212-271: LGTM on merge and close tests.

Covers success, not-mergeable scenarios, and verifies correct HTTP methods are used.


302-321: LGTM on error class tests.

Validates message formatting and details handling.

codeframe/ui/server.py (1)

31-31: LGTM on PR router registration.

Import and router mounting follow the established pattern and alphabetical ordering.

Also applies to: 339-339

tests/unit/test_pr_router.py (6)

14-19: LGTM on test app fixture.

Clean fixture that isolates the PR router for testing.


22-66: LGTM on mock_db fixture.

Comprehensive mock setup covering all PR repository methods used by the router.


87-99: LGTM on client fixture with dependency overrides.

Proper use of dependency_overrides and context manager for config patching.


219-249: LGTM on list PRs tests.

Good coverage including status filter verification via assert_called_with.


273-314: LGTM on merge PR tests.

Covers success and non-mergeable error scenarios.


317-339: LGTM on close PR tests.

Covers success and not-found scenarios.

tests/unit/test_pr_repository.py (3)

12-26: LGTM! Well-structured test fixtures.

The database fixture properly creates an isolated SQLite database per test with the full schema. Using tmp_path ensures test isolation.


60-74: LGTM! Solid test coverage for create_pr.

Tests verify the return value type and validity. The assertions are appropriate for ensuring the method returns a valid database ID.


285-298: LGTM! Good coverage of timestamp auto-generation.

This test verifies that created_at is automatically populated by the database, which is important for audit purposes.

codeframe/ui/websocket_broadcasts.py (4)

1016-1018: LGTM! Good organization with section header.

The section comment clearly marks the new Sprint 11 PR broadcast functions, maintaining consistency with the file's existing organization pattern.


1021-1057: LGTM! Well-structured broadcast helper.

The function follows the established pattern in this module consistently: message construction, try/except broadcast with project-scoped delivery, and appropriate logging.


1060-1087: LGTM! Consistent implementation.

The function appropriately includes the merge_commit_sha which is the key piece of information for a merge event.


1090-1119: LGTM! Good design with optional reason.

The optional reason parameter allows callers to provide context for why a PR was closed, which is useful for audit trails and UI feedback.
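
The three broadcast helpers reviewed above share a construct-message, try/except-broadcast shape. A minimal sketch of that shape follows; the manager.broadcast_to_project coroutine is hypothetical, and the real ConnectionManager API in codeframe/ui/shared.py may differ.

# Illustrative only; payload fields beyond pr_number and merge_commit_sha are assumptions.
import logging

logger = logging.getLogger(__name__)

async def broadcast_pr_merged(
    manager, project_id: int, pr_number: int, merge_commit_sha: str | None = None
) -> None:
    message = {
        "type": "pr_merged",
        "project_id": project_id,
        "pr_number": pr_number,
        "merge_commit_sha": merge_commit_sha,
    }
    try:
        await manager.broadcast_to_project(project_id, message)  # hypothetical API
    except Exception as exc:
        logger.error(f"Failed to broadcast pr_merged for project {project_id}: {exc}")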

codeframe/ui/routers/prs.py (3)

112-132: LGTM! Clean validation helper.

The function properly separates authentication concerns and returns the project for downstream use, avoiding duplicate database queries.


228-248: LGTM! Simple and efficient list endpoint.

Correctly reads from the local database without requiring GitHub API calls, providing fast response times.


1-32: Well-organized module structure.

Good separation of concerns with clear sections for models, helpers, and endpoints. The module docstring and imports are clean and follow Python conventions. As per coding guidelines, the code correctly uses the repository pattern from codeframe/persistence/repositories/ via db.pull_requests.

@github-actions
Contributor

Code Review Summary

Overall, this PR adds solid GitHub PR integration following the project's patterns. The code is well-structured and follows the repository pattern. However, there are a few issues that should be addressed before merging.

🔴 Critical Issues

1. Potential NameError in finally blocks (prs.py)

In create_pull_request, merge_pull_request, and close_pull_request, the finally block closes the GitHub client, but the gh variable is only created partway through the function. If an exception occurs before GitHubIntegration is instantiated, the finally block will raise a NameError.

Example (line 224-225):

try:
    # Validate access
    await validate_project_access(project_id, db, current_user)
    
    # Get GitHub config
    config = get_global_config()
    github_token, github_repo = validate_github_config(config)
    
    # Create PR via GitHub API
    gh = GitHubIntegration(token=github_token, repo=github_repo)  # ← gh defined here
    # ...
finally:
    await gh.close()  # ← If exception before line above, this raises NameError

Fix: Initialize gh = None before the try block and check before closing:

gh = None
try:
    # ...
    gh = GitHubIntegration(token=github_token, repo=github_repo)
    # ...
finally:
    if gh is not None:
        await gh.close()

This affects lines 224-225, 349-350, and 410-411.

2. Timezone inconsistency (pr_repository.py:178)

now = datetime.now()  # Uses local timezone

Should be UTC for consistency with the rest of the codebase:

now = datetime.now(UTC)

⚠️ Security & Performance Considerations

3. HTTP client created per request (prs.py:171, 314, 384)

Each endpoint creates a new httpx.AsyncClient, which establishes a new HTTP connection. This is inefficient for high-throughput scenarios.

Recommendation: Consider using a dependency-provided shared client with connection pooling (similar to how database connections are managed).
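
One possible shape for that recommendation, using FastAPI's lifespan to own a single httpx.AsyncClient; how GitHubIntegration would accept an externally managed client is an assumption.

# Sketch of a dependency-provided shared client; not the PR's implementation.
from contextlib import asynccontextmanager

import httpx
from fastapi import FastAPI, Request

@asynccontextmanager
async def lifespan(app: FastAPI):
    app.state.github_client = httpx.AsyncClient(timeout=30.0)
    try:
        yield
    finally:
        await app.state.github_client.aclose()

app = FastAPI(lifespan=lifespan)

def get_github_client(request: Request) -> httpx.AsyncClient:
    # Endpoints would use Depends(get_github_client) and hand the client to
    # GitHubIntegration, assuming the integration can reuse an external client.
    return request.app.state.github_client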

4. No validation of GitHub token scope

The code validates that GITHUB_TOKEN is set, but doesn't verify it has the required repo scope before making API calls. Errors will only surface when the GitHub API returns a 403/401.

Suggestion: Add a health check endpoint or initial validation that queries GitHub's user endpoint to verify token validity.
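
A hedged sketch of that check, querying GitHub's /user endpoint and inspecting the X-OAuth-Scopes header (exposed for classic personal access tokens; fine-grained tokens do not report scopes this way).

# Illustration only; a startup hook or health endpoint could call this once.
import httpx

async def verify_github_token(token: str) -> bool:
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    }
    async with httpx.AsyncClient(timeout=10.0) as client:
        resp = await client.get("https://api.github.com/user", headers=headers)
    if resp.status_code != 200:
        return False  # invalid or expired token
    scopes = resp.headers.get("X-OAuth-Scopes", "")
    # Classic tokens need the "repo" scope to create and merge PRs on private repos
    return "repo" in {s.strip() for s in scopes.split(",") if s.strip()}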

📝 Code Quality

5. Repeated config loading (prs.py:35-38)

def get_global_config() -> GlobalConfig:
    load_environment()  # Called every time
    return GlobalConfig()

This function is called in every endpoint and reloads the entire environment. While load_environment() uses dotenv.load_dotenv() which is idempotent, it's still unnecessary overhead.

Recommendation: Cache the config or use FastAPI's Depends with caching.

6. Test coverage looks good

  • 17 tests for GitHubIntegration
  • 14 tests for PRRepository
  • 14 tests for PRRouter

Tests cover success paths, error cases (not found, access denied, GitHub API errors), and edge cases. Good use of mocking to avoid real GitHub API calls.

✅ Positives

  • Clean separation of concerns (API client, repository, router)
  • Well-structured WebSocket broadcasts for real-time updates
  • Proper authentication/authorization checks via dependencies
  • Comprehensive error handling with custom GitHubAPIError
  • Good database schema with appropriate indexes
  • Clear, consistent naming conventions
  • No comments (following project guidelines)

Summary

Recommended fixes before merge:

  1. Fix NameError in finally blocks (Critical)
  2. Use UTC timestamps in repository

Nice-to-have improvements:
3. Consider shared httpx client for performance
4. Add token scope validation
5. Cache config loading

The implementation is solid and follows the project's patterns well. The critical issues are straightforward to fix.

@github-actions
Contributor

Review posted at #284 (comment)

Key findings:

  • Critical: NameError risk in finally blocks if GitHub client creation fails
  • Critical: Timezone inconsistency (local vs UTC)
  • Security/Performance: New HTTP client per request, no token scope validation
  • Code Quality: Redundant config loading
  • Positives: Clean architecture, good test coverage (45 tests), proper auth checks

New session - 2026-01-14T02:00:07.069Z
opencode session  |  github run

@frankbria frankbria linked an issue Jan 14, 2026 that may be closed by this pull request
5 tasks
- Fix NameError in finally blocks by initializing gh = None before try
- Fix timezone inconsistency: use datetime.now(UTC) consistently
- Add pull_requests to async connection propagation in database.py
- Only update merge status when result.merged is True
- Improve repo format validation to reject empty owner/repo
- Constrain merge method to Literal["squash", "merge", "rebase"]
- Use BaseRepository helpers with proper cursor.lastrowid handling
- Add rowcount checks for update operations to raise on missing pr_id

@github-actions
Contributor

Code Review

After reviewing PR changes and comparing with previous reviews, most critical issues have been addressed. However, there's one remaining concern:

🐛 Bug: Data inconsistency in close_pull_request

Location: prs.py:396-404

The database status update and WebSocket broadcast happen unconditionally, even if gh.close_pull_request(pr_number) returns closed=False:

# Line 394-400
closed = await gh.close_pull_request(pr_number)

# Update database
db.pull_requests.update_pr_status(
    pr_id=pr["id"],
    status="closed",
)

# Broadcast PR closed event
await broadcast_pr_closed(...)

Issue: If GitHub fails to close the PR (network error, PR already closed, permissions issue), the local database will still mark it as closed and broadcast the event. This creates a discrepancy between GitHub's actual state and the local database.

Suggested fix: Make updates conditional on closed being True, similar to how merge_pull_request was fixed:

closed = await gh.close_pull_request(pr_number)

if closed:
    # Update database only if close succeeded
    db.pull_requests.update_pr_status(
        pr_id=pr["id"],
        status="closed",
    )
    
    # Broadcast PR closed event
    await broadcast_pr_closed(
        manager=manager,
        project_id=project_id,
        pr_number=pr_number,
    )
    
    logger.info(f"Closed PR #{pr_number} for project {project_id}")
else:
    logger.warning(f"PR #{pr_number} close returned closed=False")

✅ Well Done

  • Used Literal["squash", "merge", "rebase"] for merge method validation
  • All previous critical issues fixed (NameError in finally blocks, timezone consistency, merge success check)
  • Clean repository pattern with BaseRepository helpers
  • Comprehensive test coverage (45 new unit tests)
  • Proper error handling with custom GitHubAPIError

💡 Optional Improvements

  1. Config caching: get_global_config() calls load_environment() on every request. Consider using @lru_cache or FastAPI Depends for efficiency.

  2. HTTP client pooling: Each endpoint creates a new httpx.AsyncClient. Consider using a dependency-injected shared client for better performance under load.

  3. Token scope validation: Add a health check to verify GITHUB_TOKEN has required repo scope before making API calls.

@github-actions
Contributor

Review posted. Main finding: data inconsistency bug in close_pull_request - the database status update and WebSocket broadcast happen even if GitHub fails to close the PR. Should make these conditional like merge_pull_request.

All other critical issues from previous reviews have been addressed. Test coverage is solid (45 unit tests).

New session - 2026-01-14T02:14:18.678Z
opencode session  |  github run

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@codeframe/ui/routers/prs.py`:
- Around line 394-411: The code unconditionally calls
db.pull_requests.update_pr_status(...) to set status="closed" even if
gh.close_pull_request(pr_number) returned False; update the DB only when close
returns truthy like in merge_pull_request: check the returned value from
gh.close_pull_request (stored in closed) and only call
db.pull_requests.update_pr_status(...) and broadcast_pr_closed(...) when closed
is True, otherwise log a failure and return ClosePRResponse(closed=False)
without changing DB state.
🧹 Nitpick comments (5)
codeframe/git/github_integration.py (2)

137-141: Consider guarding response.json() for non-JSON successful responses.

If the server returns a 2xx status with non-JSON content (e.g., empty body, malformed JSON), line 141 will raise an unhandled exception. Consider adding a try/except guard.

♻️ Suggested defensive handling
             # Handle empty responses (204 No Content)
             if response.status_code == 204:
                 return None

-            return response.json()
+            try:
+                return response.json()
+            except Exception:
+                # Return empty dict for non-JSON 2xx responses
+                return {}

259-266: Consider using httpx params for query parameters.

The query parameter is manually appended to the endpoint. Using httpx's params argument would be cleaner and handle URL encoding automatically.

♻️ Suggested improvement
-        data = await self._make_request(
-            method="GET",
-            endpoint=f"{endpoint}?state={state}",
-        )
+        data = await self._client.get(
+            f"{self.BASE_URL}{endpoint}",
+            params={"state": state},
+        )
+        # Note: Would need to handle response status/errors here
+        # Or extend _make_request to accept params

Alternatively, extend _make_request to accept an optional params dict.
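
That alternative could look roughly like the following; it assumes _make_request already owns the shared client as self._client and prefixes endpoints with self.BASE_URL, as shown above, and it elides the existing status and error handling.

# Sketch of an optional params pass-through on GitHubIntegration._make_request.
from typing import Any, Optional

async def _make_request(
    self,
    method: str,
    endpoint: str,
    json: Optional[dict[str, Any]] = None,
    params: Optional[dict[str, Any]] = None,
) -> Any:
    response = await self._client.request(
        method,
        f"{self.BASE_URL}{endpoint}",
        json=json,
        params=params,  # httpx handles URL encoding of query parameters
    )
    # ... existing error handling, 204 handling, and JSON parsing unchanged ...
    return response.json()

# list_pull_requests would then call:
#     await self._make_request(method="GET", endpoint=endpoint, params={"state": state})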

codeframe/ui/routers/prs.py (3)

85-90: Consider adding type annotation for prs list elements.

The prs: list lacks element type information. For better API documentation and type safety, consider specifying the element type.

♻️ Suggested improvement
 class PRListResponse(BaseModel):
     """Response containing list of PRs."""

-    prs: list
+    prs: List[dict]  # or define a PRSummary model
     total: int

35-38: Consider caching or using Depends for global config.

get_global_config() creates a new GlobalConfig instance on each call. Consider using FastAPI's dependency injection for caching.

♻️ Suggested approach
from functools import lru_cache

@lru_cache()
def get_global_config() -> GlobalConfig:
    """Get global configuration with GitHub settings."""
    load_environment()
    return GlobalConfig()

Or use FastAPI Depends pattern for testability.


183-200: Consider wrapping DB operations for error resilience.

If update_pr_github_data fails after create_pr succeeds, the local PR record will be incomplete (missing GitHub number/URL). While unlikely, this could cause issues when listing/fetching PRs.

♻️ Suggested improvement
         # Store in database
+        try:
             pr_id = db.pull_requests.create_pr(
                 project_id=project_id,
                 issue_id=None,  # Can be linked later
                 branch_name=request.branch,
                 title=request.title,
                 body=request.body,
                 base_branch=request.base,
                 head_branch=request.branch,
             )

             # Update with GitHub data
             db.pull_requests.update_pr_github_data(
                 pr_id=pr_id,
                 pr_number=pr_details.number,
                 pr_url=pr_details.url,
                 github_created_at=pr_details.created_at,
             )
+        except Exception as db_error:
+            logger.error(
+                f"DB error after creating GitHub PR #{pr_details.number}: {db_error}"
+            )
+            raise HTTPException(
+                status_code=500,
+                detail=f"PR created on GitHub (#{pr_details.number}) but failed to save locally",
+            )
📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2b4023b and f594e04.

📒 Files selected for processing (4)
  • codeframe/git/github_integration.py
  • codeframe/persistence/database.py
  • codeframe/persistence/repositories/pr_repository.py
  • codeframe/ui/routers/prs.py
🚧 Files skipped from review as they are similar to previous changes (1)
  • codeframe/persistence/repositories/pr_repository.py
🧰 Additional context used
📓 Path-based instructions (2)
codeframe/**/*.py

📄 CodeRabbit inference engine (CLAUDE.md)

codeframe/**/*.py: Use Python 3.11+ for backend development
Use FastAPI framework for backend API development
Use SQLite with async support (aiosqlite) for database operations
Use tiktoken for token counting
Run linting with ruff check .
Import and use domain-specific repositories from codeframe/persistence/repositories/ for data access instead of direct database calls

Files:

  • codeframe/persistence/database.py
  • codeframe/ui/routers/prs.py
  • codeframe/git/github_integration.py
codeframe/persistence/**/*.py

📄 CodeRabbit inference engine (CLAUDE.md)

codeframe/persistence/**/*.py: Use Repository pattern for data access with domain-specific repositories in codeframe/persistence/repositories/
Use pre-production flattened v1.0 database schema with direct table creation (no migration system)
Maintain backward compatibility with existing imports and method signatures in Database repository migration

Files:

  • codeframe/persistence/database.py
🧠 Learnings (3)
📚 Learning: 2026-01-11T23:33:31.895Z
Learnt from: CR
Repo: frankbria/codeframe PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T23:33:31.895Z
Learning: Applies to codeframe/persistence/**/*.py : Maintain backward compatibility with existing imports and method signatures in Database repository migration

Applied to files:

  • codeframe/persistence/database.py
📚 Learning: 2026-01-11T23:33:31.895Z
Learnt from: CR
Repo: frankbria/codeframe PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T23:33:31.895Z
Learning: Applies to codeframe/**/*.py : Import and use domain-specific repositories from codeframe/persistence/repositories/ for data access instead of direct database calls

Applied to files:

  • codeframe/persistence/database.py
📚 Learning: 2026-01-11T23:33:31.895Z
Learnt from: CR
Repo: frankbria/codeframe PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T23:33:31.895Z
Learning: Applies to codeframe/persistence/**/*.py : Use Repository pattern for data access with domain-specific repositories in codeframe/persistence/repositories/

Applied to files:

  • codeframe/persistence/database.py
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
  • GitHub Check: Frontend Unit Tests
  • GitHub Check: Backend Unit Tests
  • GitHub Check: Macroscope - Correctness Check
  • GitHub Check: E2E Smoke Tests (Chromium)
  • GitHub Check: opencode-review
🔇 Additional comments (9)
codeframe/persistence/database.py (1)

37-37: LGTM! PRRepository integration follows established patterns.

The integration of PRRepository is consistent with the existing repository pattern:

  • Import added alongside other repositories
  • Optional attribute declaration matches existing style
  • Initialization uses the same constructor signature
  • Async connection propagation properly includes the new repository

Also applies to: 110-110, 156-156, 217-218

codeframe/git/github_integration.py (5)

17-29: LGTM!

Clean custom exception with appropriate attributes for error handling.


32-53: LGTM!

Well-structured dataclasses capturing essential PR information.


65-92: LGTM! Robust repo validation implemented.

The repo format validation now properly rejects empty owner or repo segments. The httpx client is configured with appropriate timeout and headers.


156-184: LGTM!

The parsing logic correctly handles GitHub's datetime format with the Z suffix to +00:00 conversion.


327-329: LGTM!

Proper async client cleanup with aclose().

codeframe/ui/routers/prs.py (3)

231-251: LGTM!

Clean implementation returning local DB state with optional status filtering.


254-279: LGTM!

Straightforward PR retrieval by number. Consider adding a response_model for consistency with other endpoints.


325-344: LGTM! Past review concerns addressed.

The merge status is now correctly updated only when result.merged is True, and a warning is logged for failed merges.

Only update database status and broadcast WebSocket event when
gh.close_pull_request returns True. This prevents data inconsistency
between GitHub's actual state and the local database when close fails.

Matches the pattern already used in merge_pull_request.
@github-actions
Contributor

Code Review Summary

All critical issues from previous reviews have been addressed ✅

Fixed Issues

1. NameError in finally blocks (prs.py:173, 317, 391)

  • gh initialized to None before try blocks
  • Cleanup properly checks if gh is not None in finally

2. Timezone consistency (pr_repository.py:180)

  • Uses datetime.now(UTC) instead of naive datetime.now()
  • Timezone-aware timestamps stored correctly in database

3. Conditional DB updates for merge/close operations

  • merge_pull_request (lines 326-344): Only updates DB when result.merged is True
  • close_pull_request (lines 397-412): Only updates DB when closed is True
  • Prevents data inconsistency between local DB and GitHub state

4. Method validation (prs.py:67-69)

  • Uses Literal["squash", "merge", "rebase"] for type safety
  • Invalid merge methods fail fast before API calls

5. GitHub repo format validation (github_integration.py:75-79)

  • Properly validates both owner and repo are non-empty
  • Rejects values with an empty owner or repo segment

6. Repository pattern (pr_repository.py)

  • All methods now use BaseRepository helpers (_execute, _fetchone, _fetchall, _commit)
  • Thread-safe with proper error handling

7. Async connection propagation (database.py:218)

  • pull_requests included in async connection updates
  • PR async operations use the active connection

Code Quality Highlights

  • Clean separation of concerns (GitHub client, repository, router)
  • Proper error handling with custom GitHubAPIError
  • WebSocket broadcasts for real-time updates
  • Comprehensive test coverage (45 new unit tests)
  • Authentication and authorization checks in place
  • Well-structured database schema with indexes

Optional Future Improvements

These are not blockers for merging:

  1. Config caching: get_global_config() calls load_environment() on every request. Consider using @lru_cache or FastAPI Depends.

  2. HTTP client pooling: Each endpoint creates a new httpx.AsyncClient. Consider using a dependency-injected shared client for high-throughput scenarios.

  3. Token scope validation: Add a health check to verify GITHUB_TOKEN has the required repo scope before API calls.

Summary

The implementation is solid and follows the project's patterns well. All critical bugs have been fixed, and the code is production-ready. Optional improvements can be addressed in follow-up PRs.

@github-actions
Contributor

Code Review Summary

All critical issues from previous reviews have been addressed ✅

Fixed Issues

1. NameError in finally blocks (prs.py:173, 317, 391)

  • gh initialized to None before try blocks
  • Cleanup properly checks if gh is not None

2. Timezone consistency (pr_repository.py:180)

  • Uses datetime.now(UTC) instead of naive datetime.now()
  • Handles timezone-aware merged_at and github_created_at

3. Conditional DB updates

  • merge_pull_request (lines 326-344): Updates DB only when result.merged is True
  • close_pull_request (lines 397-412): Updates DB only when closed is True

4. Merge method validation (prs.py:67-69)

  • Uses Literal["squash", "merge", "rebase"] for type safety

5. GitHub repo validation (github_integration.py:75-79)

  • Properly validates both owner and repo are non-empty

6. Repository pattern (pr_repository.py)

  • Uses BaseRepository helpers (_execute, _fetchone, _fetchall, _commit)

7. Async connection propagation (database.py:218)

  • pull_requests included in async connection updates

Optional Future Improvements

These are not blockers for merging:

  1. DB transaction safety (prs.py:183-200): Consider wrapping DB ops in try/except with 500 error if update_pr_github_data fails after create_pr.

  2. Config caching (prs.py:35-38): get_global_config() reloads environment on every request. Consider @lru_cache or FastAPI Depends.

  3. HTTP client pooling: Each endpoint creates a new httpx.AsyncClient. Consider shared client for high-throughput.

  4. Response model (prs.py:254-279): Add response_model for better OpenAPI docs.

  5. List type annotation (prs.py:88): prs: list could be list[dict] for type safety.

Summary

The PR is ready to merge. All critical bugs have been fixed, code follows project patterns, and test coverage is solid (45 new unit tests). Optional improvements can be addressed in follow-ups.

@github-actions
Contributor

Review posted. All critical issues fixed - PR is ready to merge. Optional improvements noted for follow-ups.

New session - 2026-01-14T02:32:45.178Z
opencode session  |  github run

@frankbria frankbria merged commit 36c101d into main Jan 14, 2026
13 of 14 checks passed
@frankbria frankbria deleted the feature/github-pr-integration branch January 14, 2026 02:35
frankbria pushed a commit that referenced this pull request Jan 14, 2026
#284, #285)

Add comprehensive E2E tests for recently added Git and PR features:

Git Visualization (test_git_visualization.spec.ts - 10 tests):
- Git section visibility in active/review phases
- Branch indicator with dirty state detection
- Commit history display
- Negative test for discovery phase (hidden)
- API integration verification

PR Management (test_pr_management.spec.ts - 20 tests):
- Tab navigation and panel display
- PR list with status filtering
- Create PR dialog with form validation
- PR card actions (View, Merge buttons)
- CI and review status indicators
- API integration verification

Includes 5 smoke tests for core functionality validation.


Development

Successfully merging this pull request may close these issues.

[P0] Backend: GitHub PR creation and management API
