feat: sync status visible in user page w/ fastapi sse (2nd attempt) #543
Open — KaliszS wants to merge 2 commits into main from extracted-commit-changes
New file (+105 lines) — SSE endpoint:

```python
"""SSE (Server-Sent Events) endpoint for real-time sync progress.

Subscribes to a per-user Redis Pub/Sub channel and streams every sync
event to the browser over an open HTTP connection.
"""

import asyncio
import json
from collections.abc import AsyncGenerator
from logging import getLogger
from typing import Annotated
from uuid import UUID

import redis.asyncio as aioredis
from fastapi import APIRouter, Depends
from fastapi.responses import StreamingResponse

from app.config import settings
from app.integrations.sync_events import SYNC_CHANNEL_PREFIX
from app.utils.auth import verify_query_token

logger = getLogger(__name__)

router = APIRouter()


async def _event_generator(user_id: str) -> AsyncGenerator[str, None]:
    """Yield SSE-formatted messages from a Redis Pub/Sub channel.

    The ``timeout=15.0`` parameter in ``pubsub.get_message()`` controls
    the keep-alive interval, NOT message latency.

    - If a sync event arrives (e.g., at t=0.1s), Redis delivers it **immediately**.
    - If NO event arrives for 15 seconds, ``get_message()`` returns None.
    - We then yield a ``: keepalive`` comment to prevent the browser/proxy
      from closing the idle connection.

    This ensures instant updates while maintaining a stable long-lived connection.
    """
    channel = f"{SYNC_CHANNEL_PREFIX}:{user_id}"

    redis_client = aioredis.from_url(settings.redis_url, decode_responses=True)
    pubsub = redis_client.pubsub()

    try:
        await pubsub.subscribe(channel)

        while True:
            # Wait up to 15 s for a message, then send a keep-alive comment.
            # This does NOT delay messages - they are returned as soon as published.
            message = await pubsub.get_message(ignore_subscribe_messages=True, timeout=15.0)

            if message and message["type"] == "message":
                raw = message["data"]

                # Parse to extract event type for SSE "event:" field
                try:
                    payload = json.loads(raw)
                    event_type = payload.get("type", "message")
                except (json.JSONDecodeError, TypeError):
                    event_type = "message"
                    raw = json.dumps({"type": "message", "data": raw})

                yield f"event: {event_type}\ndata: {raw}\n\n"

                # If this was a terminal event, close the stream
                if event_type in ("sync:completed", "sync:error"):
                    return
            else:
                # SSE keep-alive comment (ignored by EventSource)
                yield ": keepalive\n\n"

    except asyncio.CancelledError:
        pass
    finally:
        await pubsub.unsubscribe(channel)
        await pubsub.close()
        await redis_client.aclose()


@router.get("/users/{user_id}/sync/events")
async def sync_events_stream(
    user_id: UUID,
    _developer_id: Annotated[str, Depends(verify_query_token)],
) -> StreamingResponse:
    """Stream real-time sync progress events via Server-Sent Events (SSE).

    **Authentication:**
    Pass your JWT token as a ``token`` query parameter
    (``EventSource`` does not support custom headers).

    **Event types emitted:**
    (See ``SyncEventType`` in frontend types for full list)

    The stream closes automatically after a terminal event.
    """
    return StreamingResponse(
        _event_generator(str(user_id)),
        media_type="text/event-stream",
        headers={
            "Cache-Control": "no-cache",
            "Connection": "keep-alive",
            "X-Accel-Buffering": "no",
        },
    )
```
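The parse-and-frame step inside `_event_generator` can be exercised in isolation. The helper below is a hypothetical standalone mirror of that logic (not part of the PR, and `format_sse_frame` is an illustrative name); it shows how a published JSON payload becomes an SSE frame, and how a non-JSON payload is wrapped into a fallback envelope so the client always receives valid JSON:

```python
import json


def format_sse_frame(raw: str) -> str:
    # Illustrative mirror of the parsing logic in _event_generator
    # (hypothetical helper, not part of the PR).
    try:
        payload = json.loads(raw)
        event_type = payload.get("type", "message")
    except (json.JSONDecodeError, TypeError):
        # Non-JSON payloads fall back to a generic "message" event
        # and are wrapped so the data field is still valid JSON.
        event_type = "message"
        raw = json.dumps({"type": "message", "data": raw})
    # Blank line terminates the frame per the SSE wire format.
    return f"event: {event_type}\ndata: {raw}\n\n"


# A well-formed event keeps its type in the SSE "event:" field.
frame = format_sse_frame('{"type": "sync:started", "provider": "github"}')
# A non-JSON payload is wrapped into the fallback envelope.
fallback = format_sse_frame("plain text")
```

Each yielded string is one complete SSE frame; the browser's `EventSource` dispatches it to the listener registered for that `event:` name.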
New file (+58 lines) — sync event publisher:

```python
"""Redis Pub/Sub sync event publisher for real-time SSE updates.

Celery tasks call publish_sync_event() to notify connected SSE clients
of sync progress. Events are published to a per-user Redis channel.
"""

import json
from datetime import datetime, timezone
from logging import getLogger
from typing import Any

from app.integrations.redis_client import get_redis_client

logger = getLogger(__name__)

SYNC_CHANNEL_PREFIX = "sync:events"


def _channel_for_user(user_id: str) -> str:
    """Return the Redis Pub/Sub channel name for a given user."""
    return f"{SYNC_CHANNEL_PREFIX}:{user_id}"


def publish_sync_event(
    user_id: str,
    event_type: str,
    *,
    task_id: str | None = None,
    provider: str | None = None,
    data: dict[str, Any] | None = None,
) -> None:
    """Publish a sync progress event to the user's Redis Pub/Sub channel.

    Args:
        user_id: UUID of the user (as string).
        event_type: Event type identifier (e.g. "sync:started").
        task_id: Celery task ID, if applicable.
        provider: Provider name being synced, if applicable.
        data: Additional event payload.
    """
    channel = _channel_for_user(user_id)
    message: dict[str, Any] = {
        "type": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if task_id:
        message["task_id"] = task_id
    if provider:
        message["provider"] = provider
    if data:
        message["data"] = data

    try:
        redis_client = get_redis_client()
        redis_client.publish(channel, json.dumps(message))
    except Exception:
        # Publishing is best-effort — never break the sync task
        logger.debug("Failed to publish sync event %s for user %s", event_type, user_id, exc_info=True)
```