Conversation
Summary of Changes

Hello @dittops, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request addresses critical issues preventing proper notification delivery within the budpipeline service. It rectifies a mismatch in CloudEvent publishing patterns that led to downstream services rejecting messages and resolves a database constraint problem that blocked subscription registrations. The changes ensure reliable event propagation and correct database state for execution subscriptions.
Code Review
This pull request addresses two separate issues: it corrects the payload format for Pub/Sub messages to resolve 422 errors and fixes a stale database constraint that was preventing inserts. The database migration changes are well-structured and correctly handle both new and existing database schemas. The refactoring of the event publisher to use `publish_metadata` for CloudEvent creation is a good improvement. I've provided a couple of suggestions on the publisher implementation to further align it with CloudEvents best practices, specifically regarding the placement of extension attributes and the data content type.
```diff
         event_payload = payload.copy()
         correlation_id = event_payload.pop("correlation_id", None)

-        # Construct a full CloudEvent envelope so Dapr does not inject
-        # its own id/datacontenttype fields (which budnotify rejects).
-        # Extension attributes (source_topic, correlation_id) go at the
-        # envelope level, not inside data.
-        cloud_event: dict[str, Any] = {
-            "specversion": "1.0",
-            "id": str(uuid4()),
-            "source": settings.name,
-            "type": event_type,
-            "datacontenttype": "application/json",
-            "source_topic": topic,
-            "data": event_payload,
-        }
+        event_payload["source"] = settings.name
+        event_payload["source_topic"] = topic

+        # Pass CloudEvent attributes via publish_metadata so Dapr
+        # does not inject them into the JSON body.
+        event_id = str(uuid4())
+        publish_metadata = {
+            "cloudevent.id": event_id,
+            "cloudevent.source": settings.name,
+            "cloudevent.type": event_type,
+        }
```
This refactoring correctly switches to using `publish_metadata` to let Dapr build the CloudEvent envelope. However, it appears to have unintentionally changed where some attributes are placed. The previous implementation's comments indicated that `correlation_id` and `source_topic` should be top-level CloudEvent extension attributes. In this new version, they are added to `event_payload` and will end up inside the `data` field of the CloudEvent.
While this may work with the current consumer, it is generally better practice to use CloudEvent extension attributes for metadata like this, and it is more consistent with the previous design's intent. Consider moving these attributes to `publish_metadata` to make them proper CloudEvent extensions.
Suggested change:

```diff
         event_payload = payload.copy()
         correlation_id = event_payload.pop("correlation_id", None)
-        event_payload["source"] = settings.name
-        event_payload["source_topic"] = topic
         # Pass CloudEvent attributes via publish_metadata so Dapr
         # does not inject them into the JSON body.
         event_id = str(uuid4())
-        publish_metadata = {
+        publish_metadata: dict[str, Any] = {
             "cloudevent.id": event_id,
             "cloudevent.source": settings.name,
             "cloudevent.type": event_type,
+            "source_topic": topic,
         }
+        if correlation_id:
+            publish_metadata["correlation_id"] = correlation_id
```
```diff
             topic_name=topic,
-            data=json.dumps(cloud_event, cls=_DecimalEncoder),
+            data=json.dumps(event_payload, cls=_DecimalEncoder),
             data_content_type="application/cloudevents+json",
```
The `data_content_type` is set to `application/cloudevents+json`. When letting Dapr create the CloudEvent envelope by providing `publish_metadata`, this content type should typically be `application/json`, describing the data payload itself. Dapr will then wrap this payload in a CloudEvent, and the `datacontenttype` attribute of the resulting CloudEvent will correctly be `application/json`.
| data_content_type="application/cloudevents+json", | |
| data_content_type="application/json", |
…DB constraint

`EventPublisher._publish_single_topic()` was manually constructing a CloudEvent envelope with `id`/`datacontenttype` in the JSON body, causing budnotify's `CloudEventBase(extra="forbid")` to reject messages with 422. Switch to the budmicroframe pattern: send flat data with CloudEvent attributes via `publish_metadata`. Also fix migration 003, which used the wrong constraint name, leaving a stale lowercase check constraint on `execution_subscription.delivery_status` that blocked all subscription inserts with 'ACTIVE'.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Force-pushed from 718cd75 to f765c83.
Summary

- Refactors `EventPublisher._publish_single_topic()` to send flat payload data with CloudEvent attributes via `publish_metadata`, matching the budmicroframe `publish_to_topic` pattern. Previously, the method manually constructed a CloudEvent envelope with `id` and `datacontenttype` inside the JSON body, which budnotify's `CloudEventBase(extra="forbid")` rejected with 422.
- Fixes migration 003's wrong constraint name (it dropped `chk_execution_subscription_status` instead of `ck_execution_subscription_chk_execution_subscription_status`), which left a stale lowercase check constraint that blocked all `execution_subscription` inserts with `ACTIVE`. Adds migration 009 to drop the stale constraint and updates migration 003 to drop both constraint name variants.

Root Cause
Two separate issues were blocking budpipeline notifications:

1. Publisher pattern mismatch: `_publish_single_topic` wrapped data in a CloudEvent envelope (`specversion`, `id`, `source`, `type`, `datacontenttype`, `data`). Dapr passed the entire envelope as the message body, and budnotify's Pydantic model with `extra="forbid"` rejected `id` and `datacontenttype` as unexpected fields (see the sketch after this list).
2. Constraint name mismatch: SQLAlchemy generates constraint names with a table-prefix convention (`ck_execution_subscription_chk_execution_subscription_status`), but migration 003 only dropped the short name (`chk_execution_subscription_status`). Both constraints ended up co-existing; the old one (lowercase) blocked all inserts, so callback topics were never registered and the publisher was never invoked.
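To make the first failure mode concrete, here is a minimal sketch of the rejection; the model below is a hypothetical stand-in for budnotify's `CloudEventBase`, assuming only that it is a Pydantic v2 model configured with `extra="forbid"`:

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class CloudEventBase(BaseModel):
    """Hypothetical stand-in for budnotify's strict event model."""

    model_config = ConfigDict(extra="forbid")

    specversion: str
    source: str
    type: str
    data: dict


# The old publisher shipped the full envelope as the message body, so the
# consumer saw fields its model never declared.
body = {
    "specversion": "1.0",
    "id": "abc-123",                        # undeclared -> extra_forbidden
    "datacontenttype": "application/json",  # undeclared -> extra_forbidden
    "source": "budpipeline",
    "type": "notification",
    "data": {"status": "ACTIVE"},
}

try:
    CloudEventBase(**body)
except ValidationError as exc:
    # Two "extra_forbidden" errors, surfaced to the caller as HTTP 422.
    print(exc)
```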
Changes

- `publisher.py`: send flat data with CloudEvent attributes via `publish_metadata` instead of constructing a CloudEvent envelope
- `003_fix_delivery_status_constraint.py`: drop both constraint name variants
- `009_drop_stale_delivery_status_constraint.py`: new migration to drop the stale lowercase constraint (see the sketch below)
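A minimal sketch of the migration fix, assuming Alembic on PostgreSQL; the revision identifiers are placeholders, and `DROP CONSTRAINT IF EXISTS` keeps the drop safe on both fresh and already-patched databases:

```python
"""Drop the stale delivery_status check constraint (sketch of migration 009)."""

from alembic import op

# Placeholder revision identifiers.
revision = "009"
down_revision = "008"


def upgrade() -> None:
    # Drop both naming variants: the short name migration 003 targeted and
    # the table-prefixed name SQLAlchemy actually generated by convention.
    for name in (
        "chk_execution_subscription_status",
        "ck_execution_subscription_chk_execution_subscription_status",
    ):
        op.execute(
            f"ALTER TABLE execution_subscription DROP CONSTRAINT IF EXISTS {name}"
        )


def downgrade() -> None:
    # Intentionally a no-op: the stale constraint should not be recreated.
    pass
```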
Test plan

- Verified the `extra_forbidden` errors on `datacontenttype`/`id` stopped after the fix
- budnotify returns `200 OK` + "Triggered notification successfully" for all messages
- Ran `alembic upgrade head` on a fresh DB to verify the migration chain

🤖 Generated with Claude Code