
fix(budpipeline): align pub/sub publishing with budmicroframe CloudEvent pattern#1311

Merged
dittops merged 5 commits into master from fix/budpipeline-notify-422 on Feb 13, 2026

Conversation

@dittops
Member

@dittops commented Feb 13, 2026

Summary

  • Fix 422 errors from budnotify when budpipeline publishes notification events
  • Align _publish_single_topic() with budmicroframe's DaprService.publish_to_topic() pattern: set data_content_type="application/cloudevents+json", add publish_metadata with CloudEvent fields, enrich payload with source/source_topic/type
  • Fix notification_type enum case from "EVENT" to "event"
  • Add NOTIFY_SERVICE_TOPIC env var to Helm template (matching budcluster/budeval) and .env.sample

Root Cause

budpipeline was publishing with data_content_type="application/json" and no publish_metadata. This caused Dapr to wrap the payload in a CloudEvent envelope, injecting id and datacontenttype fields that budnotify's CloudEventBase(extra="forbid") rejected with 422.
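
The rejection mechanism can be sketched with a minimal Pydantic v2 model. This is an illustrative stand-in, not budnotify's actual CloudEventBase definition; the field names here are assumptions for the demo:

```python
from pydantic import BaseModel, ConfigDict, ValidationError

# Illustrative model: extra="forbid" rejects any field the Dapr envelope
# injects that the schema does not declare.
class CloudEventBase(BaseModel):
    model_config = ConfigDict(extra="forbid")

    source: str
    type: str
    data: dict

payload = {"source": "budpipeline", "type": "notification", "data": {}}
CloudEventBase(**payload)  # accepted

# Dapr's wrapping adds fields like `id` and `datacontenttype` to the envelope
wrapped = {**payload, "id": "abc-123", "datacontenttype": "application/json"}
try:
    CloudEventBase(**wrapped)
except ValidationError as exc:
    print(exc.error_count())  # 2 (one error per undeclared extra field)
```

Any undeclared field in the incoming body produces a validation error, which FastAPI-style request handlers surface to the publisher as HTTP 422.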

Test plan

  • Verified budnotify Dapr sidecar logs show no new 422 errors after deploy
  • Trigger a pipeline execution with subscriber_ids and verify the notification reaches budnotify without a 422
  • Verify existing unit tests pass (pytest tests/test_event_publishing.py)

🤖 Generated with Claude Code

…ent pattern

budnotify was returning 422 on pipeline notifications because Dapr was
wrapping the payload in a CloudEvent envelope (adding `id` and
`datacontenttype` fields), which CloudEventBase rejects via extra="forbid".

Changes:
- Fix notification_type from "EVENT" to "event" (enum case mismatch)
- Switch data_content_type to "application/cloudevents+json"
- Add publish_metadata with cloudevent.id/source/type (budmicroframe pattern)
- Enrich payload with source/source_topic/type fields
- Add NOTIFY_SERVICE_TOPIC to Helm template and .env.sample

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@gemini-code-assist
Contributor

Summary of Changes

Hello @dittops, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses critical notification delivery failures by standardizing how budpipeline publishes events. It ensures compatibility with budnotify by adopting the budmicroframe CloudEvent pattern, preventing 422 errors and improving overall system reliability for event-driven communication.

Highlights

  • Notification Error Fix: Resolved 422 errors encountered by budnotify when receiving events from budpipeline, caused by misaligned CloudEvent publishing.
  • CloudEvent Alignment: Standardized event publishing in _publish_single_topic() to match budmicroframe's CloudEvent pattern, including setting data_content_type to application/cloudevents+json, adding publish_metadata, and enriching the event payload with source, source_topic, and type.
  • Enum Case Correction: Corrected the case of the notification_type enum from uppercase "EVENT" to lowercase "event" for consistency.
  • Configuration Update: Introduced the NOTIFY_SERVICE_TOPIC environment variable in the Helm template and .env.sample for consistent topic configuration across services.


Changelog
  • infra/helm/bud/templates/microservices/budpipeline.yaml
    • Added NOTIFY_SERVICE_TOPIC environment variable to the budpipeline deployment.
  • services/budpipeline/.env.sample
    • Included NOTIFY_SERVICE_TOPIC in the sample environment variables.
  • services/budpipeline/budpipeline/progress/publisher.py
    • Imported uuid4 for generating unique CloudEvent IDs.
    • Modified _build_event_payload to use lowercase "event" for notification_type.
    • Updated _publish_single_topic to enrich event data with source, source_topic, and type.
    • Configured _publish_single_topic to use application/cloudevents+json as data_content_type.
    • Added publish_metadata to _publish_single_topic with CloudEvent specific fields (id, source, type).
  • services/budpipeline/tests/test_event_publishing.py
    • Adjusted test assertions to reflect the notification_type change from "EVENT" to "event".
Activity
  • The author identified and described the root cause of 422 errors in budnotify due to Dapr's CloudEvent wrapping.
  • A test plan was provided, with one item already verified, indicating initial validation of the fix.

@gemini-code-assist bot left a comment
Code Review

This pull request effectively addresses the 422 errors from budnotify by aligning event publishing with the CloudEvent standard. The changes to use application/cloudevents+json and fix the notification_type enum are correct. I've added one suggestion to improve the implementation of _publish_single_topic to ensure it constructs a fully compliant CloudEvent, which will make the solution more robust and aligned with Dapr best practices.

Comment on lines +370 to 394
      # Copy payload to avoid mutating the shared dict across topics
      data = payload.copy()

      # Enrich data matching budmicroframe's DaprService.publish_to_topic() pattern
      data["source"] = settings.name
      data["source_topic"] = topic
      if data.get("type") is None:
          data["type"] = event_type

      # CloudEvent metadata for Dapr (budmicroframe pattern)
      event_id = str(uuid4())
      publish_metadata = {
          "cloudevent.id": event_id,
          "cloudevent.source": settings.name,
          "cloudevent.type": event_type,
      }

      async with DaprClient() as client:
          await client.publish_event(
              pubsub_name=self.pubsub_name,
              topic_name=topic,
    -         data=json.dumps(payload, cls=_DecimalEncoder),
    -         data_content_type="application/json",
    +         data=json.dumps(data, cls=_DecimalEncoder),
    +         data_content_type="application/cloudevents+json",
    +         publish_metadata=publish_metadata,
          )
Severity: high

The current implementation appears to mix two different Dapr publishing patterns. When using data_content_type="application/cloudevents+json", the data parameter should be a complete, valid CloudEvent JSON object. The publish_metadata parameter is intended for when Dapr wraps a non-CloudEvent payload (e.g., with data_content_type="application/json"), so it's redundant and potentially confusing here.

To correctly address the issue of Dapr injecting fields into the payload, we should manually construct the full CloudEvent envelope and place our application payload inside its data field. This ensures we send a valid CloudEvent and have full control over its structure, making the solution more robust and aligned with best practices.

            # To send a compliant CloudEvent and prevent Dapr from modifying the payload,
            # we should construct the full CloudEvent envelope and place our application
            # payload within its 'data' field.
            event_payload = payload.copy()
            event_payload["source_topic"] = topic

            cloud_event = {
                "specversion": "1.0",
                "id": str(uuid4()),
                "source": settings.name,
                "type": event_type,
                "datacontenttype": "application/json",
                "data": event_payload,
            }

            async with DaprClient() as client:
                await client.publish_event(
                    pubsub_name=self.pubsub_name,
                    topic_name=topic,
                    data=json.dumps(cloud_event, cls=_DecimalEncoder),
                    data_content_type="application/cloudevents+json",
                )

dittops and others added 4 commits on February 13, 2026 at 12:31
content.get("result", {}) returns None when the key exists with a None
value, causing AttributeError on .get(). Use `or {}` fallback instead.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
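
A minimal sketch of the bug pattern this commit fixes: dict.get's default only applies when the key is missing, not when it is present with an explicit None value.

```python
# The key "result" exists but maps to None, so the .get() default is ignored.
content = {"result": None}

broken = content.get("result", {})
print(broken)  # None -> broken.get("status") would raise AttributeError

# The `or {}` fallback also covers the present-but-None case.
fixed = content.get("result") or {}
print(fixed.get("status", "unknown"))  # unknown
```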
…ing_event

The persistence_service.mark_step_awaiting_event() was creating its own
DB session, but the RUNNING status update in _continue_pipeline_execution
was only flushed (not committed) in the outer session. The new session
could not see the uncommitted version, causing an optimistic lock conflict
that silently prevented external_workflow_id from being persisted.

This left event-driven steps (like model_add) stuck in RUNNING state with
no external_workflow_id, making them unable to receive completion events.

Fix: Use step_crud (same session) directly instead of persistence_service,
ensuring the version chain stays consistent within a single transaction.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
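
The session-visibility problem described above can be illustrated with plain sqlite3 (a simplified stand-in for the service's SQLAlchemy stack; table and column names are invented for the demo): a write that is only flushed inside one session's open transaction is invisible to a second session opened before the commit.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

outer = sqlite3.connect(path)
outer.execute("CREATE TABLE step (id INTEGER PRIMARY KEY, status TEXT, version INTEGER)")
outer.execute("INSERT INTO step VALUES (1, 'PENDING', 1)")
outer.commit()

# Outer session moves the step to RUNNING but has not committed yet
# (analogous to a flush without commit in the outer DB session).
outer.execute("UPDATE step SET status = 'RUNNING', version = 2 WHERE id = 1")

# A freshly opened session still sees the old, committed row version.
inner = sqlite3.connect(path)
print(inner.execute("SELECT status, version FROM step WHERE id = 1").fetchone())
# ('PENDING', 1): an UPDATE in the inner session conditioned on version = 2
# would match nothing, like the silent optimistic-lock failure described above.
```

This is why routing the update through the same session (step_crud) keeps the version chain consistent: the in-flight RUNNING row is visible inside its own transaction.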
The process_timeout function fetches the execution to get subscriber_ids
for event publishing. Without mocking PipelineExecutionCRUD, the mock
session's async chain returns a coroutine instead of an execution object,
causing "'coroutine' object has no attribute 'subscriber_ids'".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
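
The mocking pitfall can be sketched with unittest.mock; fetch_subscribers is a hypothetical stand-in for the lookup process_timeout performs, not the service's real code:

```python
import asyncio
from unittest.mock import AsyncMock, MagicMock

async def fetch_subscribers(session):
    # Stand-in for the CRUD call that fetches the execution by id.
    result = await session.execute("SELECT * FROM pipeline_executions")
    execution = result.scalar_one()
    return execution.subscriber_ids

async def main():
    # Unconfigured AsyncMock: every attribute is another async mock, so
    # scalar_one() hands back a coroutine object, which has no
    # subscriber_ids attribute.
    try:
        await fetch_subscribers(AsyncMock())
    except AttributeError as exc:
        print(exc)

    # Configured: pin the call chain so scalar_one() synchronously returns
    # a fake execution object, mirroring the PipelineExecutionCRUD mock.
    session = AsyncMock()
    fake_execution = MagicMock(subscriber_ids=["user-1"])
    session.execute.return_value.scalar_one = MagicMock(return_value=fake_execution)
    return await fetch_subscribers(session)

print(asyncio.run(main()))  # ['user-1']
```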
@dittops force-pushed the fix/budpipeline-notify-422 branch from 8af8102 to fa1373b on February 13, 2026 at 22:24
@dittops dittops merged commit 59de84a into master Feb 13, 2026
10 checks passed