
Export workflow instrumentation event types from public API#445

Open
enrico-stauss wants to merge 1 commit into run-llama:main from enrico-stauss:fix/resolve-workflow-outputevent-openinference-warning

Conversation

@enrico-stauss
Contributor

@enrico-stauss enrico-stauss commented Mar 23, 2026

WorkflowStepOutputEvent, WorkflowRunOutputEvent, and SpanCancelledEvent are emitted through the LlamaIndex instrumentation dispatcher during workflow execution. Observability integrations such as openinference-instrumentation-llama-index use singledispatch to handle known event types and log a warning for any type they don't recognise:

  • WARNING - Unhandled event of type WorkflowStepOutputEvent
  • WARNING - Unhandled event of type WorkflowRunOutputEvent

The root fix belongs in the openinference package (add no-op or meaningful handlers for these types). Exporting the classes from workflows.__init__ gives the openinference maintainers a stable public import path:

  • from workflows import WorkflowStepOutputEvent, WorkflowRunOutputEvent, SpanCancelledEvent

so they can register singledispatch handlers without importing from internal implementation modules.
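As a minimal sketch of the pattern described above (the event classes here are hypothetical stand-ins; an actual integration would import the real ones, e.g. `from workflows import WorkflowStepOutputEvent, WorkflowRunOutputEvent`, and the handler names are illustrative, not openinference's API):

```python
from functools import singledispatch


# Hypothetical stand-ins for the instrumentation event classes.
class BaseEvent: ...
class WorkflowStepOutputEvent(BaseEvent): ...
class WorkflowRunOutputEvent(BaseEvent): ...


@singledispatch
def handle_event(event: BaseEvent) -> str:
    # Fallback for unknown types: this is where integrations emit the
    # "Unhandled event of type ..." warning described above.
    return f"WARNING - Unhandled event of type {type(event).__name__}"


@handle_event.register
def _(event: WorkflowStepOutputEvent) -> str:
    # A no-op (or meaningful) handler for the type silences the warning.
    return "handled WorkflowStepOutputEvent"


@handle_event.register
def _(event: WorkflowRunOutputEvent) -> str:
    return "handled WorkflowRunOutputEvent"
```

With the classes exported from a stable public path, registering such handlers no longer requires importing from internal implementation modules.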

@adrianlyjak
Contributor

@enrico-stauss I'm not sure it makes sense to make these events public. At the very least, we carefully curate what's available as a root module import, and these do not belong there (perhaps they could be exported from a separate, more out-of-the-way, but still stable file).

Alternatively, at least on the llama-index-workflows side, we've significantly improved our llama-index-observability-opentelemetry backend, and it now emits detailed event data comparable to what the Arize openinference telemetry adds. You could consider switching to that (Arize supports OpenTelemetry collectors). Note, however, that it may not instrument llama_index-based abstractions the way Arize does.

@enrico-stauss
Contributor Author

I'll give it a try and come back to you, thanks.

@adrianlyjak
Contributor

@enrico-stauss see also Arize-ai/openinference#2908 (comment). The coupling here seems hard to maintain well

@enrico-stauss
Contributor Author

Hi @adrianlyjak
I migrated my application to llama-index-observability-otel, but my finding is that if I do so, I lose all of the conveniences like token counts, LLM completions, and so on. I would have to write a custom span exporter, correct?

I might do that at some point, but for now I've decided to defer the migration and just silence the warning. I get that a situation where a fix spans 2 or 3 repos is not maintainable, but the current alternative is not yet mature (unless I missed something). You may close this PR if you want to.

Cheers
Enrico
