PraisonAI PR #1461 ("feat: implement langextract/langfuse observability follow-ups") merged into main on 2026-04-18 (head SHA a7efb680d33f48955257fbf59f7817bfb166a9f6, closes #1460) ships two user-visible Langfuse observability changes that are not reflected in the existing docs/observability/langfuse.mdx page.
This is the Langfuse twin of the still-open Langextract issue #173 — same architectural fix (_ContextToActionBridge + context_sink()), same trace-quality fix (_extract_llm_response_content), but applied to the Langfuse sink + CLI wiring instead.
Decision: Update existing content (docs/observability/langfuse.mdx), plus a small entry in docs/observability/overview.mdx. Confirmed by:
grep -rn "context_sink\|_ContextToActionBridge\|--observe.*langfuse" docs/ → zero matches in user-facing docs.
grep -rn "_setup_langfuse_observability" docs/ → zero matches.
docs/observability/langfuse.mdx only documents the obs.langfuse() global-OpenAI-instrumentation path; it does not mention praisonai --observe langfuse, LangfuseSink, lifecycle spans, or the trace-content improvement.
docs/observability/overview.mdx line 51 still lists Langfuse with the wrong install command (pip install opentelemetry-sdk opentelemetry-exporter-otlp) — it should be pip install langfuse (already flagged in the now-closed #64, but the overview row was never updated).
What PR #1461 actually changes (SDK ground truth)
Follow-up 1 — Richer llm_response trace content (Core SDK)
praisonaiagents/agent/chat_mixin.py previously emitted response_content=str(final_response), which serialised the entire ChatCompletion(...) object into the trace. Trace UIs (Langfuse, Langextract HTML) showed the verbose Python repr instead of the actual assistant text.
The fix — new helper _extract_llm_response_content(response):
If response.choices[0].message.content exists → return the assistant text.
Else if response.choices[0].message.tool_calls exists → return "[tool_calls: name1, name2]".
Fallback → str(response) (preserves old behaviour for unknown shapes).
Used at chat_mixin.py:592 to populate the llm_response ContextEvent's data["response_content"] field.
User-visible effect: Langfuse spans (and any other trace sink) now show the readable assistant message — not ChatCompletion(id='chatcmpl-...', choices=[Choice(...)], ...).
Follow-up 2 — LangfuseSink context-emitter bridge (Wrapper + CLI)
This is the same architectural gap that was fixed for LangextractSink in #1420. The core runtime emits lifecycle events exclusively via ContextTraceEmitter / ContextTraceSinkProtocol, but LangfuseSink only consumes ActionEvents — so before #1461, praisonai --observe langfuse only captured RouterAgent token-usage and PlanningAgent.plan_created events. Real Agent.start() flows produced almost-empty Langfuse traces.
Three changes ship together:
praisonai/observability/langfuse.py — new _ContextToActionBridge class implementing ContextTraceSinkProtocol. Maps:

| ContextEventType | ActionEventType |
| --- | --- |
| AGENT_START | AGENT_START |
| AGENT_END | AGENT_END |
| TOOL_CALL_START | TOOL_START (carries tool_name, tool_args) |
| TOOL_CALL_END | TOOL_END (carries tool_name, tool_result_summary) |
| LLM_REQUEST | TOOL_START |
| LLM_RESPONSE | TOOL_END (carries response_content — see Follow-up 1) |
| MEMORY_*, KNOWLEDGE_*, etc. | (skipped — not mappable) |
praisonai/observability/langfuse.py — new LangfuseSink.context_sink() returning the bridge.
praisonai/cli/app.py::_setup_langfuse_observability — now also installs ContextTraceEmitter(sink=sink.context_sink(), enabled=True) via set_context_emitter, and registers atexit.register(sink.close) so spans flush even if the user forgets provider.flush().
User-visible effect: running praisonai --observe langfuse <command> now produces full agent_start / agent_end / tool_call_* / llm_* spans in Langfuse, mirroring what users already get from obs.langfuse() (which traces via the langfuse.openai drop-in).
Two paths to enable Langfuse — current docs only cover one
After #1461 there are now two complete, supported paths to wire Langfuse into PraisonAI. The current page only explains Path A.

| | Path A — obs.langfuse() | Path B — praisonai --observe langfuse (NEW) |
| --- | --- | --- |
| Where the user calls it | Python script | CLI flag (also PRAISONAI_OBSERVE=langfuse) |
| Mechanism | Instruments OpenAI client globally via langfuse.openai drop-in (Langfuse v4 SDK) | LangfuseSink (TraceSinkProtocol) + ContextTraceEmitter bridge |
| Spans produced | LLM generation spans | agent_start, agent_end, tool_call_*, llm_* |
| flush() needed? | Yes — provider.flush() | No — atexit registers it (since #1461) |
| Documented? | Yes (docs/observability/langfuse.mdx) | Not yet (this issue) |

Both paths can co-exist: Path A traces the LLM HTTP call; Path B traces the agent's lifecycle around it.
SDK ground truth — files to read before editing the page
Per AGENTS.md §1.1 / §1.3 (SDK-first), the docs author must read these files in the daily-synced source trees before authoring:

| What | Where |
| --- | --- |
| _extract_llm_response_content helper + emit site | praisonaiagents/agent/chat_mixin.py (helper at ~line 460, emit at ~line 592) |
| LangfuseSink + LangfuseSinkConfig dataclass | praisonai/observability/langfuse.py |
| _ContextToActionBridge + LangfuseSink.context_sink() | praisonai/observability/langfuse.py (added by #1461 — will appear after the next daily update_repos.sh run; until then read the PR #1461 diff directly) |
| --observe langfuse CLI wiring | praisonai/cli/app.py::_setup_langfuse_observability (lines 16–40 after #1461) |
| --observe flag declaration | praisonai/cli/app.py (lines 180–214) |
| Context-event protocol | praisonaiagents/trace/context_events.py (ContextTraceSinkProtocol, ContextEvent, ContextEventType) |
| obs.langfuse() factory (already-documented Path A) | praisonaiagents/obs/__init__.py |
| LangfuseProvider.init() (already-documented Path A) | praisonai_tools/observability/providers/langfuse_provider.py |

LangfuseSinkConfig — extract verbatim, do not guess
From @dataclass LangfuseSinkConfig in praisonai/observability/langfuse.py:

| Field | Type | Default |
| --- | --- | --- |
| public_key | str | "" (then os.getenv("LANGFUSE_PUBLIC_KEY", "")) — pk-lf-... |
| secret_key | str | "" (then os.getenv("LANGFUSE_SECRET_KEY", "")) — sk-lf-... |
| host | str | "" (then LANGFUSE_HOST → LANGFUSE_BASE_URL → https://cloud.langfuse.com) |
| flush_at | int | 20 |
| flush_interval | float | 10.0 |
| enabled | bool | True |
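For illustration, the field-by-field env-var fallback behaviour described in this issue can be restated as a dataclass. This is a hypothetical reconstruction, not the verbatim source — the issue itself says the real LangfuseSinkConfig must be extracted from praisonai/observability/langfuse.py:

```python
import os
from dataclasses import dataclass

@dataclass
class LangfuseSinkConfig:
    public_key: str = ""
    secret_key: str = ""
    host: str = ""
    flush_at: int = 20
    flush_interval: float = 10.0
    enabled: bool = True

    def __post_init__(self):
        # Empty fields fall back to environment variables, mirroring the
        # documented resolution order.
        self.public_key = self.public_key or os.getenv("LANGFUSE_PUBLIC_KEY", "")
        self.secret_key = self.secret_key or os.getenv("LANGFUSE_SECRET_KEY", "")
        self.host = (
            self.host
            or os.getenv("LANGFUSE_HOST", "")
            or os.getenv("LANGFUSE_BASE_URL", "")
            or "https://cloud.langfuse.com"
        )
```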
Files to create / modify
1. UPDATE docs/observability/langfuse.mdx (primary change)
Add two new top-level sections (plus minor wording tweaks). Existing obs.langfuse() content stays — it's correct and covers Path A.
New section: ## CLI Observability — `--observe langfuse`
Insert after the existing ## CLI Commands section. Document Path B end-to-end:
Hero one-liner: praisonai --observe langfuse run agents.yaml
Env-var alias: PRAISONAI_OBSERVE=langfuse
Required env vars: LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_HOST (or LANGFUSE_BASE_URL) — same table as the existing Environment Variables section can be referenced.
What spans appear in Langfuse (mermaid sequence diagram showing agent_start → llm_request → llm_response → tool_call_start → tool_call_end → agent_end).
A small <Note>: as of PR #1461, atexit auto-closes the sink — no manual flush required for CLI runs.
Cross-link to /observability/custom-tracing for the underlying ContextTraceSinkProtocol.
New section: ## Programmatic — LangfuseSink + context bridge
Insert after the new CLI section. Agent-centric example first (per AGENTS.md §1.1.9):
```python
from praisonaiagents import Agent
from praisonaiagents.trace.protocol import TraceEmitter, set_default_emitter
from praisonaiagents.trace.context_events import ContextTraceEmitter, set_context_emitter
from praisonai.observability import LangfuseSink, LangfuseSinkConfig
import atexit

sink = LangfuseSink(LangfuseSinkConfig())  # reads env vars

# Action-level events (RouterAgent / PlanningAgent)
set_default_emitter(TraceEmitter(sink=sink, enabled=True))

# Context-level events (Agent.start lifecycle, tool calls, LLM I/O) — required for full coverage
set_context_emitter(ContextTraceEmitter(sink=sink.context_sink(), enabled=True))

atexit.register(sink.close)

agent = Agent(name="Writer", instructions="Write a haiku about code.")
agent.start("Write a haiku about code.")
```
⚠️ The set_context_emitter(... sink=sink.context_sink() ...) call is required for typical single-agent flows. Without it, only RouterAgent token-usage and PlanningAgent.plan_created events appear in Langfuse — Agent.start() lifecycle is silent. (This is the exact gap fixed by #1461.)
Follow with the LangfuseSinkConfig table above.
New <AccordionGroup> entry: "Trace content quality (PR #1461)"
Inside the existing ## Best Practices section, add an accordion explaining that as of #1461, llm_response spans contain the assistant message text (or [tool_calls: ...] summary), not the raw ChatCompletion(...) repr — so the Langfuse "Output" panel is now human-readable.
Minor tweaks
Reword the existing ## How It Works section to make clear it describes Path A specifically; add a one-line note pointing to the new CLI / programmatic-sink sections for Path B.
Add an "Always Flush Before Exit" caveat: still best practice for obs.langfuse() (Path A), but --observe langfuse (Path B) auto-registers atexit.close since #1461.
Add a "Path comparison" mermaid diagram near the top of the page (the table above, in mermaid form) so readers know which path to pick.
2. UPDATE docs/observability/overview.mdx
Two small fixes on the supported-providers table (line ~51):
Install column: pip install opentelemetry-sdk opentelemetry-exporter-otlp → pip install langfuse (residue from the pre-v4 era; #64 fixed the Langfuse page itself but missed the overview row).
Add a footnote (or extra column) noting Langfuse can be enabled via praisonai --observe langfuse as well as obs.langfuse().
3. docs.json — no change required
The Langfuse page is already registered. CLI flag docs live inside the existing page.
Placement rules (AGENTS.md §1.8)
✅ docs/observability/ — agent-writable. Both files in scope live here.
❌ Do not touch docs/concepts/ — observability is a feature integration, not a core concept.
❌ Do not touch docs/js/ or docs/rust/ — LangfuseSink + the bridge are Python-only; those trees are auto-generated by the parity system.
Acceptance checklist (per AGENTS.md §9)
docs/observability/langfuse.mdx updated with the two new sections (CLI --observe langfuse; programmatic LangfuseSink + bridge).
Path A vs Path B comparison present (table or mermaid) so users can choose.
Agent-centric code example leads the programmatic section (AGENTS.md §1.1.9), and includes the set_context_emitter(... sink=sink.context_sink() ...) call.
Every LangfuseSinkConfig field documented with the exact type + default from source.
_extract_llm_response_content improvement (Follow-up 1) called out in Best Practices accordion ("trace output is now readable").
atexit auto-close behaviour noted; explicit provider.flush() still recommended for Path A.
All Mermaid diagrams use the AGENTS.md §3.1 colour scheme.
<Steps>, <AccordionGroup>, <CardGroup> components used per template (AGENTS.md §2).
docs/observability/overview.mdx Langfuse install column fixed to pip install langfuse.
No edits to docs/concepts/, docs/js/, or docs/rust/.
All code examples are runnable copy-paste (AGENTS.md §5.1) — including the imports.
Cross-link added between docs/observability/langfuse.mdx (Path B section) and docs/observability/custom-tracing.mdx (the underlying protocol).
References
PraisonAI PR #1461 — merged 2026-04-18, head SHA a7efb680d33f48955257fbf59f7817bfb166a9f6 (the source change driving this issue).
PraisonAI Issue #1460 — the spec PR #1461 implements (closed as completed).
PraisonAI PR #1420 — the prior Langextract context-bridge fix (identical pattern; reference for prose).