Context
Narrow follow-up to #1421 / #1420. PR #1424 attempted to deliver follow-ups 1–3 in one go but mixed observability changes with a duplicated langextract_* tool that belongs in praisonai-tools. Follow-up 3 has now shipped separately in PraisonAI-Tools#25 (merged, 16/16 tests green). PR #1424 is being closed in favour of this cleanly scoped ticket.
Scope – observability only, 2 follow-ups
Follow-up 1 – Richer llm_response trace content (Core SDK)
praisonaiagents/agent/chat_mixin.py currently emits response_content=str(final_response), which serialises the whole ChatCompletion object. The HTML final_output field then shows the verbose repr instead of the actual assistant text.
Required change – add a helper and use it when emitting the event:
```python
# Requires `from typing import Optional` at module top.
def _extract_llm_response_content(self, response) -> Optional[str]:
    """Return assistant message text, a tool-call summary, or str(response) as fallback."""
    if not response:
        return None
    try:
        if hasattr(response, "choices") and response.choices:
            choice = response.choices[0]
            msg = getattr(choice, "message", None)
            if msg is not None:
                content = getattr(msg, "content", None)
                if content:
                    return content
                tool_calls = getattr(msg, "tool_calls", None) or []
                if tool_calls:
                    names = [getattr(tc.function, "name", "?") for tc in tool_calls]
                    return f"[tool_calls: {', '.join(names)}]"
    except (AttributeError, IndexError, TypeError):
        pass
    return str(response)
```
Then pass self._extract_llm_response_content(final_response) to the emitter instead of str(final_response).
Acceptance: real agentic Agent.start() run produces a trace where final_output contains the assistant message text, not a ChatCompletion(...) repr.
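As a quick sanity check of the proposed helper, the sketch below exercises it as a free function (self dropped) against SimpleNamespace stand-ins for real ChatCompletion objects; the response shapes are illustrative, not the SDK's actual types:

```python
from types import SimpleNamespace
from typing import Optional

def extract_llm_response_content(response) -> Optional[str]:
    """Standalone copy of the proposed helper for demonstration."""
    if not response:
        return None
    try:
        if hasattr(response, "choices") and response.choices:
            msg = getattr(response.choices[0], "message", None)
            if msg is not None:
                content = getattr(msg, "content", None)
                if content:
                    return content
                tool_calls = getattr(msg, "tool_calls", None) or []
                if tool_calls:
                    names = [getattr(tc.function, "name", "?") for tc in tool_calls]
                    return f"[tool_calls: {', '.join(names)}]"
    except (AttributeError, IndexError, TypeError):
        pass
    return str(response)

# Plain assistant text is returned as-is, not the object repr.
text_resp = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="Hello!", tool_calls=None))]
)
print(extract_llm_response_content(text_resp))  # Hello!

# Tool-call-only responses collapse to a readable summary.
tc = SimpleNamespace(function=SimpleNamespace(name="search_web"))
tool_resp = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content=None, tool_calls=[tc]))]
)
print(extract_llm_response_content(tool_resp))  # [tool_calls: search_web]
```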
Follow-up 2 – LangfuseSink context-emitter bridge (Wrapper)
Identical architectural gap to the one fixed for LangextractSink in #1420: the core runtime emits lifecycle events exclusively via ContextTraceEmitter / ContextTraceSinkProtocol, but LangfuseSink only consumes ActionEvents, so today Langfuse captures almost nothing in real agent flows.
Required changes in praisonai/praisonai/observability/langfuse.py:
- Add _ContextToActionBridge, which implements ContextTraceSinkProtocol and forwards each ContextEvent as an ActionEvent into the existing LangfuseSink.
- Add LangfuseSink.context_sink(), returning the bridge.
Required changes in praisonai/praisonai/cli/app.py::_setup_langfuse_observability:
- Install a ContextTraceEmitter(sink=sink.context_sink(), enabled=True) via set_context_emitter (mirroring the pattern already used for LangextractSink).
- Register an atexit close for the sink.
Acceptance: running Agent.start() with Langfuse observability enabled produces agent_start / agent_end / tool_call_* / llm_* spans in Langfuse (not just router/planning events).
Explicitly out of scope
- ❌ Any langextract_* tool code in praisonaiagents/tools/ – it lives in praisonai-tools (PraisonAI-Tools#25, already merged) per AGENTS.md §2.2 package hierarchy.
- ❌ Any changes to TOOL_MAPPINGS in praisonaiagents/tools/__init__.py for langextract.
Acceptance criteria
- A real agentic run produces a trace whose final_output contains the assistant message text.
- tests/unit/test_langfuse_sink.py covers the _ContextToActionBridge mapping for all ContextEvent types.
- The real-agent test run (PRAISONAI_ALLOW_NETWORK=1 OPENAI_API_KEY=... pytest -m real_agent) passes locally.
- No new code under praisonaiagents/tools/langextract*.
- No changes to praisonaiagents/tools/__init__.py::TOOL_MAPPINGS for langextract.
Reference – working implementation
A working implementation of both follow-ups already exists on the closed PR #1424 branch. The new PR should cherry-pick only these four files and ignore the rest:
src/praisonai-agents/praisonaiagents/agent/chat_mixin.py
src/praisonai/praisonai/cli/app.py
src/praisonai/praisonai/observability/langfuse.py
src/praisonai/tests/unit/test_langfuse_sink.py
References
@claude please pick this up and open a new PR with only the 4 files listed above.