Langfuse not capturing LangChain LLM calls inside Celery tasks #11715
Replies: 1 comment
The issue you're experiencing with Langfuse not capturing LangChain LLM calls inside Celery tasks is likely related to context propagation in multi-process environments. Here are the key considerations:

**Upgrade to SDK v3**

You're using Langfuse Python SDK v2.48, but the documentation strongly recommends upgrading to v3, which has significant architectural improvements for handling async and multi-process scenarios (1). The v3 SDK is built on OpenTelemetry and provides better context propagation, which is essential for worker environments like Celery.

**Ensure Proper Flushing**

In short-lived or worker environments like Celery tasks, you must explicitly flush events before the task completes. The SDK queues events in the background, and if the worker process terminates before flushing, traces will be lost (1).

```python
from langfuse import get_client

# At the end of your Celery task
get_client().flush()
```

**Initialize Handler Per Task**

Based on the documentation, you should initialize the `CallbackHandler` inside each Celery task rather than once at module import time:

```python
from langfuse.langchain import CallbackHandler

# Inside your Celery task
langfuse_handler = CallbackHandler()
response = chain.invoke(
    {"topic": "cats"},
    config={
        "callbacks": [langfuse_handler],
        "metadata": {
            "langfuse_user_id": "random-user",
            "langfuse_session_id": "random-session",
            "langfuse_tags": ["random-tag-1", "random-tag-2"]
        }
    }
)
```

**Context Propagation Issues**

The GitHub issue discussion indicates that there have been known problems with context propagation in async and multi-process environments (3). The v3 SDK addresses many of these issues through its OpenTelemetry foundation, which provides robust context propagation across different execution contexts (1).

**Recommended Solution**
```python
from langfuse import get_client
from langfuse.langchain import CallbackHandler

@celery_app.task
def my_task(payload):
    langfuse_handler = CallbackHandler()
    response = chain.invoke(
        payload,
        config={"callbacks": [langfuse_handler]}
    )
    # Critical: flush before task ends
    get_client().flush()
    return response
```

If upgrading to v3 isn't immediately possible, ensure you're calling `flush()` on your Langfuse client at the end of every task.

📚 Sources:
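To see why the explicit flush matters, here is a small self-contained sketch (plain Python, no Langfuse or Celery required) of the background-queue pattern such SDKs use: events are exported by a daemon thread, so a worker process that exits without flushing can drop events that are still queued. The `EventQueue` class and its names are illustrative stand-ins, not Langfuse's actual internals.

```python
import queue
import threading
import time

class EventQueue:
    """Minimal stand-in for an SDK's background event exporter."""
    def __init__(self):
        self._q = queue.Queue()
        self.exported = []
        # Daemon thread: it is killed when the main process exits,
        # taking any still-queued events with it.
        self._worker = threading.Thread(target=self._drain_loop, daemon=True)
        self._worker.start()

    def enqueue(self, event):
        self._q.put(event)

    def _drain_loop(self):
        while True:
            event = self._q.get()
            time.sleep(0.05)  # simulate network latency to the backend
            self.exported.append(event)
            self._q.task_done()

    def flush(self):
        self._q.join()  # block until every queued event has been exported

eq = EventQueue()
eq.enqueue({"trace": "celery-task-1"})
# Without flush(), a worker process could exit here while the event is
# still sitting in the queue -- the trace would never reach the server.
eq.flush()
print(len(eq.exported))  # 1
```

The same reasoning applies to a Celery worker: calling `flush()` at the end of the task blocks until the queue is drained, guaranteeing the trace left the process.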
Have another question? Just tag @inkeep.
Describe your question
I’m facing an issue where Langfuse does not capture LangChain LLM calls when they run inside Celery workers, even though the same code works perfectly outside Celery.
Setup
What works
Calling chains directly from Django views using:
```python
chain.invoke(
    payload,
    config={
        "run_name": name,
        "callbacks": [langfuse_handler],
        "metadata": {"langfuse_user_id": user_name},
    }
)
```
Langfuse captures the trace, input/output, tokens, and latency correctly.
What fails
The same chain calls inside a Celery task:
What I’ve already tried
Langfuse Cloud or Self-Hosted?
Self-Hosted
If Self-Hosted
v2.48. Willing to upgrade to 3.x if required.
If Langfuse Cloud
No response
SDK and integration versions
No response
Pre-Submission Checklist