Replies: 3 comments
---
It looks like you might be missing a crucial step to ensure that all events are flushed before your application exits, especially in short-lived applications. You need to call `langfuse_handler.flush()` before the process terminates:

```python
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler
from llama_index.llms.openai import OpenAI

langfuse_handler = LlamaIndexCallbackHandler(user_id="test")
callback_manager = CallbackManager([langfuse_handler])

llm = OpenAI(model="gpt-4o", api_key=OPENAI_API_KEY, callback_manager=callback_manager)

# No events are logged in LangFuse after the completion
result = llm.complete("Hello world")
print(result)

# Ensure all events are flushed before the application exits
langfuse_handler.flush()
```

Additionally, ensure that your …

By following these steps, you should be able to ensure that LlamaIndex events are logged correctly in LangFuse [1][2][3].
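If you cannot guarantee that `flush()` runs on every exit path, one option is to register it with `atexit` so buffered events are still sent on normal interpreter shutdown. The sketch below is self-contained: `BufferingHandler` is a hypothetical stand-in for the real `LlamaIndexCallbackHandler` (whose `flush()` drains a background queue), so the pattern can be demonstrated without Langfuse credentials:

```python
import atexit

class BufferingHandler:
    """Hypothetical stand-in for LlamaIndexCallbackHandler: queues events
    and sends them to the backend only when flush() is called."""

    def __init__(self):
        self.queue = []
        self.sent = []

    def record(self, event):
        self.queue.append(event)

    def flush(self):
        # The real handler blocks here until its background queue is drained.
        self.sent.extend(self.queue)
        self.queue.clear()

handler = BufferingHandler()
# Guarantee a final flush on normal interpreter exit.
atexit.register(handler.flush)

handler.record("completion")
handler.flush()  # explicit flush; the atexit hook remains as a safety net
print(handler.sent)  # ['completion']
```

With the real handler, `atexit.register(langfuse_handler.flush)` placed right after the handler is constructed serves the same purpose in scripts that may exit on several paths.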
---
Hi @adieyal, do you run this locally (script / web server) or on a long-running instance? Depending on how you run Langfuse, you need to manually flush as mentioned above. Have a look here for more hints on how to debug this: https://langfuse.com/faq/all/missing-traces
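For the debugging step, one low-effort way to surface the SDK's internal activity is Python's standard logging. This sketch assumes (as is common for Python SDKs, but not confirmed in this thread) that Langfuse emits its debug output on a logger named `"langfuse"`; enable it before the callback handler is created:

```python
import logging

# Show debug output globally, including from the (assumed) "langfuse" logger.
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("langfuse").setLevel(logging.DEBUG)

print(logging.getLogger("langfuse").level)  # 10 (logging.DEBUG)
```

Messages about the event queue (for instance, how many items each batch flushes) should then appear on stderr, which helps distinguish "events never reach the handler" from "events are queued but never sent".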
---
Thanks for the quick response. I had tried flushing previously, but it did not have any effect. When creating the handler with debug enabled, I get a stream of debug messages, but at no time is anything added to the queue. I have also looked through the missing-traces debug advice; I don't think there is anything there that I haven't already tried.

The fact that I can manually trigger a trace suggests that my configuration of the Langfuse API is correct. Given the messages about 0 items in the queue, it seems like Langfuse is not being notified of the events by LlamaIndex. In case there was some issue with the completion event, I also tried the chat API, with the same result.
---
I've read through the documentation and FAQs, but I can't figure out why I am not managing to get LangFuse to log LlamaIndex events. Here is an example session that shows that my LangFuse configuration is correctly set up while events are not being recorded.
I have also tried the same procedure using the global LlamaIndex `Settings` object. The `observe` decorator also works, but passing the handler from the `langfuse_context` to the CallbackManager only records the observe logs and not the LlamaIndex events. Here are my library versions:
Any advice would be appreciated.