Issue with Langfuse tracing. All traces don't seem to show up. #14899
-
@dosu Could you make changes to the code to make sure the trace works?
-
The issue might be that the Langfuse callback handler is not attached to the callback manager when your methods run. Additionally, ensure that you have set the global handler for Langfuse as described in the Langfuse integration documentation. Here is how you can modify your code.

At the top level, above the class:

```python
from llama_index.core import Settings, set_global_handler
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler

# Set the global handler for Langfuse
set_global_handler("langfuse")

langfuse_callback_handler = LlamaIndexCallbackHandler(
    public_key="pk-...",
    secret_key="sk-.....",
    host="https://cloud.langfuse.com",
)

Settings.callback_manager = CallbackManager([langfuse_callback_handler])
```

Inside the class, decorate the functions with `trace_method`:

```python
import traceback

from llama_index.core import QueryBundle, Settings
from llama_index.core.callbacks import CallbackManager
from llama_index.core.callbacks.utils import trace_method
from llama_index.core.retrievers import QueryFusionRetriever
from llama_index.retrievers.bm25 import BM25Retriever
from langfuse.decorators import langfuse_context

class YourClass:
    @trace_method("process_streamlit_fetch_query_results")
    def process_streamlit_fetch_query_results(self, query="", verbose=False):
        try:
            # Bind the handler for the current trace context
            langfuse_handler = langfuse_context.get_current_llama_index_handler()
            Settings.callback_manager = CallbackManager([langfuse_handler])

            bm25_retriever = BM25Retriever.from_defaults(
                docstore=self.docstore, similarity_top_k=self.similarity_top_k
            )
            fusion_retriever = QueryFusionRetriever(
                [self.retriever, bm25_retriever],
                similarity_top_k=self.similarity_top_k,
                num_queries=1,  # set this to 1 to disable query generation
                mode="reciprocal_rerank",
                use_async=True,
                verbose=verbose,
            )
            query_bundle = QueryBundle(query_str=query)
            retrieved_nodes = fusion_retriever.retrieve(query_bundle)
            langfuse_callback_handler.flush()

            # Re-bind the handler before the recency post-processing step
            langfuse_handler = langfuse_context.get_current_llama_index_handler()
            Settings.callback_manager = CallbackManager([langfuse_handler])
            recency_nodes = self.recency_postprocessor.postprocess_nodes(
                retrieved_nodes, query_bundle=query_bundle
            )
            langfuse_callback_handler.flush()

            # Re-bind the handler before the reranking step
            langfuse_handler = langfuse_context.get_current_llama_index_handler()
            Settings.callback_manager = CallbackManager([langfuse_handler])
            rerank_nodes = self.postprocessor.postprocess_nodes(
                nodes=recency_nodes, query_bundle=query_bundle
            )
            langfuse_callback_handler.flush()

            return rerank_nodes
        except Exception as e:
            traceback.print_exc()
            raise Exception(f"An error occurred retrieving: {e}")
    @trace_method("process_streamlit_retrieved_nodes_using_llm")
    def process_streamlit_retrieved_nodes_using_llm(self, query="", reranked_nodes=[]):
        try:
            langfuse_handler = langfuse_context.get_current_llama_index_handler()
            Settings.callback_manager = CallbackManager([langfuse_handler])

            response = self.response_synthesizer.synthesize(
                query=query, nodes=reranked_nodes
            )
            langfuse_callback_handler.flush()

            return response
        except Exception as e:
            traceback.print_exc()
            raise Exception(f"An error occurred generating response from LLM: {e}")
```

This should ensure that the methods are properly traced and the traces show up as expected. Additionally, make sure you have set your Langfuse environment variables (public key, secret key, and host).
-
My code is defined as something like the following.
At the top level, above the class.
Then, inside the class:
I have two functions that I want to enable tracing for:
This does not work.
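Roughly, the shape is the following (a simplified sketch; the class name and internals are placeholders taken from the answer above, not the exact original code):

```python
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler

# Top-level Langfuse setup, above the class
langfuse_callback_handler = LlamaIndexCallbackHandler(
    public_key="pk-...",
    secret_key="sk-.....",
    host="https://cloud.langfuse.com",
)
Settings.callback_manager = CallbackManager([langfuse_callback_handler])


class YourClass:  # placeholder name
    def process_streamlit_fetch_query_results(self, query="", verbose=False):
        # retrieval + post-processing, as in the answer above
        ...

    def process_streamlit_retrieved_nodes_using_llm(self, query="", reranked_nodes=[]):
        # response synthesis, as in the answer above
        ...
```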