Does Langfuse tracing work in Vercel edge functions? #4389
Has anyone gotten Langfuse working in Vercel "edge" functions? I have a Next.js app hosted on Vercel using the "ai" SDK. I followed the guide here to set up Langfuse tracing using OpenTelemetry: https://langfuse.com/docs/integrations/vercel-ai-sdk

I see traces from my dev environment but none from my production app. After debugging for a while, I disabled the "edge" runtime in my API route and production traces finally appeared. With "edge", I don't see any issues in the Vercel build or runtime logs; traces just never appear in Langfuse. Is this expected?
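For concreteness, the workaround described above amounts to toggling the route segment's runtime export (a sketch; the route path is illustrative):

```typescript
// app/api/chat/route.ts — illustrative path; only the runtime export matters here.

// export const runtime = 'edge';  // with this, traces never reached Langfuse
export const runtime = 'nodejs';   // with this, production traces appeared
```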
Hi @holdenmatt! Could you please enable debug logs and share whether you see any logs or spans exported in your edge runtime? Also, could you please pass the debug flag when constructing the exporter: `new LangfuseExporter({ debug: true })`?
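For reference, enabling the debug flag looks like this (a sketch based on the Langfuse Vercel AI SDK guide linked above; the service name and file location are illustrative):

```typescript
// instrumentation.ts — illustrative setup following the Vercel AI SDK guide.
import { registerOTel } from '@vercel/otel';
import { LangfuseExporter } from 'langfuse-vercel';

export function register() {
  registerOTel({
    serviceName: 'my-app', // illustrative name
    // debug: true makes the exporter log what it exports, which helps
    // confirm whether spans are being produced at all in the edge runtime.
    traceExporter: new LangfuseExporter({ debug: true }),
  });
}
```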
I'm running into the same issue. I've enabled debug logs in my LangfuseExporter and I can see it in my logs, yet nothing appears in my dashboard. The exact same setup works locally.

EDIT: For me it's not working in the node.js runtime on Vercel either :(
@RobertBroersma Can you please verify that the environment variables for the Langfuse host, secret key, and public key are set correctly in your Vercel project and available in your edge runtime, by logging them out (and rotating them afterwards)?
That the integration works on neither the edge nor the node runtime suggests this might rather be an issue in your setup. If everything is indeed set up correctly, could you also share a redacted version of your debug logs with me, either here or at hassieb at langfuse .com?
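A quick way to run the env-var check suggested above without printing secret values (a sketch; the helper name is hypothetical, and the variable names follow the Langfuse docs — adjust if your project uses different ones):

```typescript
// Hypothetical diagnostic helper: report which Langfuse env vars are visible
// in the current runtime, without logging the secret key's value.
export function langfuseEnvStatus(env: Record<string, string | undefined>) {
  return {
    hostSet: Boolean(env.LANGFUSE_HOST),
    publicKeySet: Boolean(env.LANGFUSE_PUBLIC_KEY),
    secretKeySet: Boolean(env.LANGFUSE_SECRET_KEY),
  };
}

// In your route handler (rotate keys afterwards if you log actual values):
// console.log(langfuseEnvStatus(process.env));
```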