From 7496fef6d72f5a04098de76a1b9142353e832d51 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Mon, 16 Dec 2024 16:46:05 +0000 Subject: [PATCH 01/30] docs: langfuse on spcaes guide and gradio example --- docs/hub/spaces-sdks-docker-langfuse.md | 104 ++++++++++++++++++++++++ 1 file changed, 104 insertions(+) create mode 100644 docs/hub/spaces-sdks-docker-langfuse.md diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md new file mode 100644 index 000000000..cf057bb87 --- /dev/null +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -0,0 +1,104 @@ +# Langfuse on Spaces + +[Langfuse](https://langfuse.com) is an open-source LLM observability platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. With Langfuse, you can capture detailed traces of your applications, manage prompts, evaluate outputs, and more—all in one place. + +## What is Langfuse? + +Langfuse provides tools to monitor and understand the internal states of your large language model (LLM) applications. It enables developers to track LLM inference, embedding retrieval, API usage, and other interactions, making it easier to pinpoint problems and improve application performance. + +Key features of Langfuse include LLM tracing to capture the full context of your application's execution flow, prompt management for centralized and collaborative prompt iteration, evaluation metrics to assess output quality, dataset creation for testing and benchmarking, and a playground to experiment with prompts and model configurations. + +## Why LLM Observability? + +As LLMs become more prevalent, understanding their behavior and performance is crucial. LLM observability refers to monitoring and understanding the internal states of an LLM application through its outputs. This is essential for addressing challenges such as complex control flows, non-deterministic outputs, and varied user intents. 
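To make these challenges concrete: an observability trace is essentially a tree of named, timed spans around each step of a chained workflow. The following is a stdlib-only toy sketch of that idea (not the Langfuse SDK; all names are illustrative):

```python
import time
from contextlib import contextmanager

# Minimal illustration of what an LLM trace captures: named, timed spans
# around each step of a chained workflow. Langfuse records the same shape
# of data, plus inputs, outputs, model parameters, and costs.
trace = []

@contextmanager
def span(name):
    start = time.perf_counter()
    try:
        yield
    finally:
        trace.append({"name": name, "ms": (time.perf_counter() - start) * 1000})

def answer(question):
    with span("retrieval"):
        docs = [f"doc about {question}"]  # stand-in for an embedding lookup
    with span("llm-call"):
        reply = f"answer based on {len(docs)} document(s)"  # stand-in for a model call
    return reply

with span("trace:answer"):
    result = answer("why observability?")

# Spans are recorded innermost-first as they finish:
print([s["name"] for s in trace])  # ['retrieval', 'llm-call', 'trace:answer']
```

In a real trace, each span would also carry the prompt, the model output, and token counts, which is what makes debugging a chained call possible after the fact.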
+ +Building LLM applications involves dealing with intricate workflows that rely on repeated or chained calls, making debugging challenging. The non-deterministic nature of LLM outputs adds complexity to consistent quality assessment, and varied user inputs require deep understanding to improve user experience. + +Implementing LLM observability helps in debugging complex workflows, evaluating output quality over time, and analyzing user behavior. By gaining insights into your application's performance, you can enhance reliability and user satisfaction. + +## Deploy Langfuse on Spaces + +You can deploy Langfuse on Hugging Face Spaces effortlessly and start using it within minutes. + +### Steps to Deploy Langfuse: + +1. **Open the Langfuse Template Space:** + + Click the button below to create your own Langfuse Space: + + [![Open Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/deploy-to-spaces-lg.svg)](https://huggingface.co/spaces/langfuse/langfuse-template-space) + +4. **Open the Langfuse Instance:** + + - Click on the **"Open in new tab"** button located. + +5. **Authenticate with Hugging Face OAuth:** + + - On the Langfuse login page, click on **"Sign in with Hugging Face"**. + - Grant the necessary permissions when prompted. + +6. **Start Using Langfuse:** + + After authentication, you will have a fully functioning Langfuse instance running on Hugging Face Spaces. + +## Get Started with Langfuse + +Now that you have Langfuse running, you can begin integrating it with your LLM applications. + +### 1. Create a New Project + +Create a new organization and project in Langfuse. + +### 2. Generate API Credentials + +Navigate to **Project Settings**, and under **API Keys**, click on **"Create New Key"**. Copy the **Public Key** and **Secret Key**; you'll need them to authenticate when sending data to Langfuse. + +### 3. 
Create a Sample Gradio Chat Application + +To create a sample Gradio chat application in Hugging Face Spaces, follow these steps: + +1. **Set Up Your Space:** + + - Navigate to Hugging Face Spaces and create a new Space. + - Choose the appropriate template or start with a blank Space. + +2. **Add the Application Code:** + + - Create a file named `app.py` in your Space. + - Copy the application code from [app.py](docs/hub/app.py) and paste it into your `app.py` file in the Space. + +3. **Define Dependencies:** + + - Create a `requirements.txt` file in your Space. + - List all necessary dependencies for your application. For example: + ``` + gradio + langfuse + openai + ``` + +4. **Launch the Application:** + + - Once the `app.py` and `requirements.txt` files are set up, start your Space. + - The application will launch, and you can interact with the Gradio chat interface. + +5. **View Example Traces in Langfuse:** + + - After starting the application, navigate to your Langfuse dashboard. + - Go to the **Traces** section to view the example traces generated by your Gradio chat application. + +By following these steps, you can quickly set up and run a Gradio chat application in Hugging Face Spaces and observe its traces in Langfuse. + +### 4. View Traces in Langfuse Dashboard + +Open your Langfuse dashboard, navigate to **Traces** to see the recorded traces from your application, and use the observability tools to analyze and debug your LLM applications. + +For detailed instructions and advanced features, refer to the [Langfuse Get Started Guide](https://langfuse.com/docs/get-started). + +## Feedback and Support + +We value your feedback and are here to help if you have any questions. 
+ +- **Join Our Community:** Engage with us on [Discord](https://discord.gg/langfuse) or via [GitHub Discussions](https://github.com/langfuse/langfuse/discussions) +- **Report Issues:** Submit issues or feature requests on our [GitHub Issues](https://github.com/langfuse/langfuse/issues) page +- **Contact Us:** Reach out via our [Support Page](https://langfuse.com/support) or email us at [support@langfuse.com](mailto:support@langfuse.com) From 0fffbdb69a934b873bdbe33781c52ec23d6163c5 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Tue, 17 Dec 2024 08:53:42 +0000 Subject: [PATCH 02/30] edit toctree --- docs/hub/_toctree.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/hub/_toctree.yml b/docs/hub/_toctree.yml index f47284e8c..77585c752 100644 --- a/docs/hub/_toctree.yml +++ b/docs/hub/_toctree.yml @@ -285,6 +285,8 @@ title: Evidence on Spaces - local: spaces-sdks-docker-marimo title: marimo on Spaces + - local: spaces-sdks-docker-langfuse + title: Langfuse on Spaces - local: spaces-embed title: Embed your Space - local: spaces-run-with-docker From fcc748598b1d6ed3ec4ed8a8513da01efd4a55db Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Tue, 17 Dec 2024 08:54:54 +0000 Subject: [PATCH 03/30] text edit Co-authored-by: Merve Noyan --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index cf057bb87..1eccb712b 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -12,7 +12,7 @@ Key features of Langfuse include LLM tracing to capture the full context of your As LLMs become more prevalent, understanding their behavior and performance is crucial. LLM observability refers to monitoring and understanding the internal states of an LLM application through its outputs. 
This is essential for addressing challenges such as complex control flows, non-deterministic outputs, and varied user intents. -Building LLM applications involves dealing with intricate workflows that rely on repeated or chained calls, making debugging challenging. The non-deterministic nature of LLM outputs adds complexity to consistent quality assessment, and varied user inputs require deep understanding to improve user experience. +Building LLM applications involves intricate workflows with repeated or chained calls, making debugging challenging. The non-deterministic nature of LLM outputs adds complexity to consistent quality assessment, and varied user inputs require deep understanding to improve user experience. Implementing LLM observability helps in debugging complex workflows, evaluating output quality over time, and analyzing user behavior. By gaining insights into your application's performance, you can enhance reliability and user satisfaction. From 82e39723d219cbd7a0377f06047c085ee7a98f51 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Thu, 19 Dec 2024 12:53:18 +0100 Subject: [PATCH 04/30] edit troubleshooting part --- docs/hub/spaces-sdks-docker-langfuse.md | 72 +++++++------------------ 1 file changed, 19 insertions(+), 53 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 1eccb712b..28f147501 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -1,9 +1,9 @@ # Langfuse on Spaces -[Langfuse](https://langfuse.com) is an open-source LLM observability platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. With Langfuse, you can capture detailed traces of your applications, manage prompts, evaluate outputs, and more—all in one place. - ## What is Langfuse? 
+[Langfuse](https://langfuse.com) is an open-source LLM observability platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. With Langfuse, you can capture detailed traces of your applications, manage prompts, evaluate outputs, and more—all in one place. + Langfuse provides tools to monitor and understand the internal states of your large language model (LLM) applications. It enables developers to track LLM inference, embedding retrieval, API usage, and other interactions, making it easier to pinpoint problems and improve application performance. Key features of Langfuse include LLM tracing to capture the full context of your application's execution flow, prompt management for centralized and collaborative prompt iteration, evaluation metrics to assess output quality, dataset creation for testing and benchmarking, and a playground to experiment with prompts and model configurations. @@ -14,32 +14,13 @@ As LLMs become more prevalent, understanding their behavior and performance is c Building LLM applications involves intricate workflows with repeated or chained calls, making debugging challenging. The non-deterministic nature of LLM outputs adds complexity to consistent quality assessment, and varied user inputs require deep understanding to improve user experience. -Implementing LLM observability helps in debugging complex workflows, evaluating output quality over time, and analyzing user behavior. By gaining insights into your application's performance, you can enhance reliability and user satisfaction. - ## Deploy Langfuse on Spaces You can deploy Langfuse on Hugging Face Spaces effortlessly and start using it within minutes. ### Steps to Deploy Langfuse: -1. 
**Open the Langfuse Template Space:** - - Click the button below to create your own Langfuse Space: - - [![Open Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/deploy-to-spaces-lg.svg)](https://huggingface.co/spaces/langfuse/langfuse-template-space) - -4. **Open the Langfuse Instance:** - - - Click on the **"Open in new tab"** button located. - -5. **Authenticate with Hugging Face OAuth:** - - - On the Langfuse login page, click on **"Sign in with Hugging Face"**. - - Grant the necessary permissions when prompted. - -6. **Start Using Langfuse:** - - After authentication, you will have a fully functioning Langfuse instance running on Hugging Face Spaces. +TBD ## Get Started with Langfuse @@ -55,32 +36,7 @@ Navigate to **Project Settings**, and under **API Keys**, click on **"Create New ### 3. Create a Sample Gradio Chat Application -To create a sample Gradio chat application in Hugging Face Spaces, follow these steps: - -1. **Set Up Your Space:** - - - Navigate to Hugging Face Spaces and create a new Space. - - Choose the appropriate template or start with a blank Space. - -2. **Add the Application Code:** - - - Create a file named `app.py` in your Space. - - Copy the application code from [app.py](docs/hub/app.py) and paste it into your `app.py` file in the Space. - -3. **Define Dependencies:** - - - Create a `requirements.txt` file in your Space. - - List all necessary dependencies for your application. For example: - ``` - gradio - langfuse - openai - ``` - -4. **Launch the Application:** - - - Once the `app.py` and `requirements.txt` files are set up, start your Space. - - The application will launch, and you can interact with the Gradio chat interface. +TBD 5. **View Example Traces in Langfuse:** @@ -95,10 +51,20 @@ Open your Langfuse dashboard, navigate to **Traces** to see the recorded traces For detailed instructions and advanced features, refer to the [Langfuse Get Started Guide](https://langfuse.com/docs/get-started). 
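The keys created under **API Keys** are typically supplied to your application as environment variables. A quick sanity check, as a sketch (the `LANGFUSE_*` variable names and the `pk-lf-`/`sk-lf-` key prefixes follow the Langfuse SDK's conventions; the values below are placeholders):

```python
import os

# Placeholders -- substitute the keys copied from Project Settings -> API Keys
# and the URL of your own Space.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-1234567890"
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-1234567890"
os.environ["LANGFUSE_HOST"] = "https://your-space-name.hf.space"

def looks_valid(public_key, secret_key):
    # Langfuse public keys start with "pk-lf-", secret keys with "sk-lf-".
    return public_key.startswith("pk-lf-") and secret_key.startswith("sk-lf-")

print(looks_valid(os.environ["LANGFUSE_PUBLIC_KEY"],
                  os.environ["LANGFUSE_SECRET_KEY"]))  # True
```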
-## Feedback and Support +## Additional Resources and Support + +- [Langfuse documentation](https://langfuse.com/docs) +- [Langfuse GitHub repository](https://github.com/langfuse/langfuse) +- [Langfuse Discord](https://langfuse.com/discord) +- [Langfuse template Space](https://huggingface.co/spaces/langfuse/langfuse-template-space) + +## Troubleshooting + +If you encounter issues: + +1. Make sure your notebook runs locally in app mode using `python app.py` +2. Check that all required packages are listed in `requirements.txt` +3. Check Space logs for any Python errors -We value your feedback and are here to help if you have any questions. +For more help, open a support ticket on [GitHub discussions](https://langfuse.com/discussions) or [open an issue](https://github.com/langfuse/langfuse/issues). -- **Join Our Community:** Engage with us on [Discord](https://discord.gg/langfuse) or via [GitHub Discussions](https://github.com/langfuse/langfuse/discussions) -- **Report Issues:** Submit issues or feature requests on our [GitHub Issues](https://github.com/langfuse/langfuse/issues) page -- **Contact Us:** Reach out via our [Support Page](https://langfuse.com/support) or email us at [support@langfuse.com](mailto:support@langfuse.com) From f024ddce091930a070be709371c417adb72507e1 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Fri, 20 Dec 2024 13:38:54 +0100 Subject: [PATCH 05/30] edit text --- docs/hub/spaces-sdks-docker-langfuse.md | 72 +++++++++++++++---------- 1 file changed, 44 insertions(+), 28 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 28f147501..77b85f6dc 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -1,55 +1,72 @@ # Langfuse on Spaces -## What is Langfuse? +This guide shows you how to deploy Langfuse on Hugging Face Spaces and start instrumenting your LLM application. 
This integration helps you experiment with Hugging Face models, manage your prompts in one place, and evaluate model outputs.

## What is Langfuse?

[Langfuse](https://langfuse.com) is an open-source LLM engineering platform that helps teams collaboratively debug, evaluate, and iterate on their LLM applications.

Key features of Langfuse include LLM tracing to capture the full context of your application's execution flow, prompt management for centralized and collaborative prompt iteration, evaluation metrics to assess output quality, dataset creation for testing and benchmarking, and a playground to experiment with prompts and model configurations.

_This video is a 10 min walkthrough of the Langfuse features:_


## Why LLM Observability?

As LLMs become more prevalent, understanding their behavior and performance is crucial. LLM observability refers to monitoring and understanding the internal states of an LLM application through its outputs. This is essential for addressing challenges such as complex control flows, non-deterministic outputs, and varied user intents.

Building LLM applications involves intricate workflows with repeated or chained calls, making debugging challenging. The non-deterministic nature of LLM outputs adds complexity to consistent quality assessment, and varied user inputs require deep understanding to improve user experience.
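Capturing the traces that make this kind of debugging possible usually takes little more than a decorator per function. A minimal sketch using the Langfuse Python SDK's `@observe()` decorator (assumes the `langfuse` v2 package and the `LANGFUSE_*` environment variables are configured; the no-op fallback keeps the snippet runnable without them):

```python
# Sketch of instrumenting a chained workflow with Langfuse's decorator.
# Assumes the `langfuse` v2 Python SDK and LANGFUSE_PUBLIC_KEY /
# LANGFUSE_SECRET_KEY / LANGFUSE_HOST in the environment; the no-op
# fallback keeps the example runnable without the package.
try:
    from langfuse.decorators import observe
except ImportError:
    def observe(**kwargs):
        def wrap(fn):
            return fn
        return wrap

@observe()  # nested calls become child observations of one trace
def retrieve(question):
    return [f"doc about {question}"]  # stand-in for a retrieval step

@observe()
def answer(question):
    docs = retrieve(question)
    return f"answer based on {len(docs)} document(s)"  # stand-in for an LLM call

print(answer("What is Langfuse?"))
```

Because `retrieve` is called inside `answer`, both show up as one trace with a nested observation, which is exactly the control-flow context described above.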
+- As language models become more prevalent, understanding their behavior and performance is important. +- **LLM observability** involves monitoring and understanding the internal states of an LLM application through its outputs. +- It is essential for addressing challenges such as: + - **Complex control flows** with repeated or chained calls, making debugging challenging. + - **Non-deterministic outputs**, adding complexity to consistent quality assessment. + - **Varied user intents**, requiring deep understanding to improve user experience. +- Building LLM applications involves intricate workflows, and observability helps in managing these complexities. -## Deploy Langfuse on Spaces +## Step 1: Set up Langfuse on Spaces -You can deploy Langfuse on Hugging Face Spaces effortlessly and start using it within minutes. +The Langfuse Huggingface Space allows you to get up and running with a deployed version of Langfuse with just a few clicks. Within a few minutes, you'll have this default Langfuse dashboard deployed and ready for you to connect to from your local machine. -### Steps to Deploy Langfuse: + + + -TBD +1. Create a [**new Hugging Face Space**](https://huggingface.co/new-space) +2. Select **Docker** as the Space SDK +3. Select **Langfuse** as the Space template +4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts +5. Change the **Environment Variables**: + - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment. + - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment. + - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. 
You should overwrite the default value here for a secure deployment. -## Get Started with Langfuse +![Clone the Langfuse Space](https://langfuse.com/images/cookbook/huggingface/huggingface-space-setup.png) -Now that you have Langfuse running, you can begin integrating it with your LLM applications. +## Step 2: Instrument your Code -### 1. Create a New Project +Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. -Create a new organization and project in Langfuse. +### Example: Monitor your Gradio Application -### 2. Generate API Credentials +We created a Gradio template space that shows how to create a simple chat application using a Hugging Face model and trace model calls and user feedback in Langfuse - without leaving Hugging Face. -Navigate to **Project Settings**, and under **API Keys**, click on **"Create New Key"**. Copy the **Public Key** and **Secret Key**; you'll need them to authenticate when sending data to Langfuse. + + + -### 3. Create a Sample Gradio Chat Application +To get started, clone the [Gradio template space](https://huggingface.co/spaces/langfuse/gradio-example-template) and follow the instructions in the [README](https://huggingface.co/spaces/langfuse/gradio-example-template/blob/main/README.md). -TBD +### Monitor Any Application -5. **View Example Traces in Langfuse:** +Langfuse is model agnostic and can be used to trace any application. Follow the [get-started guide](https://langfuse.com/docs) in Langfuse documentation to see how you can instrument your code. - - After starting the application, navigate to your Langfuse dashboard. - - Go to the **Traces** section to view the example traces generated by your Gradio chat application. 
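If `openssl` is not available on your machine, the environment-variable secrets above can also be generated with Python's standard library. A sketch of the equivalents:

```python
import base64
import secrets

# Stdlib equivalents of the openssl commands above:
nextauth_secret = base64.b64encode(secrets.token_bytes(32)).decode()  # openssl rand -base64 32
salt = base64.b64encode(secrets.token_bytes(32)).decode()             # openssl rand -base64 32
encryption_key = secrets.token_hex(32)                                # openssl rand -hex 32

print(len(encryption_key))  # 64 hex characters == 256 bits
```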
+Langfuse maintains native integrations with many popular LLM frameworks, including [Langchain](https://langfuse.com/docs/integrations/langchain/tracing), [LlamaIndex](https://langfuse.com/docs/integrations/llama-index/get-started) and [OpenAI](https://langfuse.com/docs/integrations/openai/python/get-started) and offers Python and JS/TS SDKs to instrument your code. Langfuse also offers various API endpoints to ingest data and has been integrated by other open source projects such as [Langflow](https://langfuse.com/docs/integrations/langflow), [Dify](https://langfuse.com/docs/integrations/dify) and [Haystack](https://langfuse.com/docs/integrations/haystack/get-started). -By following these steps, you can quickly set up and run a Gradio chat application in Hugging Face Spaces and observe its traces in Langfuse. +## Step 3: View Traces in Langfuse -### 4. View Traces in Langfuse Dashboard +Once you have instrumented your application, and ingested traces or user feedback into Langfuse, you can view your traces in Langfuse. -Open your Langfuse dashboard, navigate to **Traces** to see the recorded traces from your application, and use the observability tools to analyze and debug your LLM applications. +![Example trace with Gradio](https://langfuse.com/images/cookbook/huggingface/huggingface-gradio-example-trace.png) -For detailed instructions and advanced features, refer to the [Langfuse Get Started Guide](https://langfuse.com/docs/get-started). 
+_[Example trace in the Langfuse UI](https://langfuse-langfuse-template-space.hf.space/project/cm4r1ajtn000a4co550swodxv/traces/9cdc12fb-71bf-4074-ab0b-0b8d212d839f?timestamp=2024-12-20T12%3A12%3A50.089Z&view=preview)_ ## Additional Resources and Support @@ -60,11 +77,10 @@ For detailed instructions and advanced features, refer to the [Langfuse Get Star ## Troubleshooting -If you encounter issues: +If you encounter issues using the [Langfuse Gradio example template](https://huggingface.co/spaces/langfuse/gradio-example-template): 1. Make sure your notebook runs locally in app mode using `python app.py` 2. Check that all required packages are listed in `requirements.txt` 3. Check Space logs for any Python errors -For more help, open a support ticket on [GitHub discussions](https://langfuse.com/discussions) or [open an issue](https://github.com/langfuse/langfuse/issues). - +For more help, open a support ticket on [GitHub discussions](https://langfuse.com/discussions) or [open an issue](https://github.com/langfuse/langfuse/issues). \ No newline at end of file From 41dfe8a6537dd36a2ba87799e3735138f9c007f4 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Fri, 20 Dec 2024 13:39:51 +0100 Subject: [PATCH 06/30] update numbers --- docs/hub/spaces-sdks-docker-langfuse.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 77b85f6dc..b950b0d12 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -29,11 +29,11 @@ The Langfuse Huggingface Space allows you to get up and running with a deployed -1. Create a [**new Hugging Face Space**](https://huggingface.co/new-space) -2. Select **Docker** as the Space SDK -3. Select **Langfuse** as the Space template -4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts -5. Change the **Environment Variables**: +1.1. 
Create a [**new Hugging Face Space**](https://huggingface.co/new-space) +1.2. Select **Docker** as the Space SDK +1.3. Select **Langfuse** as the Space template +1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts +1.5. Change the **Environment Variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. You should overwrite the default value here for a secure deployment. From ac7040557457b53c6e53dde0f8f73c07b8a78279 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Fri, 20 Dec 2024 13:57:08 +0100 Subject: [PATCH 07/30] fix spelling --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index b950b0d12..75008f087 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -23,7 +23,7 @@ _This video is a 10 min walkthrough of the Langfuse features:_ ## Step 1: Set up Langfuse on Spaces -The Langfuse Huggingface Space allows you to get up and running with a deployed version of Langfuse with just a few clicks. Within a few minutes, you'll have this default Langfuse dashboard deployed and ready for you to connect to from your local machine. +The Langfuse Hugging Face Space allows you to get up and running with a deployed version of Langfuse with just a few clicks. 
Within a few minutes, you'll have this default Langfuse dashboard deployed and ready for you to connect to from your local machine. From 8462e6fd2476d1dc2d41e413ef74e28461685e7d Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Fri, 20 Dec 2024 15:04:27 +0100 Subject: [PATCH 08/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: Marc Klingen --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 75008f087..f8508d707 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -23,7 +23,7 @@ _This video is a 10 min walkthrough of the Langfuse features:_ ## Step 1: Set up Langfuse on Spaces -The Langfuse Hugging Face Space allows you to get up and running with a deployed version of Langfuse with just a few clicks. Within a few minutes, you'll have this default Langfuse dashboard deployed and ready for you to connect to from your local machine. +The Langfuse Hugging Face Space allows you to get up and running with a deployed version of Langfuse with just a few clicks. From ad18cdde3874822e852032c3e718c15b8ce5b750 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Fri, 20 Dec 2024 15:05:02 +0100 Subject: [PATCH 09/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: Marc Klingen --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index f8508d707..ceb940fb2 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -34,7 +34,7 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed 1.3. 
Select **Langfuse** as the Space template 1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts 1.5. Change the **Environment Variables**: - - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment. + - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. You should overwrite the default value here for a secure deployment. From a0dfd6ef96731d099bc4cae847e3cb29c35b5b85 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Fri, 20 Dec 2024 15:05:09 +0100 Subject: [PATCH 10/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: Marc Klingen --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index ceb940fb2..94ebab7bb 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -35,7 +35,7 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed 1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts 1.5. Change the **Environment Variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. 
- - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment. + - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. You should overwrite the default value here for a secure deployment. ![Clone the Langfuse Space](https://langfuse.com/images/cookbook/huggingface/huggingface-space-setup.png) From 112e7e965cc192fdf8fb77760ed4c99f2ae7729d Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Fri, 20 Dec 2024 15:05:20 +0100 Subject: [PATCH 11/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: Marc Klingen --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 94ebab7bb..3b7143f11 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -33,7 +33,7 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed 1.2. Select **Docker** as the Space SDK 1.3. Select **Langfuse** as the Space template 1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts -1.5. Change the **Environment Variables**: +1.5. For a secure deployment, replace the default values of the **environment variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. 
Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. You should overwrite the default value here for a secure deployment. From 35f25b3cbffbf89ac7fb51514095b4a1416ba456 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Fri, 20 Dec 2024 15:05:29 +0100 Subject: [PATCH 12/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: Marc Klingen --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 3b7143f11..4bf5ec6cf 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -40,7 +40,7 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed ![Clone the Langfuse Space](https://langfuse.com/images/cookbook/huggingface/huggingface-space-setup.png) -## Step 2: Instrument your Code +## Step 2: Use Langfuse Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. From 19dc6c09215fd6e090ea792ca1e599a377adc3be Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Fri, 20 Dec 2024 15:05:42 +0100 Subject: [PATCH 13/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: Marc Klingen --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 4bf5ec6cf..77953f1f0 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -83,4 +83,4 @@ If you encounter issues using the [Langfuse Gradio example template](https://hug 2. Check that all required packages are listed in `requirements.txt` 3. 
Check Space logs for any Python errors -For more help, open a support ticket on [GitHub discussions](https://langfuse.com/discussions) or [open an issue](https://github.com/langfuse/langfuse/issues). \ No newline at end of file +For more help, open a support thread on [GitHub discussions](https://langfuse.com/discussions) or [open an issue](https://github.com/langfuse/langfuse/issues). \ No newline at end of file From 737301c7cf5d6d49ea6d77b8b7ed8ce3b03265b9 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Fri, 20 Dec 2024 15:07:19 +0100 Subject: [PATCH 14/30] move troubleshoot section to gradio template readme as this is only gradio related --- docs/hub/spaces-sdks-docker-langfuse.md | 8 -------- 1 file changed, 8 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 77953f1f0..7e9284e15 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -75,12 +75,4 @@ _[Example trace in the Langfuse UI](https://langfuse-langfuse-template-space.hf. - [Langfuse Discord](https://langfuse.com/discord) - [Langfuse template Space](https://huggingface.co/spaces/langfuse/langfuse-template-space) -## Troubleshooting - -If you encounter issues using the [Langfuse Gradio example template](https://huggingface.co/spaces/langfuse/gradio-example-template): - -1. Make sure your notebook runs locally in app mode using `python app.py` -2. Check that all required packages are listed in `requirements.txt` -3. Check Space logs for any Python errors - For more help, open a support thread on [GitHub discussions](https://langfuse.com/discussions) or [open an issue](https://github.com/langfuse/langfuse/issues). 
\ No newline at end of file From 4c74941b93d02561902fb040339c8dad00a87e4f Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= <48529566+jannikmaierhoefer@users.noreply.github.com> Date: Fri, 20 Dec 2024 15:08:54 +0100 Subject: [PATCH 15/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: Marc Klingen --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 7e9284e15..d745b650a 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -36,7 +36,7 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed 1.5. For a secure deployment, replace the default values of the **environment variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. You should overwrite the default value here for a secure deployment. + - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. 
![Clone the Langfuse Space](https://langfuse.com/images/cookbook/huggingface/huggingface-space-setup.png) From 98491950522d8d26e95a329b5f2f2ea7bd4b2ef6 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jannik=20Maierh=C3=B6fer?= Date: Fri, 20 Dec 2024 17:55:51 +0100 Subject: [PATCH 16/30] edit gradio link name --- docs/hub/spaces-sdks-docker-langfuse.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index d745b650a..5773a3840 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -48,11 +48,11 @@ Now that you have Langfuse running, you can start instrumenting your LLM applica We created a Gradio template space that shows how to create a simple chat application using a Hugging Face model and trace model calls and user feedback in Langfuse - without leaving Hugging Face. - + -To get started, clone the [Gradio template space](https://huggingface.co/spaces/langfuse/gradio-example-template) and follow the instructions in the [README](https://huggingface.co/spaces/langfuse/gradio-example-template/blob/main/README.md). +To get started, clone the [Gradio template space](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template) and follow the instructions in the [README](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template/blob/main/README.md). 
### Monitor Any Application From 9336d5d9b5440b09a824a0aa168bd0caf7619aa7 Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Fri, 20 Dec 2024 13:10:44 -0500 Subject: [PATCH 17/30] Apply suggestions from code review --- docs/hub/spaces-sdks-docker-langfuse.md | 16 ++++++++-------- 1 file changed, 8 insertions(+), 8 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 5773a3840..0a4c96ee4 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -1,6 +1,6 @@ # Langfuse on Spaces -This guide shows you how to deploy Langfuse on Hugging Face Spaces and start instrumenting your LLM application. This integreation helps you to experiment on Hugging Face models, manage your prompts in one place and evaluate model outputs. +This guide shows you how to deploy Langfuse on Hugging Face Spaces and start instrumenting your LLM application. This integration helps you to experiment on Hugging Face models, manage your prompts in one place and evaluate model outputs. ## What is Langfuse? @@ -29,11 +29,11 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed -1.1. Create a [**new Hugging Face Space**](https://huggingface.co/new-space) -1.2. Select **Docker** as the Space SDK -1.3. Select **Langfuse** as the Space template -1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts -1.5. For a secure deployment, replace the default values of the **environment variables**: +### 1.1. Create a [**new Hugging Face Space**](https://huggingface.co/new-space) +### 1.2. Select **Docker** as the Space SDK +### 1.3. Select **Langfuse** as the Space template +### 1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts +### 1.5. 
For a secure deployment, replace the default values of the **environment variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. @@ -48,11 +48,11 @@ Now that you have Langfuse running, you can start instrumenting your LLM applica We created a Gradio template space that shows how to create a simple chat application using a Hugging Face model and trace model calls and user feedback in Langfuse - without leaving Hugging Face. - + -To get started, clone the [Gradio template space](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template) and follow the instructions in the [README](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template/blob/main/README.md). +To get started, [duplicate this Gradio template space](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template?duplicate=true) and follow the instructions in the [README](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template/blob/main/README.md). ### Monitor Any Application From 192fb20ac447e5a64a6c53cc5d542183265f8b6f Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Fri, 20 Dec 2024 14:59:05 -0500 Subject: [PATCH 18/30] fix setup steps numbered list formatting --- docs/hub/spaces-sdks-docker-langfuse.md | 12 +++++++----- 1 file changed, 7 insertions(+), 5 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 0a4c96ee4..7f24849c2 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -29,11 +29,13 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed -### 1.1. 
Create a [**new Hugging Face Space**](https://huggingface.co/new-space) -### 1.2. Select **Docker** as the Space SDK -### 1.3. Select **Langfuse** as the Space template -### 1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts -### 1.5. For a secure deployment, replace the default values of the **environment variables**: +To get started, follow these steps: + +1. Create a [**new Hugging Face Space**](https://huggingface.co/new-space) +2. Select **Docker** as the Space SDK +3. Select **Langfuse** as the Space template +4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts +5. For a secure deployment, replace the default values of the **environment variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. From b1a5a3d60afb8a1c0dda16af3d1414f4b1ee455a Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Fri, 20 Dec 2024 15:17:52 -0500 Subject: [PATCH 19/30] Add simple tracing example with HF Serverless API --- docs/hub/spaces-sdks-docker-langfuse.md | 44 +++++++++++++++++++++---- 1 file changed, 37 insertions(+), 7 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 7f24849c2..90e14e735 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -46,7 +46,43 @@ To get started, follow these steps: Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. 
-### Example: Monitor your Gradio Application +### Monitor Any Application + +Langfuse is model agnostic and can be used to trace any application. Follow the [get-started guide](https://langfuse.com/docs) in Langfuse documentation to see how you can instrument your code. + +Langfuse maintains native integrations with many popular LLM frameworks, including [Langchain](https://langfuse.com/docs/integrations/langchain/tracing), [LlamaIndex](https://langfuse.com/docs/integrations/llama-index/get-started) and [OpenAI](https://langfuse.com/docs/integrations/openai/python/get-started) and offers Python and JS/TS SDKs to instrument your code. Langfuse also offers various API endpoints to ingest data and has been integrated by other open source projects such as [Langflow](https://langfuse.com/docs/integrations/langflow), [Dify](https://langfuse.com/docs/integrations/dify) and [Haystack](https://langfuse.com/docs/integrations/haystack/get-started). + +### Example 1: Trace Calls to HF Serverless API + +As a simple example, here's how to trace calls to the HF Serverless API using the Langfuse Python SDK. + + +Be sure to first configure your `LANGFUSE_HOST`, `LANGFUSE_PUBLIC_KEY` and `LANGFUSE_SECRET_KEY` environment variables, and make sure you've [authenticated with your Hugging Face account](https://huggingface.co/docs/huggingface_hub/en/quick-start#authentication). + + +```python +from langfuse.openai import openai +from huggingface_hub import get_token + +client = openai.OpenAI( + base_url="https://api-inference.huggingface.co/v1/", + api_key=get_token(), +) + +messages = [{"role": "user", "content": "What is observability for LLMs?"}] + +response = client.chat.completions.create( + model="meta-llama/Llama-3.3-70B-Instruct", + messages=messages, + max_tokens=100, +) + +print(response.choices[0].message.content) +``` + +Then navigate to the Langfuse dashboard to see the trace! 
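The drop-in client in the example reads its connection settings from the environment. A minimal sketch of setting them in code before the client is created — the host URL and key values below are placeholders, substitute your own Space URL and the API keys from your project settings:

```python
import os

# Placeholder values — substitute your own Space URL and project API keys
os.environ["LANGFUSE_HOST"] = "https://your-username-langfuse.hf.space"
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
```

Setting these via a shell `export` or the Space's environment-variable settings works equally well; they just need to be in place before the Langfuse client is instantiated.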
+ +### Example 2: Monitor a Gradio Application We created a Gradio template space that shows how to create a simple chat application using a Hugging Face model and trace model calls and user feedback in Langfuse - without leaving Hugging Face. @@ -56,12 +92,6 @@ We created a Gradio template space that shows how to create a simple chat applic To get started, [duplicate this Gradio template space](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template?duplicate=true) and follow the instructions in the [README](https://huggingface.co/spaces/langfuse/langfuse-gradio-example-template/blob/main/README.md). -### Monitor Any Application - -Langfuse is model agnostic and can be used to trace any application. Follow the [get-started guide](https://langfuse.com/docs) in Langfuse documentation to see how you can instrument your code. - -Langfuse maintains native integrations with many popular LLM frameworks, including [Langchain](https://langfuse.com/docs/integrations/langchain/tracing), [LlamaIndex](https://langfuse.com/docs/integrations/llama-index/get-started) and [OpenAI](https://langfuse.com/docs/integrations/openai/python/get-started) and offers Python and JS/TS SDKs to instrument your code. Langfuse also offers various API endpoints to ingest data and has been integrated by other open source projects such as [Langflow](https://langfuse.com/docs/integrations/langflow), [Dify](https://langfuse.com/docs/integrations/dify) and [Haystack](https://langfuse.com/docs/integrations/haystack/get-started). - ## Step 3: View Traces in Langfuse Once you have instrumented your application, and ingested traces or user feedback into Langfuse, you can view your traces in Langfuse. 
From 0ca20497debb074504a3cb97739b916ea576e0cf Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Fri, 20 Dec 2024 15:24:22 -0500 Subject: [PATCH 20/30] remove for link formatting --- docs/hub/spaces-sdks-docker-langfuse.md | 8 +------- 1 file changed, 1 insertion(+), 7 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 90e14e735..57d6b9f10 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -54,11 +54,9 @@ Langfuse maintains native integrations with many popular LLM frameworks, includi ### Example 1: Trace Calls to HF Serverless API -As a simple example, here's how to trace calls to the HF Serverless API using the Langfuse Python SDK. +As a simple example, here's how to trace LLM calls to the HF Serverless API using the Langfuse Python SDK. - Be sure to first configure your `LANGFUSE_HOST`, `LANGFUSE_PUBLIC_KEY` and `LANGFUSE_SECRET_KEY` environment variables, and make sure you've [authenticated with your Hugging Face account](https://huggingface.co/docs/huggingface_hub/en/quick-start#authentication). - ```python from langfuse.openai import openai @@ -76,12 +74,8 @@ response = client.chat.completions.create( messages=messages, max_tokens=100, ) - -print(response.choices[0].message.content) ``` -Then navigate to the Langfuse dashboard to see the trace! - ### Example 2: Monitor a Gradio Application We created a Gradio template space that shows how to create a simple chat application using a Hugging Face model and trace model calls and user feedback in Langfuse - without leaving Hugging Face. 
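The deployment steps in this guide call for three generated secrets (`NEXTAUTH_SECRET`, `SALT`, `ENCRYPTION_KEY`). If `openssl` isn't at hand, equivalent values can be produced from the Python standard library — a sketch with illustrative variable names:

```python
import base64
import secrets

# Equivalent to `openssl rand -base64 32`: 32 random bytes, base64-encoded
nextauth_secret = base64.b64encode(secrets.token_bytes(32)).decode()
salt = base64.b64encode(secrets.token_bytes(32)).decode()

# Equivalent to `openssl rand -hex 32`: a 256-bit key as 64 hex characters
encryption_key = secrets.token_hex(32)

print(len(encryption_key))  # prints 64
```

Paste each generated value into the corresponding environment variable in the Space settings before (re)starting the Space.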
From d08059e222eb8c8013c6902c04991b3a45d2db36 Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Fri, 20 Dec 2024 15:28:17 -0500 Subject: [PATCH 21/30] point "Deploy on HF" to preselected template --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 57d6b9f10..f13f0b98d 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -25,7 +25,7 @@ _This video is a 10 min walkthrough of the Langfuse features:_ The Langfuse Hugging Face Space allows you to get up and running with a deployed version of Langfuse with just a few clicks. - + From 27485c5116e7cfeb6fe048598287b4d9842eda74 Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Thu, 2 Jan 2025 09:26:18 -0500 Subject: [PATCH 22/30] Update docs/hub/spaces-sdks-docker-langfuse.md Co-authored-by: vb --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index f13f0b98d..8bc3bf4c3 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -35,7 +35,7 @@ To get started, follow these steps: 2. Select **Docker** as the Space SDK 3. Select **Langfuse** as the Space template 4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts -5. For a secure deployment, replace the default values of the **environment variables**: +5. [Optional but recommended] For a secure deployment, replace the default values of the **environment variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. 
Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. From fc5070ca4f9e10b18bd15dff32d712afe65ff234 Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Thu, 2 Jan 2025 11:15:13 -0500 Subject: [PATCH 23/30] include note about HF OAuth --- docs/hub/spaces-sdks-docker-langfuse.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 8bc3bf4c3..2817f39ea 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -29,7 +29,7 @@ The Langfuse Hugging Face Space allows you to get up and running with a deployed -To get started, follow these steps: +To get started, click the button above or follow these steps: 1. Create a [**new Hugging Face Space**](https://huggingface.co/new-space) 2. Select **Docker** as the Space SDK @@ -44,7 +44,7 @@ To get started, follow these steps: ## Step 2: Use Langfuse -Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. +Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. 
### Monitor Any Application From 3b7d39ae266af5b93838b52df9ba8f85e813a895 Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Mon, 6 Jan 2025 11:40:00 -0500 Subject: [PATCH 24/30] add note about AUTH_DISABLE_SIGNUP --- docs/hub/spaces-sdks-docker-langfuse.md | 15 +++++++++++++-- 1 file changed, 13 insertions(+), 2 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 2817f39ea..6ac05e3cc 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -35,7 +35,8 @@ To get started, click the button above or follow these steps: 2. Select **Docker** as the Space SDK 3. Select **Langfuse** as the Space template 4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts -5. [Optional but recommended] For a secure deployment, replace the default values of the **environment variables**: +5. Ensure the space is set to **public** visibility so Langfuse API/SDK's can access the app (see note below for more details) +6. [Optional but recommended] For a secure deployment, replace the default values of the **environment variables**: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. @@ -44,7 +45,17 @@ To get started, click the button above or follow these steps: ## Step 2: Use Langfuse -Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. 
+Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. + +Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. + + +The Langfuse space must be set to **public** visibility so that the Langfuse API/SDK's can access the app. + +By default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new user's from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. + +Inside of the app, you can use [the native Langfuse features](https://langfuse.com/docs/rbac) to manage Organizations, Projects, and Users. + ### Monitor Any Application From 5e976ecc5526edf473664d4b2e463a64ea3204fd Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Mon, 6 Jan 2025 11:43:53 -0500 Subject: [PATCH 25/30] fix tip syntax --- docs/hub/spaces-sdks-docker-langfuse.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 6ac05e3cc..1dcec9860 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -50,11 +50,13 @@ Now that you have Langfuse running, you can start instrumenting your LLM applica Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. + The Langfuse space must be set to **public** visibility so that the Langfuse API/SDK's can access the app. By default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new user's from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. 
Inside of the app, you can use [the native Langfuse features](https://langfuse.com/docs/rbac) to manage Organizations, Projects, and Users. + ### Monitor Any Application From cb366e670334ae489c994424b21c8757a2cb1f4e Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Mon, 6 Jan 2025 11:50:13 -0500 Subject: [PATCH 26/30] alt tip syntax --- docs/hub/spaces-sdks-docker-langfuse.md | 15 ++++++--------- 1 file changed, 6 insertions(+), 9 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 1dcec9860..bc57a4063 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -49,15 +49,12 @@ Now that you have Langfuse running, you can start instrumenting your LLM applica Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. - - -The Langfuse space must be set to **public** visibility so that the Langfuse API/SDK's can access the app. - -By default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new user's from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. - -Inside of the app, you can use [the native Langfuse features](https://langfuse.com/docs/rbac) to manage Organizations, Projects, and Users. - - +> [!TIP] +> The Langfuse space must be set to **public** visibility so that the Langfuse API/SDK's can access the app. +> +> By default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new user's from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. +> +> Inside of the app, you can use [the native Langfuse features](https://langfuse.com/docs/rbac) to manage Organizations, Projects, and Users. 
### Monitor Any Application From 42262e042892714e93534edad7293379895f1852 Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Mon, 6 Jan 2025 13:06:44 -0500 Subject: [PATCH 27/30] update note --- docs/hub/spaces-sdks-docker-langfuse.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index bc57a4063..11daf52a2 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -49,10 +49,10 @@ Now that you have Langfuse running, you can start instrumenting your LLM applica Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. -> [!TIP] -> The Langfuse space must be set to **public** visibility so that the Langfuse API/SDK's can access the app. +> **Note:** +> The Langfuse space _must_ be set to **public** visibility so that the Langfuse API/SDK's can access the app. > -> By default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new user's from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. +> By default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new user's from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. Be sure that you've first signed in to the space before setting this variable. > > Inside of the app, you can use [the native Langfuse features](https://langfuse.com/docs/rbac) to manage Organizations, Projects, and Users. 
From 345cd9690eb8da3c0a36681d8a78522bc239fa5d Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Mon, 6 Jan 2025 13:54:08 -0500 Subject: [PATCH 28/30] back to [!TIP] --- docs/hub/spaces-sdks-docker-langfuse.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 11daf52a2..8ad36e3c9 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -49,7 +49,7 @@ Now that you have Langfuse running, you can start instrumenting your LLM applica Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. -> **Note:** +> [!TIP] > The Langfuse space _must_ be set to **public** visibility so that the Langfuse API/SDK's can access the app. > > By default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new user's from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. Be sure that you've first signed in to the space before setting this variable. From 5b735433a86e732eeddcabae123960103845732d Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Mon, 6 Jan 2025 15:14:35 -0500 Subject: [PATCH 29/30] clarify user access --- docs/hub/spaces-sdks-docker-langfuse.md | 22 ++++++++++++++-------- 1 file changed, 14 insertions(+), 8 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 8ad36e3c9..6c58a5e55 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -40,21 +40,27 @@ To get started, click the button above or follow these steps: - `NEXTAUTH_SECRET`: Used to validate login session cookies, generate secret with at least 256 entropy using `openssl rand -base64 32`. 
- `SALT`: Used to salt hashed API keys, generate secret with at least 256 entropy using `openssl rand -base64 32`. - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits, 64 string characters in hex format, generate via: `openssl rand -hex 32`. +7. Click **Create Space**! ![Clone the Langfuse Space](https://langfuse.com/images/cookbook/huggingface/huggingface-space-setup.png) -## Step 2: Use Langfuse +### User Access + +Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login by following the instructions in the pop-up. -Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. +The Langfuse space _must_ be set to **public** visibility so that Langfuse API/SDKs can reach the app. This means that by default, _any_ logged-in Hugging Face user will be able to access the Langfuse space! -Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login. +You can prevent new users from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. Be sure that you've first signed in & authenticated to the space before setting this variable, or else your own user profile won't be able to authenticate. + +Once inside the app, you can use [the native Langfuse features](https://langfuse.com/docs/rbac) to manage Organizations, Projects, and Users. > [!TIP] 
Be sure that you've first signed in to the space before setting this variable. -> -> Inside of the app, you can use [the native Langfuse features](https://langfuse.com/docs/rbac) to manage Organizations, Projects, and Users. +> **Note:** If you've set the `AUTH_DISABLE_SIGNUP` environment variable to `true` to restrict access, and want to grant a new user access to the space, you'll need to first set it back to `false` (wait for rebuild to complete), add the user and have them authenticate with OAuth, and then set it back to `true`. + + +## Step 2: Use Langfuse + +Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. ### Monitor Any Application From 1e27b68189587c8f6d699e8cd8b5b4531e9a72be Mon Sep 17 00:00:00 2001 From: Andrew Reed Date: Tue, 7 Jan 2025 09:48:55 -0500 Subject: [PATCH 30/30] minor cleanup --- docs/hub/spaces-sdks-docker-langfuse.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/docs/hub/spaces-sdks-docker-langfuse.md b/docs/hub/spaces-sdks-docker-langfuse.md index 6c58a5e55..6a0d69137 100644 --- a/docs/hub/spaces-sdks-docker-langfuse.md +++ b/docs/hub/spaces-sdks-docker-langfuse.md @@ -1,6 +1,6 @@ # Langfuse on Spaces -This guide shows you how to deploy Langfuse on Hugging Face Spaces and start instrumenting your LLM application. This integration helps you to experiment on Hugging Face models, manage your prompts in one place and evaluate model outputs. +This guide shows you how to deploy Langfuse on Hugging Face Spaces and start instrumenting your LLM application for observability. This integration helps you to experiment with LLM APIs on the Hugging Face Hub, manage your prompts in one place, and evaluate model outputs. ## What is Langfuse? 
@@ -46,9 +46,9 @@ To get started, click the button above or follow these steps: ### User Access -Your Langfuse Space is pre-configured to use Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login by following the instructions in the pop-up. +Your Langfuse Space is pre-configured with Hugging Face OAuth for secure authentication, so you'll need to authorize `read` access to your Hugging Face account upon first login by following the instructions in the pop-up. -The Langfuse space _must_ be set to **public** visibility so that Langfuse API/SDK's can reach the app. This means that by default, _any_ logged-in Hugging Face user will be able to access the Langfuse space! +The Langfuse space _must_ be set to **public** visibility so that Langfuse API/SDKs can reach the app. This means that by default, _any_ logged-in Hugging Face user will be able to access the Langfuse space. You can prevent new users from signing up and accessing the space by setting the `AUTH_DISABLE_SIGNUP` environment variable to `true`. Be sure that you've first signed in & authenticated to the space before setting this variable else your own user profile won't be able to authenticate. @@ -60,7 +60,7 @@ Once inside the app, you can use [the native Langfuse features](https://langfuse ## Step 2: Use Langfuse -Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. +Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts. Let's see how! ### Monitor Any Application @@ -70,7 +70,7 @@ Langfuse maintains native integrations with many popular LLM frameworks, includi ### Example 1: Trace Calls to HF Serverless API -As a simple example, here's how to trace LLM calls to the HF Serverless API using the Langfuse Python SDK. 
+As a simple example, here's how to trace LLM calls to the [HF Serverless API](https://huggingface.co/docs/api-inference/en/index) using the Langfuse Python SDK. Be sure to first configure your `LANGFUSE_HOST`, `LANGFUSE_PUBLIC_KEY` and `LANGFUSE_SECRET_KEY` environment variables, and make sure you've [authenticated with your Hugging Face account](https://huggingface.co/docs/huggingface_hub/en/quick-start#authentication).
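The secret-generation steps that the setup instructions above reference can be run locally before configuring the Space. A minimal sketch using the document's own `openssl` commands (the variable names match the environment variables the Langfuse Space expects):

```shell
# Generate the three secrets the Langfuse Space expects as environment variables.
NEXTAUTH_SECRET="$(openssl rand -base64 32)"   # validates login session cookies
SALT="$(openssl rand -base64 32)"              # salts hashed API keys
ENCRYPTION_KEY="$(openssl rand -hex 32)"       # 256-bit key, 64 hex characters

echo "NEXTAUTH_SECRET=${NEXTAUTH_SECRET}"
echo "SALT=${SALT}"
echo "ENCRYPTION_KEY=${ENCRYPTION_KEY}"
```

Paste each value into the corresponding Space secret when creating the Space.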
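The Example 1 snippet that the final patch introduces can be sketched as follows. This is a hedged illustration, not the guide's verbatim code: it assumes `pip install langfuse openai`, uses the Langfuse OpenAI drop-in wrapper (`langfuse.openai`), a hypothetical model name, and the HF Serverless API's OpenAI-compatible endpoint as documented at the time of writing. The wrapper reads `LANGFUSE_HOST`, `LANGFUSE_PUBLIC_KEY`, and `LANGFUSE_SECRET_KEY` from the environment:

```python
import os

# Illustrative model; substitute any chat model served by the HF Serverless API.
MODEL = "meta-llama/Llama-3.1-8B-Instruct"
messages = [{"role": "user", "content": "What is LLM observability?"}]

# Only call out to the APIs when credentials are actually configured.
if os.getenv("LANGFUSE_SECRET_KEY") and os.getenv("HF_TOKEN"):
    # Drop-in replacement for `openai.OpenAI` that records each chat
    # completion as a trace in your Langfuse Space.
    from langfuse.openai import OpenAI

    client = OpenAI(
        base_url="https://api-inference.huggingface.co/v1/",
        api_key=os.environ["HF_TOKEN"],
    )
    completion = client.chat.completions.create(model=MODEL, messages=messages)
    print(completion.choices[0].message.content)
```

After a call completes, the trace (latency, token usage, input/output) appears in the Langfuse UI running on your Space.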