Merged
7496fef
docs: langfuse on spcaes guide and gradio example
jannikmaierhoefer Dec 16, 2024
0fffbdb
edit toctree
jannikmaierhoefer Dec 17, 2024
fcc7485
text edit
jannikmaierhoefer Dec 17, 2024
82e3972
edit troubleshooting part
jannikmaierhoefer Dec 19, 2024
f024ddc
edit text
jannikmaierhoefer Dec 20, 2024
41dfe8a
update numbers
jannikmaierhoefer Dec 20, 2024
ac70405
fix spelling
jannikmaierhoefer Dec 20, 2024
8462e6f
Update docs/hub/spaces-sdks-docker-langfuse.md
jannikmaierhoefer Dec 20, 2024
ad18cdd
Update docs/hub/spaces-sdks-docker-langfuse.md
jannikmaierhoefer Dec 20, 2024
a0dfd6e
Update docs/hub/spaces-sdks-docker-langfuse.md
jannikmaierhoefer Dec 20, 2024
112e7e9
Update docs/hub/spaces-sdks-docker-langfuse.md
jannikmaierhoefer Dec 20, 2024
35f25b3
Update docs/hub/spaces-sdks-docker-langfuse.md
jannikmaierhoefer Dec 20, 2024
19dc6c0
Update docs/hub/spaces-sdks-docker-langfuse.md
jannikmaierhoefer Dec 20, 2024
737301c
move troubleshoot section to gradio template readme as this is only g…
jannikmaierhoefer Dec 20, 2024
4c74941
Update docs/hub/spaces-sdks-docker-langfuse.md
jannikmaierhoefer Dec 20, 2024
9849195
edit gradio link name
jannikmaierhoefer Dec 20, 2024
9336d5d
Apply suggestions from code review
andrewrreed Dec 20, 2024
192fb20
fix setup steps numbered list formatting
andrewrreed Dec 20, 2024
b1a5a3d
Add simple tracing example with HF Serverless API
andrewrreed Dec 20, 2024
0ca2049
remove <tip> for link formatting
andrewrreed Dec 20, 2024
d08059e
point "Deploy on HF" to preselected template
andrewrreed Dec 20, 2024
27485c5
Update docs/hub/spaces-sdks-docker-langfuse.md
andrewrreed Jan 2, 2025
fc5070c
include note about HF OAuth
andrewrreed Jan 2, 2025
3b7d39a
add note about AUTH_DISABLE_SIGNUP
andrewrreed Jan 6, 2025
5e976ec
fix tip syntax
andrewrreed Jan 6, 2025
cb366e6
alt tip syntax
andrewrreed Jan 6, 2025
42262e0
update note
andrewrreed Jan 6, 2025
345cd96
back to [!TIP]
andrewrreed Jan 6, 2025
5b73543
clarify user access
andrewrreed Jan 6, 2025
1e27b68
minor cleanup
andrewrreed Jan 7, 2025
2 changes: 2 additions & 0 deletions docs/hub/_toctree.yml
@@ -285,6 +285,8 @@
title: Evidence on Spaces
- local: spaces-sdks-docker-marimo
title: marimo on Spaces
- local: spaces-sdks-docker-langfuse
title: Langfuse on Spaces
- local: spaces-embed
title: Embed your Space
- local: spaces-run-with-docker
86 changes: 86 additions & 0 deletions docs/hub/spaces-sdks-docker-langfuse.md
@@ -0,0 +1,86 @@
# Langfuse on Spaces

This guide shows you how to deploy Langfuse on Hugging Face Spaces and start instrumenting your LLM application. This integration lets you experiment with Hugging Face models, manage your prompts in one place, and evaluate model outputs.

## What is Langfuse?

[Langfuse](https://langfuse.com) is an open-source LLM engineering platform that helps teams collaboratively debug, evaluate, and iterate on their LLM applications.

Key features of Langfuse include LLM tracing to capture the full context of your application's execution flow, prompt management for centralized and collaborative prompt iteration, evaluation metrics to assess output quality, dataset creation for testing and benchmarking, and a playground to experiment with prompts and model configurations.

_This video is a 10-minute walkthrough of the Langfuse features:_
<iframe width="700" height="394" src="https://www.youtube.com/embed/2E8iTvGo9Hs?si=i_mPeArwkWc5_4EO" title="10 min Walkthrough of Langfuse – Open Source LLM Observability, Evaluation, and Prompt Management" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

## Why LLM Observability?

- As language models become more prevalent, understanding their behavior and performance is important.
- **LLM observability** involves monitoring and understanding the internal states of an LLM application through its outputs.
- It is essential for addressing challenges such as:
- **Complex control flows** with repeated or chained calls, making debugging challenging.
- **Non-deterministic outputs**, adding complexity to consistent quality assessment.
- **Varied user intents**, requiring deep understanding to improve user experience.
- Building LLM applications involves intricate workflows, and observability helps in managing these complexities.

## Step 1: Set up Langfuse on Spaces

The Langfuse Hugging Face Space lets you get up and running with a deployed version of Langfuse in just a few clicks. Within a few minutes, you'll have the default Langfuse dashboard deployed and ready to connect to from your local machine.

<a href="https://huggingface.co/spaces/langfuse/langfuse-template-space">
<img src="https://huggingface.co/datasets/huggingface/badges/resolve/main/deploy-to-spaces-lg.svg" />
</a>

1.1. Create a [**new Hugging Face Space**](https://huggingface.co/new-space)
1.2. Select **Docker** as the Space SDK
1.3. Select **Langfuse** as the Space template
1.4. Enable **persistent storage** to ensure your Langfuse data is persisted across restarts
1.5. Change the **Environment Variables**:
   - `NEXTAUTH_SECRET`: Used to validate login session cookies. Generate a secret with at least 256 bits of entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment.
   - `SALT`: Used to salt hashed API keys. Generate a secret with at least 256 bits of entropy using `openssl rand -base64 32`. You should overwrite the default value here for a secure deployment.
   - `ENCRYPTION_KEY`: Used to encrypt sensitive data. Must be 256 bits (64 hex characters); generate via `openssl rand -hex 32`. You should overwrite the default value here for a secure deployment.
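For example, the three secrets can be generated locally (assuming `openssl` is available) and pasted into the Space settings:

```shell
# Generate a value for NEXTAUTH_SECRET (256 bits, base64-encoded)
openssl rand -base64 32
# Generate a value for SALT (256 bits, base64-encoded)
openssl rand -base64 32
# Generate a value for ENCRYPTION_KEY (256 bits, 64 hex characters)
openssl rand -hex 32
```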

![Clone the Langfuse Space](https://langfuse.com/images/cookbook/huggingface/huggingface-space-setup.png)

## Step 2: Instrument your Code

Now that you have Langfuse running, you can start instrumenting your LLM application to capture traces and manage your prompts.

### Example: Monitor your Gradio Application

We created a Gradio template Space that shows how to build a simple chat application using a Hugging Face model and trace model calls and user feedback in Langfuse, all without leaving Hugging Face.

<a href="https://huggingface.co/spaces/langfuse/gradio-example-template">
<img src="https://huggingface.co/datasets/huggingface/badges/resolve/main/deploy-to-spaces-lg.svg" />
</a>

To get started, clone the [Gradio template space](https://huggingface.co/spaces/langfuse/gradio-example-template) and follow the instructions in the [README](https://huggingface.co/spaces/langfuse/gradio-example-template/blob/main/README.md).

### Monitor Any Application

Langfuse is model agnostic and can be used to trace any application. Follow the [get-started guide](https://langfuse.com/docs) in the Langfuse documentation to see how you can instrument your code.

Langfuse maintains native integrations with many popular LLM frameworks, including [Langchain](https://langfuse.com/docs/integrations/langchain/tracing), [LlamaIndex](https://langfuse.com/docs/integrations/llama-index/get-started), and [OpenAI](https://langfuse.com/docs/integrations/openai/python/get-started), and offers Python and JS/TS SDKs to instrument your code. Langfuse also exposes API endpoints for data ingestion and has been integrated by other open-source projects such as [Langflow](https://langfuse.com/docs/integrations/langflow), [Dify](https://langfuse.com/docs/integrations/dify), and [Haystack](https://langfuse.com/docs/integrations/haystack/get-started).
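As a rough sketch of what these integrations do under the hood, the snippet below builds a request against Langfuse's public ingestion endpoint using only the Python standard library. The Space URL, API keys, and trace fields are placeholders, and the exact event schema should be checked against the Langfuse API reference; in practice you would use the official SDKs or a framework integration instead.

```python
import base64
import json
import urllib.request
import uuid
from datetime import datetime, timezone

# Placeholders: replace with your Space URL and your project's API keys.
LANGFUSE_HOST = "https://your-username-your-space.hf.space"
PUBLIC_KEY = "pk-lf-..."
SECRET_KEY = "sk-lf-..."

def build_trace_request(name: str, input_text: str, output_text: str) -> urllib.request.Request:
    """Build (but do not send) a batch ingestion request for a single trace."""
    event = {
        "id": str(uuid.uuid4()),      # unique event id, used for deduplication
        "type": "trace-create",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "body": {
            "id": str(uuid.uuid4()),  # trace id
            "name": name,
            "input": input_text,
            "output": output_text,
        },
    }
    # Langfuse uses HTTP Basic auth: public key as username, secret key as password.
    token = base64.b64encode(f"{PUBLIC_KEY}:{SECRET_KEY}".encode()).decode()
    return urllib.request.Request(
        url=f"{LANGFUSE_HOST}/api/public/ingestion",
        data=json.dumps({"batch": [event]}).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )

req = build_trace_request("hello-trace", "What is Langfuse?", "An LLM engineering platform.")
# urllib.request.urlopen(req) would submit the trace to your Space.
```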

## Step 3: View Traces in Langfuse

Once you have instrumented your application and ingested traces or user feedback into Langfuse, you can view them in the Langfuse UI.

![Example trace with Gradio](https://langfuse.com/images/cookbook/huggingface/huggingface-gradio-example-trace.png)

_[Example trace in the Langfuse UI](https://langfuse-langfuse-template-space.hf.space/project/cm4r1ajtn000a4co550swodxv/traces/9cdc12fb-71bf-4074-ab0b-0b8d212d839f?timestamp=2024-12-20T12%3A12%3A50.089Z&view=preview)_

## Additional Resources and Support

- [Langfuse documentation](https://langfuse.com/docs)
- [Langfuse GitHub repository](https://github.com/langfuse/langfuse)
- [Langfuse Discord](https://langfuse.com/discord)
- [Langfuse template Space](https://huggingface.co/spaces/langfuse/langfuse-template-space)

## Troubleshooting

If you encounter issues using the [Langfuse Gradio example template](https://huggingface.co/spaces/langfuse/gradio-example-template):

1. Make sure the app runs locally using `python app.py`
2. Check that all required packages are listed in `requirements.txt`
3. Check Space logs for any Python errors

For more help, ask a question in [GitHub Discussions](https://langfuse.com/discussions) or [open an issue](https://github.com/langfuse/langfuse/issues).