TraceRoot is an open-source observability and self-healing layer for AI agents. Capture traces, debug with AI, and ship with confidence.
| Feature | Description |
|---|---|
| Tracing | Capture LLM calls, agent actions, and tool usage via OpenTelemetry-compatible SDK |
| Agentic Debugging | AI-native root cause analysis with GitHub integration and BYOK support |
- **Traces alone don't scale.** As AI agent systems grow more complex, manually sifting through traces is not sustainable. TraceRoot pairs structured observability with AI-powered analysis so you can pinpoint issues, not just see them.
- **Debugging AI agent systems is painful.** Root-causing failures across agent hallucinations, tool-call instabilities, and version changes is challenging. TraceRoot provides AI-native debugging that connects your traces to your code versions and bug history.
- **Fully open source, no vendor lock-in.** Both the observability platform and the AI debugging layer are open source, with BYOK support for any model provider, including OpenAI, Anthropic, Gemini, DeepSeek, and more.
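In practice, BYOK usually means the tool reads your own provider key from the environment and routes calls accordingly. A minimal sketch of that pattern (the provider table, env-var names, and `resolve_api_key` helper below are illustrative, not TraceRoot's actual configuration):

```python
import os

# Illustrative mapping of provider names to the env vars their
# official SDKs conventionally read (assumption, not TraceRoot config).
PROVIDER_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GOOGLE_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
}

def resolve_api_key(provider: str) -> str:
    """Return the user's own key for the chosen provider, or raise."""
    var = PROVIDER_KEY_VARS.get(provider)
    if var is None:
        raise ValueError(f"unknown provider: {provider}")
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"set {var} to use {provider} (BYOK)")
    return key
```

The point of the pattern is that keys never leave your environment: the platform only resolves which variable to read, so swapping model providers is a one-line config change.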
```shell
pip install traceroot openai
```

Add these to the `.env` file in the root directory:

```shell
TRACEROOT_API_KEY="tr-0f29d..."
TRACEROOT_HOST_URL="https://app.traceroot.ai"  # cloud (default)
# TRACEROOT_HOST_URL=http://localhost:8000     # local development mode
```

```python
import traceroot
from traceroot import Integration, observe
from openai import OpenAI

traceroot.initialize(integrations=[Integration.OPENAI])
client = OpenAI()

@observe(name="my_agent", type="llm")
def my_agent(query: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": query}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    my_agent("What's the weather in SF?")
```

See the Quickstart Guide for more examples.
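Under the hood, decorators like `@observe` follow a standard wrap-and-record pattern: capture a span's name, type, duration, and outcome around the call, then hand it to an exporter. A toy, stdlib-only sketch of that pattern (illustrative only, not TraceRoot's implementation):

```python
import functools
import time

SPANS = []  # a real SDK would hand spans to an OpenTelemetry exporter

def observe(name: str, type: str = "function"):
    """Record a span (name, type, duration, status) around each call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            status = "ok"
            try:
                return fn(*args, **kwargs)
            except Exception:
                status = "error"
                raise  # the span is still recorded via finally
            finally:
                SPANS.append({
                    "name": name,
                    "type": type,
                    "duration_s": time.perf_counter() - start,
                    "status": status,
                })
        return wrapper
    return decorator

@observe(name="my_agent", type="llm")
def my_agent(query: str) -> str:
    # Stand-in for an LLM call, so the sketch runs without a provider key.
    return f"echo: {query}"
```

Because the span is emitted in a `finally` block, failed calls are captured too, which is exactly what makes traces useful for root-cause analysis.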
The fastest way to get started: ample storage and LLM tokens for testing, with no credit card needed. Sign up here!
- **Developer mode:** Run TraceRoot locally to contribute.

```shell
# Get a copy of the latest repo
git clone https://github.com/traceroot-ai/traceroot.git
cd traceroot
# Host the infrastructure in Docker and run the app itself locally
make dev
```
For more details, see CONTRIBUTING.md.
- **Local Docker mode:** Run TraceRoot locally to test.

```shell
# Get a copy of the latest repo
git clone https://github.com/traceroot-ai/traceroot.git
cd traceroot
# Host everything in Docker
make prod
```
- **Terraform (AWS):** Run TraceRoot on Kubernetes with Helm and Terraform. This is for production hosting and is still experimental.
| Language | Repository |
|---|---|
| Python | traceroot-py |
Full documentation available at docs.traceroot.ai.
Your data security and privacy are our top priorities. Learn more in our Security and Privacy documentation.
Special thanks to the pi-mono project, which powers the foundation of our agentic debugging runtime!
Contributing 🤝: If you're interested in contributing, you can check out our guide here. All types of help are appreciated :)
Support 💬: If you need any type of support, we're typically most responsive on our Discord channel, but feel free to email us founders@traceroot.ai too!
This project is licensed under Apache 2.0, with additional Enterprise features.