Detect hallucinations in LLM responses. Verify every claim against source documents using hybrid STS + NLI. Works with LangChain, LlamaIndex, or any RAG pipeline. pip install longtracer
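The hybrid STS + NLI idea described above can be sketched roughly as follows. This is an illustrative toy, not longtracer's actual API: `bow_vector` stands in for a real sentence-embedding (STS) model and `nli_entails` stands in for a real NLI entailment model, so both are hypothetical simplifications.

```python
from math import sqrt

def bow_vector(text):
    """Toy bag-of-words embedding; a stand-in for a real STS model."""
    vec = {}
    for tok in text.lower().split():
        vec[tok] = vec.get(tok, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse token-count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def nli_entails(premise, hypothesis):
    """Stand-in for an NLI model: treat the hypothesis as entailed
    only when every one of its tokens appears in the premise."""
    return set(hypothesis.lower().split()) <= set(premise.lower().split())

def verify(claim, sources, sts_threshold=0.5):
    """Hybrid check: STS retrieves the closest source sentence,
    then NLI confirms the claim is actually entailed by it."""
    best = max(sources, key=lambda s: cosine(bow_vector(claim), bow_vector(s)))
    if cosine(bow_vector(claim), bow_vector(best)) < sts_threshold:
        return False  # no sufficiently similar source: likely hallucinated
    return nli_entails(best, claim)
```

The two-stage design is the point: similarity alone flags unrelated text but misses subtle contradictions, while entailment alone is expensive to run against every source sentence, so STS narrows the candidates and NLI makes the final call.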
Updated Apr 21, 2026 - Python
The immune system for AI coding agents
Protect your AI from Prompt Injection