A **100% local-first**, model-agnostic orchestration framework for zero-error, hyper-efficient LLM systems.
HANERMA eliminates hallucinations, prevents error propagation through atomic guard levels, and enables infinite context via a hyperfast compressed memory store (HCMS). Built for developers, optimized for production. **No mandatory API keys. No vendor lock-in.**

* **Zero Error Propagation:** Built-in circuit breakers prevent hallucinations from cascading across agents.
* **Hyperfast Infinite Context:** O(1) retrieval from our custom Graph-Vector DB via token-compression adapters (e.g., XERV CRAYON).
* **100% Model Agnostic:** Seamlessly route between **Local (Ollama)**, **HuggingFace**, **OpenRouter (300+ models)**, or any OpenAI-compatible endpoint.
* **Real-Time Streaming:** Native FastAPI WebSocket support for live thought-streaming to UI frontends.
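
To make the model-agnostic routing idea concrete, here is a minimal sketch of prefix-based backend selection across OpenAI-compatible endpoints. The `BACKENDS` table, the prefixes, and the `resolve_backend` helper are illustrative assumptions for this README, not HANERMA's actual API:

```python
# Illustrative sketch: route a request to one of several
# OpenAI-compatible backends based on a model-name prefix.
# The table below is an assumption, not HANERMA's real config.
BACKENDS = {
    "ollama/": "http://localhost:11434/v1",         # local Ollama
    "hf/": "https://api-inference.huggingface.co",  # HuggingFace
    "openrouter/": "https://openrouter.ai/api/v1",  # 300+ hosted models
}

def resolve_backend(model: str,
                    default: str = "http://localhost:11434/v1") -> tuple[str, str]:
    """Return (base_url, bare_model_name) for a prefixed model id."""
    for prefix, base_url in BACKENDS.items():
        if model.startswith(prefix):
            return base_url, model[len(prefix):]
    # Anything unprefixed falls back to a default OpenAI-compatible endpoint.
    return default, model

base_url, model_name = resolve_backend("ollama/llama3")
```

Because every backend speaks the same OpenAI-compatible wire format, the rest of the pipeline never needs to know which provider served the request.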
## 📦 Installation
HANERMA is available immediately. **No mandatory API keys required** for local execution.
```bash
pip install hanerma
```
## 🛠️ Quickstart (100% Local)
```bash
# 1. Clone & copy the env template
git clone https://github.com/hanerma/hanerma.git
cd hanerma
cp .env.example .env
# 2. Spin up the full stack (API + Neo4j + Redis + Ollama)
docker-compose up -d
```

Your multi-agent REST API and WebSocket streaming endpoints are now live on `localhost:8000`.

See `/docs/benchmarks/performance.md` for full reproduction steps.
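
Once the stack is up, a client can tail the live thought-stream over WebSocket. The `/ws` endpoint path and the JSON event shape below are illustrative assumptions, not a documented HANERMA contract:

```python
import json

def parse_event(raw: str) -> str:
    """Pull the text delta out of one streamed JSON event (assumed schema)."""
    event = json.loads(raw)
    return event.get("delta", "")

# With the stack running, you might connect like this (uses the
# third-party `websockets` package; the /ws path is hypothetical):
#
#   import asyncio, websockets
#
#   async def tail() -> None:
#       async with websockets.connect("ws://localhost:8000/ws") as ws:
#           async for raw in ws:
#               print(parse_event(raw), end="", flush=True)
#
#   asyncio.run(tail())

print(parse_event('{"delta": "thinking..."}'))
```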
## 🤝 Contributing
We welcome contributions! Please see our `CONTRIBUTING.md` for details.