This library aims to solve the same challenges for you by providing a resilient layer that intelligently manages failures and rate limits, enabling you to integrate LLMs confidently and effortlessly at scale.
## Scope
### What's in scope
- **Unified LLM Interface**: Simple, consistent API across multiple LLM providers (OpenAI, Anthropic, Google Gemini, Ollama)
- **Resilience Features**: Circuit breakers, adaptive retries with exponential backoff, and intelligent failure recovery (see the retry sketch after this list)
- **Rate Limiting**: Token bucket rate limiting with automatic token estimation and enforcement (see the token bucket sketch after this list)
- **Production Readiness**: Handling of network issues, API rate limits, timeouts, and server overload scenarios
- **Basic Chat Functionality**: Support for conversational chat interfaces and message history
- **Request Control**: AbortController support for on-demand request cancellation and timeouts (see the cancellation sketch after this list)
- **Error Recovery**: Dynamic response to API signals such as Retry-After headers and provider-specific error codes
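The **Resilience Features** and **Error Recovery** items describe a standard pattern: retry a failed call with exponentially growing, jittered delays, and prefer an explicit Retry-After hint when the provider supplies one. The sketch below only illustrates that pattern under assumed names; `retryWithBackoff`, `RetryOptions`, and the `retryAfterSec` field on the thrown error are hypothetical, not this library's API:

```typescript
// Illustrative sketch only; none of these names come from this library's public API.
interface RetryOptions {
  maxAttempts: number; // give up after this many tries
  baseDelayMs: number; // first backoff delay
  maxDelayMs: number;  // cap on the exponential growth
}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  { maxAttempts, baseDelayMs, maxDelayMs }: RetryOptions,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Prefer an explicit Retry-After hint (in seconds) if the provider sent one;
      // otherwise fall back to exponential backoff with full jitter.
      const retryAfterSec = (err as { retryAfterSec?: number }).retryAfterSec;
      const cappedBackoff = Math.min(maxDelayMs, baseDelayMs * 2 ** attempt);
      const delayMs = retryAfterSec !== undefined ? retryAfterSec * 1000 : Math.random() * cappedBackoff;
      await sleep(delayMs);
    }
  }
  throw lastError;
}
```

A circuit breaker complements this loop: after a run of consecutive failures it stops sending requests entirely, then lets a single probe through after a cool-down to check whether the provider has recovered.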
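Token bucket rate limiting (the **Rate Limiting** item above) can be pictured as a bucket that refills at a steady rate and from which each outgoing request must withdraw roughly as many tokens as it is expected to consume. Below is a minimal, self-contained sketch with an intentionally crude length-based token estimate; the class, the numbers, and the heuristic are illustrative, not taken from this library:

```typescript
// Minimal token bucket; capacity bounds bursts, refillPerSecond bounds the sustained rate.
class TokenBucket {
  private tokens: number;
  private lastRefill = Date.now();

  constructor(
    private readonly capacity: number,
    private readonly refillPerSecond: number,
  ) {
    this.tokens = capacity;
  }

  private refill(): void {
    const now = Date.now();
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSecond);
    this.lastRefill = now;
  }

  /** Consumes `cost` tokens if available and reports whether the request may proceed. */
  tryConsume(cost: number): boolean {
    this.refill();
    if (this.tokens >= cost) {
      this.tokens -= cost;
      return true;
    }
    return false;
  }
}

// Usage: estimate the request's token cost (here: ~4 characters per token) and
// only send the request when the bucket can cover that cost.
const bucket = new TokenBucket(60_000, 1_000);
const prompt = "Summarize this document in three sentences.";
const estimatedTokens = Math.ceil(prompt.length / 4);
if (bucket.tryConsume(estimatedTokens)) {
  // safe to send the request
}
```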
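The **Request Control** item relies on the standard AbortController API available in modern browsers and Node.js: the caller holds a controller, passes its signal along with the outgoing request, and aborts it either on demand or from a timer. A generic sketch using `fetch` follows; the URL and request body are placeholders, not a real provider endpoint:

```typescript
// Generic timeout-plus-cancellation pattern; the endpoint and payload are placeholders.
async function chatWithTimeout(prompt: string, timeoutMs: number): Promise<string> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const response = await fetch("https://example.invalid/v1/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ prompt }),
      // The same signal also lets a caller cancel the request on demand.
      signal: controller.signal,
    });
    return await response.text();
  } finally {
    clearTimeout(timer); // release the timer once the request settles
  }
}
```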
### What's not in scope
- **Complex LLM Orchestration**: Advanced workflows, chains, or multi-step LLM interactions (use LangChain or similar for complex use cases)
- **Multi-modal Support**: Image, audio, or video processing capabilities
- **Tool/Function Calling**: Advanced function calling or tool integration features
- **Streaming Responses**: Real-time streaming of LLM responses