erh_llm is a Rust library and toolkit for working with large language models (LLMs) in retrieval-augmented generation (RAG) and conversational AI scenarios. It provides modules for prompt engineering, document handling, and memory management, supporting both in-memory and database-backed chat histories. The repository helps developers build, experiment with, and deploy LLM-powered applications such as chatbots and RAG systems, with a focus on extensibility and integration.
- Modular architecture for LLM interaction and RAG workflows
- Prompt engineering utilities
- Document and context management
- Pluggable memory/history backends (in-memory, MySQL, SQLite)
- Example code and documentation for a quick start
See the documentation and example code in the repository for usage instructions and integration tips.