- Ensure the bug was not already reported by searching on GitHub under Issues.
- If you're unable to find an open issue addressing the problem, open a new one. Include a title, a clear description, relevant information, and a code sample demonstrating the issue.
- Verify it's a RubyLLM bug, not your application code, before opening an issue.
- Open a new GitHub pull request with the patch.
- Ensure the PR description clearly describes the problem and solution. Include the relevant issue number if applicable.
- Run `overcommit --install` before committing - it handles code style and tests automatically.
First check if this belongs in RubyLLM or your application:
- ✅ Core LLM communication (provider integrations, streaming, cost tracking)
- ❌ Application architecture (RAG, agents, prompt templates, testing helpers)
Features we'll reject:
- Multi-agent orchestration
- RAG pipelines
- Prompt management systems
- Vector database integrations
- Testing frameworks
- Anything you can implement in 5-10 lines of application code
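To make the last point concrete, here is a hedged sketch of what "5-10 lines of application code" means - a hypothetical prompt "template system" in plain Ruby. The helper name and template are invented for illustration; nothing here is part of RubyLLM's API:

```ruby
# A hypothetical prompt template helper - plain Ruby, no library support needed.
# Ruby's format strings already handle named placeholders like %{name}.
def render_prompt(template, vars)
  template % vars
end

prompt = render_prompt(
  "Summarize the following text in %{style} style:\n\n%{text}",
  style: "bullet-point",
  text:  "RubyLLM is a Ruby library for talking to LLMs."
)
```

If a feature fits in a helper like this, it belongs in your application, not in the library.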
Start by opening an issue to discuss the feature and its design. We want to keep RubyLLM simple and focused.
```bash
gh repo fork crmne/ruby_llm --clone && cd ruby_llm
bundle install
overcommit --install              # Required - sets up git hooks
gh issue develop 123 --checkout   # or create your own branch
# make changes, add tests
overcommit --run
gh pr create --web

# Re-recording VCR cassettes (requires API keys):
rake vcr:record[openai,anthropic] # Specific providers
rake vcr:record[all]              # Everything
```

Always check cassettes for leaked API keys before committing.
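One way to check cassettes is a tiny scanner you run before committing. This is a sketch, not part of the repo's tooling: the cassette path and the key patterns (OpenAI- and Anthropic-style prefixes) are assumptions, and the pattern list is not exhaustive.

```ruby
# Hypothetical pre-commit check: scan VCR cassette files for strings that
# look like API keys. Patterns are illustrative, not exhaustive.
KEY_PATTERNS = [
  /sk-[A-Za-z0-9_-]{20,}/,     # OpenAI-style secret keys
  /sk-ant-[A-Za-z0-9_-]{20,}/, # Anthropic-style secret keys
].freeze

def leaked_keys(text)
  KEY_PATTERNS.flat_map { |re| text.scan(re) }.uniq
end

# Assumed cassette location - adjust to wherever your cassettes live:
Dir.glob("spec/fixtures/vcr_cassettes/**/*.yml") do |path|
  hits = leaked_keys(File.read(path))
  warn "Possible leaked keys in #{path}: #{hits.inspect}" unless hits.empty?
end
```

A grep over the cassette directory accomplishes the same thing; the point is to look before you push.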
- Never edit `models.json`, `aliases.json`, or `available-models.md` - they're auto-generated by `rake models`
- Write tests for any new functionality
- Keep it simple - if it needs extensive documentation, reconsider the approach
- Model data comes from Parsera. First, go say thanks for their free service to the LLM dev community - they scrape LLM documentation and make it available to all of us in JSON. Second, file model data issues with them.
This is my gift to the Ruby community.
Gifts don't come with SLAs. I respond when I can.
If RubyLLM helps you, consider sponsoring.
Sponsorship is just a way to say thanks - it doesn't buy priority support or feature requests.
Go ship AI apps!
— Carmine