@nvuillam nvuillam released this 25 Dec 00:42
· 2 commits to main since this release

What's Changed

  • Generate an AI-powered summary of dependent repositories using LiteLLM
    • New options:
      • --llm-summary / --no-llm-summary: Enable or disable the AI-generated summary (default: enabled)
      • --llm-model: Specify the LiteLLM model to use (default: environment variable or default model)
      • --llm-max-repos: Cap the number of repositories sent to the summary prompt (default: 500)
      • --llm-max-words: Cap the length of the generated summary (default: 300)
      • --llm-timeout: Set timeout (in seconds) for the LLM call (default: 120)
  • Upgrade dependencies
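
The new `--llm-*` options can be combined on one command line. A minimal sketch follows; the command name (`github-dependents-info`), the `--repo` flag, and the repository/model values are assumptions for illustration only — just the `--llm-*` options come from this changelog.

```shell
# Hypothetical invocation: command name, --repo flag, and values are
# illustrative assumptions, not confirmed by these release notes.
github-dependents-info --repo some-org/some-repo \
  --llm-summary \
  --llm-model "gpt-4o-mini" \
  --llm-max-repos 500 \
  --llm-max-words 300 \
  --llm-timeout 120

# Passing --no-llm-summary instead skips the AI-generated summary entirely.
github-dependents-info --repo some-org/some-repo --no-llm-summary
```

Per the option descriptions above, omitting every `--llm-*` flag is equivalent to the first invocation, since the summary is enabled by default and the caps default to 500 repositories, 300 words, and a 120-second timeout.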
Pull Requests

New Contributors

Full Changelog: v2.0.2...v3.0.0