
fix: update default VLM model from deprecated gemini-2.0-flash to gemini-2.5-flash #45

Closed

haosenwang1018 wants to merge 1 commit into llmsresearch:main from haosenwang1018:fix/update-deprecated-gemini-model

Conversation

@haosenwang1018
Contributor

Summary

Updates the default VLM model from gemini-2.0-flash to gemini-2.5-flash across the entire codebase.

Problem

gemini-2.0-flash has been deprecated by Google and returns 404 NOT_FOUND for new users, breaking the pipeline out of the box.

Changes

  • paperbanana/core/config.py: Updated default model in VLMConfig and Settings
  • paperbanana/providers/vlm/gemini.py: Updated default parameter
  • configs/config.yaml and configs/provider/vlm/gemini.yaml: Updated defaults
  • tests/test_providers/test_registry.py: Updated test expectations
  • examples/generate_diagram.py: Updated example
  • README.md: Updated model reference in provider table

Existing users with VLM_MODEL env var or custom config are unaffected.
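The precedence claim above can be sketched as follows. This is a minimal illustration, not the project's actual code: the `VLMConfig` name and the `VLM_MODEL` env var come from the PR text, but the field layout and resolution order shown here are assumptions.

```python
import os
from dataclasses import dataclass, field

# New packaged default after this PR (was "gemini-2.0-flash").
DEFAULT_VLM_MODEL = "gemini-2.5-flash"

@dataclass
class VLMConfig:
    # An explicit VLM_MODEL env var wins over the shipped default,
    # so users who pinned a model are unaffected by the default change.
    model: str = field(
        default_factory=lambda: os.environ.get("VLM_MODEL", DEFAULT_VLM_MODEL)
    )

# Without VLM_MODEL set, the new default applies.
os.environ.pop("VLM_MODEL", None)
print(VLMConfig().model)  # gemini-2.5-flash

# With VLM_MODEL set, the pinned value is kept.
os.environ["VLM_MODEL"] = "my-pinned-model"
print(VLMConfig().model)  # my-pinned-model
```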

Fixes #44

…ini-2.5-flash

The gemini-2.0-flash model has been deprecated by Google and returns
a 404 for new users. Update all default references to gemini-2.5-flash
which is the current recommended replacement.

Fixes llmsresearch#44

Signed-off-by: haosenwang1018 <haosenwang1018@users.noreply.github.com>
@haosenwang1018 force-pushed the fix/update-deprecated-gemini-model branch from 18aa4ae to a169d75 on February 23, 2026 at 00:07
@dippatel1994
Member

Thanks @haosenwang1018 for the fix. The core change makes sense, and the issue linking looks good (Fixes #44).
One thing to align before merge: the PR description says configs/config.yaml, configs/provider/vlm/gemini.yaml, and README.md were updated, but those files are not part of the current changed file list.
Could you either:

  • Include those file updates in this PR, or
  • Update the PR description to match the current scope

Also, a small follow-up test for YAML-based config loading (for example, through Settings.from_yaml(...)) would help ensure defaults resolve to gemini-2.5-flash in config-driven paths too.

@dippatel1994
Member

@haosenwang1018 just noticed, #48 appears to supersede this PR with the same core fix plus config-file updates. To avoid duplicated work, can we consolidate into one PR? If you want to keep this PR as the canonical one, please include updates to configs/config.yaml and configs/provider/vlm/gemini.yaml. Otherwise, we can close this in favor of #48. Thanks for the contribution.

@haosenwang1018
Contributor Author

Closing this per backlog policy in favor of #48; happy to revisit with a narrower scope if needed.



Development

Successfully merging this pull request may close these issues.

[Bug]: Pipeline broken due to deprecated gemini-2.0-flash model
