fix: update default VLM model from gemini-2.0-flash to gemini-2.5-flash #48
Open
haruyuki-oda wants to merge 1 commit into llmsresearch:main from
Conversation
`gemini-2.0-flash` has been discontinued by Google and returns `404 NOT_FOUND` for new users. Update all default references to `gemini-2.5-flash` so that `paperbanana generate` works out of the box.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
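For context, a minimal sketch of what the default change in `paperbanana/core/config.py` might look like. The field names `VLMConfig.model` and `Settings.vlm_model` come from this PR's file list, but the dataclass structure shown here is an assumption, not the repo's actual code.

```python
from dataclasses import dataclass

# New default after this PR; the old value "gemini-2.0-flash" now 404s.
DEFAULT_VLM_MODEL = "gemini-2.5-flash"

@dataclass
class VLMConfig:
    # Default VLM used when no model is specified via config or CLI.
    model: str = DEFAULT_VLM_MODEL

@dataclass
class Settings:
    # Top-level default, kept in sync with VLMConfig.model.
    vlm_model: str = DEFAULT_VLM_MODEL
```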
Author
I found that the following PR may duplicate this one. Please feel free to check and close it.
Member
#45 seems to address the same issue. I have asked the author to close that PR, and then I will merge this one. Thanks for your contribution.
fix: update default VLM model from gemini-2.0-flash to gemini-2.5-flash
Summary
- `gemini-2.0-flash` has been discontinued by Google and returns `404 NOT_FOUND` for new users
- Update all default references to `gemini-2.5-flash` so that `paperbanana generate` works out of the box

Changes
- `paperbanana/core/config.py`: `VLMConfig.model` and `Settings.vlm_model`
- `paperbanana/providers/vlm/gemini.py`: `GeminiVLM.__init__`
- `configs/config.yaml`
- `configs/provider/vlm/gemini.yaml`
- `examples/generate_diagram.py`
- `tests/test_providers/test_registry.py`

Reproduction
Test plan
- `paperbanana generate --input method.txt --caption 'Overview of our framework'` completes with 3 iterations
- `pytest tests/` passes (53/54 pass; 1 pre-existing failure unrelated to this change)
- `--vlm-model` CLI override still works for users who explicitly specify a model (see the sketch below)
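To illustrate how the default and the CLI override interact, here is a hypothetical sketch of the fallback in `GeminiVLM.__init__` (the method named in the Changes list); the real signature in `paperbanana/providers/vlm/gemini.py` may differ.

```python
class GeminiVLM:
    def __init__(self, model: str | None = None):
        # An explicitly passed model (e.g. via --vlm-model) always wins;
        # otherwise fall back to the new default introduced by this PR.
        self.model = model or "gemini-2.5-flash"
```

Under this sketch, `GeminiVLM()` resolves to `gemini-2.5-flash`, while `GeminiVLM(model=...)` keeps the caller's explicit choice, matching the third test-plan item.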