add support for Gemini 2.5 Flash model and update README #175
Conversation
Force-pushed: 562cc4d to 0ef1400
Signed-off-by: JR <[email protected]>
@luarss I don't think Gemini 1.0 Pro, 1.5 Pro, or 1.5 Flash are available anymore. In the Gemini API (via Google AI Studio or Vertex AI), these models are no longer listed as current or supported options.
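The retired models called out above can be guarded against with a simple allow/deny check before a request is sent. This is an illustrative sketch, not the project's actual API; the helper name and the retired-model list merely mirror the comment above, not an authoritative deprecation registry.

```python
# Illustrative sketch: reject Gemini model names that the comment above
# says are no longer served. The set reflects that comment only.
RETIRED_GEMINI_MODELS = {
    "gemini-1.0-pro",
    "gemini-1.5-pro",
    "gemini-1.5-flash",
}

def is_supported(model_name: str) -> bool:
    """Return False for model names known (per the comment) to be retired."""
    return model_name not in RETIRED_GEMINI_MODELS

print(is_supported("gemini-2.5-flash"))  # True
print(is_supported("gemini-1.5-pro"))    # False
```

In practice the authoritative check is the Gemini API's own model listing endpoint; a static set like this only catches known-stale configuration early.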
luarss left a comment
Thanks for the contribution!
Signed-off-by: JR <[email protected]>
Signed-off-by: Song Luar <[email protected]>
luarss left a comment
LGTM
add support for Gemini 2.5 Flash model and update README (…D-Project#175)

* add support for Gemini 2.5 Flash model and update README
* Add support for Gemini 2.0-flash and 2.5-pro models
* Update backend/src/api/routers/graphs.py
* Update README.md

---------

Signed-off-by: JR <[email protected]>
Co-authored-by: Song Luar <[email protected]>
Signed-off-by: Jack Luar <[email protected]>
Introduces support for the Gemini 2.5 Flash model in our LLM integration and updates the README to guide users on which environment variables to set for each Gemini model.
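The environment-variable-driven model selection the description mentions can be sketched roughly as below. The variable names (`GEMINI_MODEL`) and the supported-model set are assumptions for illustration only; the project's README documents the actual variables to set.

```python
import os

# Hypothetical sketch of environment-variable model selection.
# GEMINI_MODEL and the set below are illustrative assumptions,
# not the project's real configuration surface.
SUPPORTED_MODELS = {"gemini-2.5-flash", "gemini-2.5-pro", "gemini-2.0-flash"}

def resolve_model(default: str = "gemini-2.5-flash") -> str:
    """Pick the Gemini model from the environment, falling back to a default."""
    model = os.environ.get("GEMINI_MODEL", default)
    if model not in SUPPORTED_MODELS:
        raise ValueError(f"Unsupported Gemini model: {model!r}")
    return model

os.environ["GEMINI_MODEL"] = "gemini-2.0-flash"
print(resolve_model())  # gemini-2.0-flash
```

Failing fast on an unrecognized model name surfaces configuration typos at startup rather than as opaque API errors at request time.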