
Conversation

@RayChang987
Contributor

Introduces support for the Gemini 2.5 Flash model in our LLM integration and updates the README to guide users on which environment variables to set for each Gemini model.
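A minimal sketch of what this kind of model selection might look like. The environment variable name (`GEMINI_MODEL`) and the helper function are illustrative assumptions, not the actual names documented in the ORAssistant README:

```python
import os

# Hypothetical sketch: map an environment variable to a Gemini model
# name, with validation against the models this PR adds support for.
# Variable and function names here are illustrative only.
SUPPORTED_GEMINI_MODELS = {
    "gemini-2.0-flash",
    "gemini-2.5-flash",
    "gemini-2.5-pro",
}

def resolve_gemini_model(default: str = "gemini-2.5-flash") -> str:
    """Pick the Gemini model from the environment, falling back to a default."""
    model = os.environ.get("GEMINI_MODEL", default)
    if model not in SUPPORTED_GEMINI_MODELS:
        raise ValueError(f"Unsupported Gemini model: {model!r}")
    return model

if __name__ == "__main__":
    print(resolve_gemini_model())
```

Centralizing the allowed-model set like this keeps the README's guidance and the backend's validation in one place, so adding a new model is a one-line change.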

@RayChang987 RayChang987 force-pushed the master branch 2 times, most recently from 562cc4d to 0ef1400 Compare October 12, 2025 06:26
@RayChang987
Contributor Author

@luarss I don't think Gemini 1.0 Pro, 1.5 Pro, or 1.5 Flash are available anymore. In the Gemini API (via Google AI Studio or Vertex AI), these models are no longer listed as current or supported options.

Collaborator

@luarss luarss left a comment


Thanks for the contribution!

Collaborator

@luarss luarss left a comment


LGTM

@luarss luarss merged commit 185deda into The-OpenROAD-Project:master Oct 18, 2025
2 checks passed
luarss added a commit to luarss/ORAssistant that referenced this pull request Oct 18, 2025
…D-Project#175)

* add support for Gemini 2.5 Flash model and update README

* Add support for Gemini 2.0-flash and 2.5-pro models

* Update backend/src/api/routers/graphs.py

* Update README.md

---------

Signed-off-by: JR <[email protected]>
Co-authored-by: Song Luar <[email protected]>
Signed-off-by: Jack Luar <[email protected]>

