
feat(vertex_ai): add Gemini 3.1 Pro Preview model support#2604

Closed
Eric-Geek wants to merge 0 commits into langgenius:main from Eric-Geek:main

Conversation

@Eric-Geek
Contributor

Related Issues or Context

This PR contains Changes to Non-Plugin

  • Documentation
  • Other

This PR contains Changes to Non-LLM Models Plugin

  • I have Run Comprehensive Tests Relevant to My Changes

This PR contains Changes to LLM Models Plugin

  • My Changes Affect Message Flow Handling (System Messages and User→Assistant Turn-Taking)
  • My Changes Affect Tool Interaction Flow (Multi-Round Usage and Output Handling, for both Agent App and Agent Node)
  • My Changes Affect Multimodal Input Handling (Images, PDFs, Audio, Video, etc.)
  • My Changes Affect Multimodal Output Generation (Images, Audio, Video, etc.)
  • My Changes Affect Structured Output Format (JSON, XML, etc.)
  • My Changes Affect Token Consumption Metrics
  • My Changes Affect Other LLM Functionalities (Reasoning Process, Grounding, Prompt Caching, etc.)
  • Other Changes (Add New Models, Fix Model Parameters etc.)

Version Control (Any Changes to the Plugin Will Require Bumping the Version)

  • I have Bumped Up the Version in Manifest.yaml (Top-Level Version Field, Not in Meta Section)
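
A minimal illustration of the distinction this item refers to, with placeholder version numbers and all other manifest keys omitted:

    # manifest.yaml
    version: 0.0.12     # the top-level field to bump on every plugin change
    meta:
      version: 0.0.1    # a different version field inside meta; not the one this checklist means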

Dify Plugin SDK Version

  • I have Ensured dify_plugin>=0.3.0,<0.6.0 is in requirements.txt (SDK docs)

Environment Verification (If Any Code Changes)

Local Deployment Environment

  • Dify Version is: , I have Tested My Changes on Local Deployment Dify with a Clean Environment That Matches the Production Configuration.

SaaS Environment

  • I have Tested My Changes on cloud.dify.ai with a Clean Environment That Matches the Production Configuration

@dosubot added the size:L label (This PR changes 100-499 lines, ignoring generated files) on Feb 20, 2026
@gemini-code-assist
Contributor

Summary of Changes

Hello @Eric-Geek, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request adds support for the new Gemini 3.1 Pro Preview model from Vertex AI. The new model configuration exposes multi-tool calls, agent thoughts, multimodal inputs (vision, document, video, audio), structured output, and fine-grained controls for reasoning and media processing.

Highlights

  • New Model Integration: Introduced support for the Gemini 3.1 Pro Preview model from Vertex AI, enabling its use within the platform.
  • Model Configuration: Added a dedicated YAML configuration file for Gemini 3.1 Pro Preview, defining its features, properties, and parameter rules, including multimodal capabilities and advanced reasoning controls.
  • Model Registration: Updated the llm.py file to include 'gemini-3.1-pro-preview' in the list of globally available models.


Changelog
  • models/vertex_ai/models/llm/gemini-3.1-pro-preview.yaml
    • Added a new YAML configuration file for the Gemini 3.1 Pro Preview model (a sketch of the file's overall shape follows this changelog).
    • Defined model features including tool-call, multi-tool-call, agent-thought, vision, stream-tool-call, document, video, audio, and structured-output.
    • Configured model properties such as chat mode and a context size of 1048576.
    • Specified parameter rules for temperature, top_p, top_k, max_output_tokens, thinking_level, include_thoughts, media_resolution, grounding, url_context, and code_execution.
    • Set pricing for input and output tokens.
  • models/vertex_ai/models/llm/llm.py
    • Added "gemini-3.1-pro-preview" to the GLOBAL_ONLY_MODELS_DEFAULT list.
Activity
  • No human activity has been recorded on this pull request yet.

@dosubot added the enhancement label (New feature or request) on Feb 20, 2026

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request adds support for the Gemini 3.1 Pro Preview model. The changes include a new YAML configuration file for the model and an update to the list of global-only models in the Python code. The changes are correct. I've added a couple of suggestions to improve code quality and consistency.

Comment on lines 22 to 24 of models/vertex_ai/models/llm/gemini-3.1-pro-preview.yaml
default: 1
min: 0
max: 2

Severity: medium

The type for temperature is defined as float, but the default, min, and max values are provided as integers. To ensure type consistency and prevent potential parsing issues, it's better to use float literals for these values.

    default: 1.0
    min: 0.0
    max: 2.0
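
Worth noting for reviewers: YAML parses a bare 1 as an integer and 1.0 as a float, so with the rule's type declared as float, the integer literals are the inconsistent part. With the suggestion applied, the temperature rule would read roughly as follows (the name and type lines are inferred from the review comment rather than copied from the diff):

    - name: temperature
      type: float     # declared type, per the review comment above
      default: 1.0    # 1.0 parses as a YAML float; a bare 1 parses as an integer
      min: 0.0
      max: 2.0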

Comment on lines 49 to 53 of models/vertex_ai/models/llm/llm.py
GLOBAL_ONLY_MODELS_DEFAULT = ["gemini-2.5-pro-preview-06-05", "gemini-2.5-flash-lite-preview-06-17",
"gemini-2.5-flash-preview-09-2025", "gemini-2.5-flash-lite-preview-09-2025",
"gemini-3-pro-preview", "gemini-3-flash-preview",
"gemini-3.1-pro-preview",
"gemini-2.5-computer-use-preview-10-2025"]

Severity: medium

For better readability and easier maintenance, it's a good practice to keep this list of models sorted alphabetically. Formatting it with one model per line also improves readability.

Suggested change
GLOBAL_ONLY_MODELS_DEFAULT = ["gemini-2.5-pro-preview-06-05", "gemini-2.5-flash-lite-preview-06-17",
"gemini-2.5-flash-preview-09-2025", "gemini-2.5-flash-lite-preview-09-2025",
"gemini-3-pro-preview", "gemini-3-flash-preview",
"gemini-3.1-pro-preview",
"gemini-2.5-computer-use-preview-10-2025"]
GLOBAL_ONLY_MODELS_DEFAULT = [
"gemini-2.5-computer-use-preview-10-2025",
"gemini-2.5-flash-lite-preview-06-17",
"gemini-2.5-flash-lite-preview-09-2025",
"gemini-2.5-flash-preview-09-2025",
"gemini-2.5-pro-preview-06-05",
"gemini-3-flash-preview",
"gemini-3-pro-preview",
"gemini-3.1-pro-preview",
]

@WH-2099
Member

WH-2099 commented Feb 20, 2026

Hi, I'm following up on support for the Gemini 3.1 series. As you can see, I added the basic configuration for the Gemini plugin in #2606. To avoid ambiguity, I want our YAML configurations to be as consistent as possible.

Currently, the main difference between our two implementations lies in the specific configuration values of temperature, top_p, and top_k. I plan to organize and integrate these later today. If you have any ideas, please feel free to contact me.

@Eric-Geek
Contributor Author

> Hi, I'm following up on support for the Gemini 3.1 series. As you can see, I added the basic configuration for the Gemini plugin in #2606. To avoid ambiguity, I want our YAML configurations to be as consistent as possible.
>
> Currently, the main difference between our two implementations lies in the specific configuration values of temperature, top_p, and top_k. I plan to organize and integrate these later today. If you have any ideas, please feel free to contact me.

👌

@Eric-Geek closed this on Feb 21, 2026
@dosubot added the size:XS label (This PR changes 0-9 lines, ignoring generated files) and removed the size:L label on Feb 21, 2026