
thinking_level not accepted when initializing Gemini 3 via init_chat_model() #1366

@Gu-Haojia


Package (Required)

  • [x] langchain-google-genai
  • [ ] langchain-google-vertexai
  • [ ] langchain-google-community
  • [ ] Other / not sure / general

Checked other resources

  • [x] I added a descriptive title to this issue
  • [x] I searched the LangChain documentation and API reference (linked above)
  • [x] I used the GitHub search to find a similar issue and didn't find it
  • [x] I am sure this is a bug and not a question or request for help

Example Code (Python)

from langchain.chat_models import init_chat_model

model = init_chat_model(
    "google_genai:gemini-3-pro-preview",
    thinking_level="low",  # <-- new field from Gemini docs
)

model.invoke("Hello")  # fails; see the error below

Error Message and Stack Trace (if applicable)

The model call returns: Unknown field for ThinkingConfig: thinking_level

Description

According to Google’s latest Gemini thinking docs (https://ai.google.dev/gemini-api/docs/thinking), thinking_level is a supported parameter alongside thinking_budget. When we attempt to pass thinking_level="low" through LangChain’s init_chat_model, the config validation rejects it with “Unknown field for ThinkingConfig: thinking_level”. This prevents us from using the new low/high thinking presets while still leveraging LangChain’s tooling/graph integrations.
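For context, the Gemini thinking docs place the new preset under the request's thinking config. A minimal sketch of that request body is below; the camelCase field names (`thinkingConfig`, `thinkingLevel`) are my reading of the REST docs and should be treated as an assumption, not verified against the live API:

```python
# Sketch of the request body shape described in the Gemini thinking docs.
# Field names are assumptions based on the docs linked above.
request_body = {
    "contents": [{"parts": [{"text": "Explain quicksort briefly."}]}],
    "generationConfig": {
        "thinkingConfig": {
            # New preset-style control ("low" / "high"), as opposed to the
            # older numeric thinkingBudget field.
            "thinkingLevel": "low",
        }
    },
}
```

The point of the sketch is that `thinking_level` sits alongside (not inside) `thinking_budget` in the upstream API, so LangChain's `ThinkingConfig` mapping needs a new field rather than a new value.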

Environment: langchain==1.0.7, langchain-core==1.0.5, langchain-google-genai==3.1.0.
Expectation: LangChain should either accept thinking_level and forward it to Gemini, or clearly document that only the legacy thinking_budget-style fields are supported.
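Until thinking_level is forwarded, one possible stopgap is to translate the presets into the legacy budget field yourself. This sketch assumes langchain-google-genai accepts a thinking_budget parameter, and the preset-to-token mapping below is purely illustrative (the values are not taken from Google's docs):

```python
# Hypothetical preset-to-budget mapping; the token counts are made up for
# illustration and are NOT Google's official equivalents of low/high.
_THINKING_BUDGETS = {"low": 1024, "high": 24576}

def thinking_level_to_budget(level: str) -> int:
    """Map a Gemini 3 thinking_level preset onto a legacy thinking_budget
    token count (illustrative values only)."""
    try:
        return _THINKING_BUDGETS[level]
    except KeyError:
        raise ValueError(f"unknown thinking_level: {level!r}") from None

# Usage with init_chat_model (assumes thinking_budget is accepted by
# langchain-google-genai; untested here):
#
# from langchain.chat_models import init_chat_model
# model = init_chat_model(
#     "google_genai:gemini-3-pro-preview",
#     thinking_budget=thinking_level_to_budget("low"),
# )
```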

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working), genai
