Add GCP Vertex AI model provider support #144

@ystarikovich

Summary

langmem currently ships with langchain-openai and langchain-anthropic as dependencies, but has no built-in support for, or documentation of, GCP Vertex AI models via langchain-google-vertexai.

Many production workloads run on GCP and use Vertex AI as their LLM gateway. Since langmem's internal functions use langchain's init_chat_model(), Vertex AI models should already work when passing a string like "google_vertexai:gemini-2.0-flash" — but this is neither documented nor tested.

Request

  • Add langchain-google-vertexai as an optional dependency:
    [project.optional-dependencies]
    gcp = ["langchain-google-vertexai>=2.0.0"]
  • Add documentation / examples showing how to use Vertex AI models with langmem
  • Validate that core features (memory extraction, prompt optimization, etc.) work with ChatVertexAI

Example usage

from langchain_google_vertexai import ChatVertexAI
from langmem import create_memory_store_manager

model = ChatVertexAI(model="gemini-2.0-flash")
manager = create_memory_store_manager(model)

Or via a string identifier:

manager = create_memory_store_manager("google_vertexai:gemini-2.0-flash")
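The string form relies on langchain's "provider:model" identifier convention: the text before the first colon selects the provider integration, and the remainder names the model. A minimal sketch of that parsing (the helper `split_model_identifier` is hypothetical, for illustration only — it is not part of langchain or langmem):

```python
# Hypothetical helper illustrating how a "provider:model" identifier
# string is interpreted: the segment before the first colon is the
# provider prefix, the rest is the model name.
def split_model_identifier(identifier: str) -> tuple[str, str]:
    provider, _, model = identifier.partition(":")
    return provider, model

print(split_model_identifier("google_vertexai:gemini-2.0-flash"))
# → ('google_vertexai', 'gemini-2.0-flash')
```

So if init_chat_model recognizes the "google_vertexai" prefix and langchain-google-vertexai is installed, the string identifier and the explicit ChatVertexAI instance should be interchangeable.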
