Summary
langmem currently ships with langchain-openai and langchain-anthropic as dependencies, but has no built-in support or documentation for using GCP Vertex AI models via langchain-google-vertexai.
Many production workloads run on GCP and use Vertex AI as their LLM gateway. Since langmem's internal functions use init_chat_model() from langchain, Vertex AI models should work when passing a string like "google_vertexai:gemini-2.0-flash" — but this is neither documented nor tested.
Request
- Add `langchain-google-vertexai` as an optional dependency:

```toml
[project.optional-dependencies]
gcp = ["langchain-google-vertexai>=2.0.0"]
```
- Add documentation / examples showing how to use Vertex AI models with langmem
- Validate that core features (memory extraction, prompt optimization, etc.) work with `ChatVertexAI`
Example usage

```python
from langchain_google_vertexai import ChatVertexAI
from langmem import create_memory_store_manager

model = ChatVertexAI(model="gemini-2.0-flash")
manager = create_memory_store_manager(model)
```

or via string identifier:

```python
manager = create_memory_store_manager("google_vertexai:gemini-2.0-flash")
```