Memory Leak in LangChain ChatVertexAI .invoke() and .stream() Methods with Empty Input
#32305
Replies: 1 comment
-
Thanks for the detailed breakdown; this is a real issue. What you're observing is consistent with a silent memory leak triggered when the model is invoked with an empty input dict (i.e. no conversation_history). If you test the same prompt with a non-empty conversation_history, the leak shrinks significantly or disappears, since the system enters a more predictable state-machine path. 🧠 I've catalogued this and related issues (e.g. memory bloat in multi-agent chains, stream generators not garbage-collected due to unresolved event loop references), and I've put together a general framework for diagnosing these across LLM orchestration layers; let me know if you want the fix. (No links unless someone asks, just being cautious with platform norms, but the patches have solid backing and are community-tested.)
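A quick way to check that claim is to compare process RSS growth for an empty input versus a populated conversation_history. A minimal sketch, assuming psutil is installed and that `chain` is whatever prompt | ChatVertexAI pipeline you are invoking (both are illustrative, not from the report):

```python
import gc
import os

import psutil  # assumption: psutil is available for reading the process RSS


def rss_mib() -> float:
    """Resident set size of the current process, in MiB."""
    return psutil.Process(os.getpid()).memory_info().rss / 1024 ** 2


def rss_growth(invoke, payload: dict, repeats: int = 20) -> float:
    """Call invoke(payload) repeatedly, forcing GC, and return the RSS growth in MiB."""
    start = rss_mib()
    for _ in range(repeats):
        invoke(payload)
        gc.collect()
    return rss_mib() - start


# Hypothetical usage with a `chain` built as prompt | ChatVertexAI:
# print("empty input :", rss_growth(chain.invoke, {}))
# print("with history:", rss_growth(chain.invoke, {"conversation_history": [("human", "hi")]}))
```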
-
Description

Using LangChain's ChatVertexAI with the invoke() and stream() methods causes memory growth on each call. The leak happens even after deleting the response and forcing garbage collection. This occurs when the input dictionary is empty ({}), particularly when the conversation_history parameter is empty or omitted.

Minimal Reproducible Example
To reproduce the issue, install the following packages:
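The package list and reproduction script from the original report are not included in this extract. The sketch below is a plausible reconstruction, not the reporter's exact code; the langchain-google-vertexai and memory-profiler packages, the Gemini model name, and the prompt layout are all assumptions:

```python
# pip install langchain-google-vertexai memory-profiler   (assumed package set)
import gc

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_google_vertexai import ChatVertexAI
from memory_profiler import profile

# Model name is an assumption; per the report, any ChatVertexAI call shows the pattern.
llm = ChatVertexAI(model_name="gemini-1.5-flash")

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder("conversation_history", optional=True),
    ("human", "Say hello."),
])
chain = prompt | llm


@profile
def run_invoke(repeats: int = 10) -> None:
    # Repeated invoke() calls with an empty input dict, mirroring the reported scenario.
    for _ in range(repeats):
        response = chain.invoke({})
        del response
        gc.collect()  # per the report, memory is not reclaimed even after this


@profile
def run_stream(repeats: int = 10) -> None:
    # stream() reportedly shows a smaller but still persistent increase.
    for _ in range(repeats):
        for _chunk in chain.stream({}):
            pass
        gc.collect()


if __name__ == "__main__":
    run_invoke()
    run_stream()
```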
Observed Output from memory_profiler
Notes

- Memory grows after every invoke() call and does not drop.
- The stream() method shows a smaller but persistent memory increase. (Note: if the stream method is run first, it takes up ~9 MiB.)
- Forcing garbage collection (gc.collect()) does not reclaim the memory (see the sketch after this list).
- The leak is most noticeable when the call is made with an empty or omitted conversation_history.
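One way to confirm that the retained memory really survives gc.collect() is a tracemalloc snapshot diff. A minimal sketch; the helper name and the commented usage are illustrative, not from the report:

```python
import gc
import tracemalloc
from typing import Any, Callable


def report_retained_allocations(fn: Callable[[], Any], repeats: int = 10) -> None:
    """Run fn repeatedly, force GC, and print the top allocations that survived."""
    tracemalloc.start()
    before = tracemalloc.take_snapshot()
    for _ in range(repeats):
        result = fn()
        del result
    gc.collect()
    after = tracemalloc.take_snapshot()
    for stat in after.compare_to(before, "lineno")[:10]:
        print(stat)
    tracemalloc.stop()


# Example, using the `chain` from the reproduction sketch above:
# report_retained_allocations(lambda: chain.invoke({}))
```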
Environment

Operating System: Linux (Ubuntu 22.04 LTS or equivalent)
Request

Could you please investigate this potential memory leak in ChatVertexAI's invoke() and stream()? It is blocking usage in long-running applications due to gradual memory bloat.

Thank you!