High memory consumption in file import with 400k records #3706
Unanswered
belchiorplan asked this question in Q&A
Replies: 0 comments
I'm trying to import a 130 MB file with 400k lines, but the process always runs out of memory, even with 12 GB of RAM and 3 GB of swap allocated to Docker. It always fails with the error:
The process has been signaled with signal "9".
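Signal 9 (SIGKILL) under Docker is typically the kernel OOM killer, which suggests the import loads the whole file into memory at once. A common workaround is to stream the file and process it in fixed-size batches so peak memory stays bounded regardless of file size. A minimal sketch in Python, assuming a CSV input; the file path and the `save_batch` callback name are illustrative, not part of the project in question:

```python
import csv


def import_in_batches(path, batch_size=1000):
    """Stream a large CSV and yield fixed-size batches of rows,
    so only one batch is held in memory at a time."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        batch = []
        for row in reader:
            batch.append(row)
            if len(batch) >= batch_size:
                yield batch
                batch = []
        if batch:  # flush the final partial batch
            yield batch


# Hypothetical usage: persist each batch (e.g. a bulk insert),
# then let it be garbage-collected before reading the next one.
# for batch in import_in_batches("records.csv"):
#     save_batch(batch)
```

With 400k rows this keeps memory roughly constant at one batch (here 1,000 rows) plus the database driver's buffers, instead of growing with the file.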