Memory depends on the data size? #859
jtyang-chem asked this question in Q&A (unanswered · 1 comment, 1 reply)
My system has 7 elements and 663 atoms in every frame. When I put more than 500 frames in my "./data", dp fails with an out-of-memory error:
```
2021-07-15 16:52:30.160138: W tensorflow/core/framework/cpu_allocator_impl.cc:80] Allocation of 21290256000 exceeds 10% of free system memory.
2021-07-15 16:52:37.312288: W tensorflow/core/framework/cpu_allocator_impl.cc:80] Allocation of 14462800000 exceeds 10% of free system memory.
2021-07-15 16:52:38.078152: W tensorflow/core/framework/cpu_allocator_impl.cc:80] Allocation of 14462800000 exceeds 10% of free system memory.
^C^C^C2021-07-15 16:53:42.886454: W tensorflow/core/framework/op_kernel.cc:1763] OP_REQUIRES failed at concat_op.cc:158 : Resource exhausted: OOM when allocating tensor with shape[9614000,50] and type double on /job:localhost/replica:0/task:0/device:CPU:0 by allocator cpu
Kill
```
With even more frames, the final OOM message does not appear at all; the process is simply killed. With 100 frames, dp runs fine. Reducing parameters such as batch_size and neuron did not make the memory use any smaller. Any hints?
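
For scale, the numbers in the log can be checked by hand: a double-precision tensor of shape [9614000, 50] needs roughly 3.8 GB by itself, and the largest warned allocation is over 21 GB, which already suggests buffers that grow with the number of frames rather than with batch_size. A quick sanity check in Python:

```python
# Sanity-check the allocations reported in the log above.
# A double is 8 bytes, so shape [9614000, 50] implies:
concat_bytes = 9_614_000 * 50 * 8
print(f"concat tensor: {concat_bytes / 1e9:.2f} GB")       # ~3.85 GB

# Largest allocation the CPU allocator warned about, in bytes:
print(f"largest warning: {21_290_256_000 / 1e9:.2f} GB")   # ~21.29 GB
```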
Reply:

I don't think it's surprising to see that the memory usage is related to the data size...
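
One common way to cope when the training set outgrows RAM is to split the one big system into several smaller ones and list them all under "systems" in the training input, so less data has to be resident per system. Below is a minimal sketch using dpdata; it assumes "./data" is already in deepmd/npy layout, and the "./data_split" output paths are just illustrative:

```python
import dpdata

# Load the existing labeled system (deepmd/npy layout, i.e. "./data").
full = dpdata.LabeledSystem("./data", fmt="deepmd/npy")

# Re-export in chunks of 100 frames; each chunk is an independent system
# that can be listed under "systems" in the training input script.
chunk = 100
for start in range(0, full.get_nframes(), chunk):
    sub = full[start : start + chunk]
    sub.to_deepmd_npy(f"./data_split/part_{start // chunk:03d}")
```

Alternatively, to_deepmd_npy(..., set_size=100) keeps a single system but spreads the frames across many small set.* directories. Whether either variant actually lowers the peak depends on how the installed DeePMD-kit version stages its training data, so treat this as a sketch rather than a guaranteed fix; the reliable fallbacks remain fewer frames per run or more RAM.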