Failed to allocate memory for requested buffer of size 234881024 #22162
andyzheng-snps asked this question in General (Unanswered · 2 comments)
-
Did you solve it? I am hitting the same issue.
-
No, my colleague said it is caused by our limited GPU memory.
-
When I switch from CPU to GPU when running onnxruntime, it reports the issue below:
```
2024-09-20 12:45:17.412405216 [E:onnxruntime:, inference_session.cc:2044 operator()] Exception during initialization: /home/conda/feedstock_root/build_artifacts/onnxruntime_1725367806287/work/onnxruntime/core/framework/bfc_arena.cc:376 void* onnxruntime::BFCArena::AllocateRawInternal(size_t, bool, onnxruntime::Stream*, bool, onnxruntime::WaitNotificationFn) Failed to allocate memory for requested buffer of size 234881024
Traceback (most recent call last):
  File "/remote/ailab1/zhaoqing/transformer_preprocessing/onnx_parser/onnxruntime_profile.py", line 110, in <module>
    ort_outs = model_profile_by_onnxruntime(model_path, batch_size=64, sequence_length=64)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/remote/ailab1/zhaoqing/transformer_preprocessing/onnx_parser/onnxruntime_profile.py", line 74, in model_profile_by_onnxruntime
    ort_session = onnxruntime.InferenceSession(model_path, sess_options, providers=onnxruntime.get_available_providers())
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/remote/ailab1/zhaoqing/miniforge3/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/remote/ailab1/zhaoqing/miniforge3/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 483, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /home/conda/feedstock_root/build_artifacts/onnxruntime_1725367806287/work/onnxruntime/core/framework/bfc_arena.cc:376 void* onnxruntime::BFCArena::AllocateRawInternal(size_t, bool, onnxruntime::Stream*, bool, onnxruntime::WaitNotificationFn) Failed to allocate memory for requested buffer of size 234881024
```
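For scale, a quick arithmetic check shows the buffer the arena failed to allocate is exactly 224 MiB:

```python
# Pure arithmetic: convert the buffer size from the error log to MiB.
requested_bytes = 234881024     # from the BFCArena error message
requested_mib = requested_bytes / 2**20
print(requested_mib)            # 224.0
```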
I also set the session options below, following a Copilot suggestion, but it reports the same issue:
```python
sess_options.add_session_config_entry("initial_memory_arena_size", "67108864")
sess_options.add_session_config_entry("gpu_mem_limit", "1073741824")
sess_options.add_session_config_entry("kOrtSessionOptionsUseDeviceAllocatorForInitializers", "1")
```
Could somebody help me check this issue?
Andy
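One thing worth noting: `gpu_mem_limit` and the arena settings are CUDAExecutionProvider *provider options*, not `add_session_config_entry` keys, so passing them as session config entries has no effect. A minimal sketch of the provider-option form, built as plain data (the `model.onnx` path at the end is a placeholder, and actually creating the session assumes onnxruntime-gpu with the CUDA EP installed):

```python
# Sketch: the memory knobs being tuned above belong in the
# CUDAExecutionProvider's provider options, not in session config entries.
cuda_provider_options = {
    "gpu_mem_limit": str(1024**3),                # cap the BFC arena at 1 GiB
    "arena_extend_strategy": "kSameAsRequested",  # grow only by what each request needs
}
providers = [
    ("CUDAExecutionProvider", cuda_provider_options),
    "CPUExecutionProvider",  # fall back to CPU if the CUDA EP cannot be created
]

# Usage (requires onnxruntime-gpu; "model.onnx" is a placeholder path):
# import onnxruntime
# sess = onnxruntime.InferenceSession("model.onnx", providers=providers)
```

Lowering `gpu_mem_limit` does not create memory; if the model genuinely needs more than the card has free, reducing the batch size (e.g. below the 64 used in the profiling call) is the more direct lever.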