Keep getting the warning: external/org_tensorflow/tensorflow/compiler/xla/service/gpu/ir_emitter_triton.cc:758] Shared memory size limit exceeded #14929
-
I got the following warning when running a jitted JAX function on an Nvidia V100 GPU.
The warning was only raised for the first several calls to the function, so I suspect it happens during compilation.
Replies: 4 comments 5 replies
-
I also got this warning; can anyone help?
-
Same here; I haven't seen this in JAX in the past two years or so. Are you on the latest version?
-
I have also been encountering this, and it has been breaking systems downstream because it registers as an error. Is there any fix for this or a way to suppress it? I tried:
import os
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "3"
but I still see it.
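One possible pitfall (an assumption on my part, not confirmed in this thread): TF_CPP_MIN_LOG_LEVEL is read by the native runtime when the library first loads, so setting it after `import jax` (or after the first jitted call) has no effect. A minimal sketch of setting it early enough:

```python
import os

# Set the C++ log level BEFORE any JAX/XLA import; the native runtime
# reads this variable once, at library load time. "3" means errors only.
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "3"

# import jax  # import JAX only after the variable is set

print(os.environ["TF_CPP_MIN_LOG_LEVEL"])
```

Whether this suppresses this particular message depends on how it is logged; if it is emitted at error severity, it may still appear.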
-
Try adding this at the top of the script:
import os
os.environ["XLA_FLAGS"] = "--xla_gpu_enable_triton_gemm=false"
Thanks @pschuh
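A slightly more defensive version of the workaround above (a sketch, not an official recipe): XLA_FLAGS is read when the backend initializes, so it must be set before the first `import jax`, and appending rather than overwriting preserves any flags you may already have set elsewhere.

```python
import os

# Disable XLA's Triton GEMM emitter, which is what triggers the
# "Shared memory size limit exceeded" message during compilation.
# Append to any existing XLA_FLAGS instead of clobbering them.
existing = os.environ.get("XLA_FLAGS", "")
os.environ["XLA_FLAGS"] = (existing + " --xla_gpu_enable_triton_gemm=false").strip()

# import jax  # import JAX only after XLA_FLAGS is in place

print(os.environ["XLA_FLAGS"])
```

Disabling the Triton GEMM path avoids the warning at the cost of falling back to XLA's other GEMM implementations, which may perform differently on some shapes.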