Hi, thank you for this amazing library. I have been a user since the first versions of autograd. Recently I noticed that summing a large array of numbers with JAX on a GPU device can give a different error each time the sum is executed over the same array. I observed the same behavior both on my own system and on Colab. With the CPU device the error is much smaller and is always the same. Strangely, the error with JAX on the CPU device is the same as the error from the same operation done with PyCUDA on the GPU.
True sum: 50002915.5178690553
The source code:
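The original snippet is not shown here, but a minimal stand-in that exhibits the same effect might look like the following (all names and sizes are illustrative, not the poster's actual code): sum a large float32 array several times and compare each result against a float64 reference.

```python
import numpy as np
import jax.numpy as jnp

# Hypothetical reproduction (not the original poster's code):
# build a large float32 array and a high-precision reference sum.
x = np.random.RandomState(0).uniform(size=1_000_000).astype(np.float32)
true_sum = float(np.sum(x.astype(np.float64)))

for _ in range(3):
    s = float(jnp.sum(jnp.asarray(x)))
    # On a GPU backend the error printed below may change from run to run,
    # because the reduction order is not fixed; on CPU it is small and
    # repeatable.
    print(s - true_sum)
```

Because float32 addition is not associative, any change in the order in which the GPU accumulates partial sums changes the rounding error, which is why repeated runs can disagree.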
Can you try with `XLA_FLAGS=--xla_gpu_deterministic_reductions`? By default, the GPU backend uses a non-deterministic reduction algorithm, but there is a deterministic version that is not yet enabled by default.
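One way to set the flag from Python rather than the shell is sketched below (assumption: XLA flags must be in the environment before JAX first initializes its backend, so this has to run before any `import jax` in the process):

```python
import os

# Sketch: enable XLA's deterministic GPU reductions via the environment,
# appending to any flags that may already be set. This must happen before
# jax is imported anywhere in the process.
os.environ["XLA_FLAGS"] = (
    os.environ.get("XLA_FLAGS", "") + " --xla_gpu_deterministic_reductions"
).strip()

print(os.environ["XLA_FLAGS"])
# After this point you would `import jax` and rerun the sums; with the flag
# active, repeated GPU reductions should give the same result every time.
```

Equivalently, launch the script with the variable set in the shell, e.g. `XLA_FLAGS=--xla_gpu_deterministic_reductions python script.py`.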