Replies: 2 comments
-
@denghuilu could you please explain?
-
Maybe `DP_ENABLE_MIXED_PREC=fp16` only has an effect on the neural-network part. However, when you set `DP_INTERFACE_PREC=low`, the custom ops use fp32, and when `DP_INTERFACE_PREC` is not set, the custom ops default to fp64, which can be much slower than fp32 and can affect performance more than mixed precision in the neural network does; that would explain why single precision without mixed precision outperforms double precision with mixed precision. Actually I'm very confused about why there are so many ways to set the precision of the different parts of the DP computation. As I asked in #1860, if the custom ops must use fp64, why do they have fp32 implementations? And if they need not use fp64, why does the SC20 paper test the accuracy of mixed precision only with fp64 custom ops?
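For concreteness, here is a sketch of the three precision configurations this comparison involves, using only the variables named in this thread; the semantics in the comments are the behavior assumed above, not verified against the documentation:

```bash
# Assumed behavior, per the explanation above:
#   DP_ENABLE_MIXED_PREC=fp16  -> mixed precision in the neural-network part
#   DP_INTERFACE_PREC=low      -> custom ops run in fp32
#   DP_INTERFACE_PREC unset    -> custom ops default to fp64

# (a) fp64 custom ops + mixed-precision network: the slow fp64 ops dominate
export DP_ENABLE_MIXED_PREC=fp16
unset DP_INTERFACE_PREC

# (b) fp32 custom ops + mixed-precision network
export DP_ENABLE_MIXED_PREC=fp16
export DP_INTERFACE_PREC=low

# (c) fp32 custom ops without mixed precision: fastest per the results below
unset DP_ENABLE_MIXED_PREC
export DP_INTERFACE_PREC=low
```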
-
Hi! I'm trying to run the two water examples `se_e2_a` and `se_e2_a_mixed_prec` and to test the effect of the mixed-precision settings on MD performance. I used a script like this to train the models (a sketch of the rough workflow is given after this post):

Then I ran the example input in the `lmp` folder, replacing the original input file with a larger supercell (24,000 atoms, so that hours/ns scales linearly with system size). I ran 1000 steps, but the ns/day seems to be the same as when running only 1 step. For single-precision models I exported `LAMMPS_PLUGIN_PATH="/path/to/deep-md/lib/deepmd_lmp_low"`, and for double precision it's `deepmd_lmp`. I'm getting the following results on a single RTX 2080:

So it seems that mixed precision only benefits models that were originally double precision, and that it has no effect on performance if `DP_INTERFACE_PREC=low` is set. It also seems that single precision outperforms mixed precision in this case. Does that mean we should not use mixed precision, at least on consumer-level GPUs (since their fp64 performance is weak and fp32 without mixed_prec is faster)?
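For reference, here is a minimal sketch of the kind of workflow described in this post, assuming the standard `dp train` / `dp freeze` CLI and the plugin paths named above. The poster's actual training script and results are not shown in this excerpt, so file names and flags are illustrative:

```bash
# Train and freeze both water examples (paths and flags are illustrative).
cd examples/water/se_e2_a
dp train input.json                               # plain double-precision model
dp freeze -o graph.pb

cd ../se_e2_a_mixed_prec
DP_ENABLE_MIXED_PREC=fp16 dp train input.json     # mixed-precision variant, per this thread
dp freeze -o graph_mixed.pb

# Benchmark with the single-precision LAMMPS plugin ...
export LAMMPS_PLUGIN_PATH="/path/to/deep-md/lib/deepmd_lmp_low"
lmp -in in.lammps

# ... and again with the double-precision plugin:
export LAMMPS_PLUGIN_PATH="/path/to/deep-md/lib/deepmd_lmp"
lmp -in in.lammps
```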