Commit 719e8d3

add note about memory consumption on tesla CI runner for failing test
1 parent 376adf9 commit 719e8d3

File tree: 1 file changed (+2, -0 lines)


tests/models/test_modeling_common.py

Lines changed: 2 additions & 0 deletions
@@ -1422,6 +1422,8 @@ def get_memory_usage(storage_dtype, compute_dtype):
         )

         self.assertTrue(fp8_e4m3_bf16_memory_footprint < fp8_e4m3_fp32_memory_footprint < fp32_memory_footprint)
+        # NOTE: the following assertion will fail on our CI (running Tesla T4) due to bf16 using more memory than fp32.
+        # On other devices, such as DGX (Ampere) and Audace (Ada), the test passes.
         self.assertTrue(fp8_e4m3_bf16_max_memory < fp8_e4m3_fp32_max_memory)
         # On this dummy test case with a small model, sometimes fp8_e4m3_fp32 max memory usage is higher than fp32 by a few
         # bytes. This only happens for some models, so we allow a small tolerance.
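The assertions in this diff compare memory footprints across storage/compute dtype pairs. A minimal, self-contained sketch of why that ordering is expected (not the actual test, which measures a real model with torch): element byte widths are 1 for fp8_e4m3, 2 for bf16, and 4 for fp32, so a model stored in fp8 with a bf16 upcast buffer should sit below the same model with an fp32 buffer, which sits below plain fp32. The parameter count, the `memory_footprint` helper, and the assumption that the upcast working copy covers roughly 10% of the parameters are all hypothetical here.

```python
# Bytes per element for each dtype (fixed by the formats themselves).
BYTES = {"fp8_e4m3": 1, "bf16": 2, "fp32": 4}

def memory_footprint(num_params, storage_dtype, compute_dtype):
    """Hypothetical model footprint: all weights kept in storage_dtype,
    plus one upcast working copy of the largest layer in compute_dtype.
    The 10% largest-layer fraction is an illustrative assumption."""
    largest_layer = num_params // 10
    return num_params * BYTES[storage_dtype] + largest_layer * BYTES[compute_dtype]

fp32_memory_footprint = memory_footprint(1_000_000, "fp32", "fp32")
fp8_e4m3_fp32_memory_footprint = memory_footprint(1_000_000, "fp8_e4m3", "fp32")
fp8_e4m3_bf16_memory_footprint = memory_footprint(1_000_000, "fp8_e4m3", "bf16")

# Mirrors the ordering asserted in the diff above.
assert fp8_e4m3_bf16_memory_footprint < fp8_e4m3_fp32_memory_footprint < fp32_memory_footprint
```

Note that this sketch only covers the *footprint* assertion; the *max memory* assertion that the commit flags as failing on the Tesla T4 depends on runtime allocator behavior, which a byte-width calculation like this cannot capture.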
