Commit c537ea3

[Tests] Add test for example in "big_models[…]" example folder (#1738)

SUMMARY: Add a test for the example script in the "examples/big_models_with_sequential_onloading" folder.

TEST PLAN: Test run with this test/branch: https://github.com/neuralmagic/llm-compressor-testing/actions/runs/16968671613

Signed-off-by: Domenic Barbuzzi <[email protected]>

1 parent 6415131 commit c537ea3

File tree

1 file changed: 31 additions, 0 deletions
@@ -0,0 +1,31 @@

from pathlib import Path

import pytest

from tests.examples.utils import (
    copy_and_run_script,
    gen_cmd_fail_message,
    requires_gpu_count,
)


@pytest.fixture
def example_dir() -> str:
    return "examples/big_models_with_sequential_onloading"


@pytest.mark.example
@requires_gpu_count(1)
class TestCompressedInference:
    """
    Tests for examples in the "big_models_with_sequential_onloading" example folder.
    """

    def test_llama33_70b_example_script(self, example_dir: str, tmp_path: Path):
        """
        Test for the "llama3.3_70b.py" script in the folder.
        """
        script_filename = "llama3.3_70b.py"
        command, result = copy_and_run_script(tmp_path, example_dir, script_filename)

        assert result.returncode == 0, gen_cmd_fail_message(command, result)

0 commit comments
