Commit aff0d9e

Add transformers version limit for ut case (#2219)

Signed-off-by: changwa1 <[email protected]>

1 parent fecbba3 · commit aff0d9e

File tree: 1 file changed, +1 −0 lines changed

test/3x/torch/quantization/weight_only/test_transformers.py

Lines changed: 1 addition & 0 deletions

@@ -212,6 +212,7 @@ def test_use_layer_wise(self):
         woq_output_download = woq_model(dummy_input)[0]
         assert torch.equal(woq_output_download, woq_output)

+    @pytest.mark.skipif(Version(transformers.__version__) > Version("4.52.0"), reason="modeling_opt.py changed.")
     def test_loading_autoawq_model(self):
         user_model = AutoModelForCausalLM.from_pretrained(self.autoawq_model)
         tokenizer = AutoTokenizer.from_pretrained(self.autoawq_model)
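The added decorator skips the test whenever the installed transformers release is newer than 4.52.0, using a proper version comparison rather than string comparison. A minimal stdlib-only sketch of that check is below; it assumes plain "X.Y.Z" release strings (the real `packaging.version.Version` used in the commit also handles pre-release tags and other PEP 440 forms).

```python
# Sketch of the version-gated skip condition from the commit, using only
# the stdlib.  Assumption: versions are plain "X.Y.Z" release strings.

def version_tuple(v: str) -> tuple:
    """Parse 'X.Y.Z' into a tuple of ints so comparison is numeric."""
    return tuple(int(part) for part in v.split("."))

def should_skip(installed: str, cutoff: str = "4.52.0") -> bool:
    """True when the installed version is newer than the cutoff."""
    return version_tuple(installed) > version_tuple(cutoff)

print(should_skip("4.52.0"))  # the cutoff itself does not trigger a skip -> False
print(should_skip("4.53.1"))  # newer release -> True
print(should_skip("4.9.0"))   # older, even though "4.9" > "4.52" as strings -> False
```

The last case is why numeric parsing matters: naive string comparison would rank "4.9.0" above "4.52.0" and skip the test on an old transformers install.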
