
Commit 4486212 (parent: f11d7f0)

Test: omit draft_vocab_size to cover EAGLE3 fallback

The EAGLE3 dummy configs in test_eagle3 set draft_vocab_size == vocab_size; omit the field so the new defaulting behavior is exercised.

Signed-off-by: Venky Ganesh <23023424+venkywonka@users.noreply.github.com>

File tree

1 file changed: 0 additions, 2 deletions


tests/unittest/_torch/speculative/test_eagle3.py

Lines changed: 0 additions & 2 deletions
@@ -331,7 +331,6 @@ def test_deepseek_eagle3():
         'transformers_version': '4.52.4',
         'use_cache': True,
         'vocab_size': 129280,
-        'draft_vocab_size': 129280
     }
     with tempfile.TemporaryDirectory() as temp_dir:
         eagle_model_dir = Path(temp_dir)
@@ -433,7 +432,6 @@ def test_multi_eagle3(use_one_model: bool):
         'transformers_version': '4.52.4',
         'use_cache': True,
         'vocab_size': 128256,
-        'draft_vocab_size': 128256
     }
     with tempfile.TemporaryDirectory() as temp_dir:
         eagle_model_dir = Path(temp_dir)
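The fallback this diff exercises can be sketched as follows. This is a minimal illustration, not the actual TensorRT-LLM implementation: `load_eagle3_config` is a hypothetical helper name, and the assumed behavior is that a config omitting `draft_vocab_size` defaults it to `vocab_size`.

```python
def load_eagle3_config(raw: dict) -> dict:
    """Hypothetical sketch of the defaulting behavior the test covers."""
    cfg = dict(raw)
    # Assumed fallback: when the draft model config omits
    # 'draft_vocab_size', default it to the target 'vocab_size'.
    cfg.setdefault("draft_vocab_size", cfg["vocab_size"])
    return cfg

# A dummy config like the one in test_eagle3, with the field omitted:
cfg = load_eagle3_config({"vocab_size": 129280, "use_cache": True})
print(cfg["draft_vocab_size"])  # 129280
```

By omitting the field in the dummy configs rather than setting it equal to `vocab_size`, the test goes through the defaulting path instead of trivially reading back an explicit value.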
