Conversation

@h-guo18 (Contributor) commented Dec 2, 2025

What does this PR do?

Type of change: documentation

Overview:

  • Updated docs;
  • Moved config overwriting and d2t loading from main.py into modify() to simplify the use of mtsp.convert() (see the sketch after this list);
  • Updated README.md accordingly.
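To make the simplification concrete, here is a minimal sketch of the intended call path, pieced together from the PR description and the ModelOpt speculative-decoding examples rather than taken from this change; the "eagle" mode name and "eagle_architecture_config" key in particular are assumptions for illustration.

import modelopt.torch.speculative as mtsp
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.2-1B-Instruct")

# Before this refactor, main.py had to overwrite draft-config fields
# (hidden_size, vocab_size, ...) from the base model and load the d2t vocab
# mapping before calling convert(). With config overwriting and d2t loading
# moved into modify(), a single convert() call is enough:
mtsp.convert(
    base_model,
    [("eagle", {"eagle_architecture_config": {"num_hidden_layers": 1}})],
)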

Usage

No change

Testing

  • Verified that eagle configs are identical before and after the refactor (a minimal comparison sketch follows the two dumps):

Before refactor:

eagle config: PretrainedConfig {
  "attention_bias": false,
  "attention_dropout": 0.0,
  "draft_vocab_size": 128256,
  "dtype": "bfloat16",
  "eagle_aux_hidden_state_layer_ids": [
    1,
    12,
    7
  ],
  "has_lm_head": false,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 2048,
  "initializer_range": 0.02,
  "intermediate_size": 14336,
  "max_position_embeddings": 131072,
  "mlp_bias": false,
  "num_attention_heads": 32,
  "num_hidden_layers": 1,
  "num_key_value_heads": 8,
  "parallel_draft_heads_num_layers": 1,
  "parallel_draft_step": 1,
  "position_embedding_type": "rope",
  "rms_norm_eps": 1e-05,
  "rope_scaling": {
    "factor": 32.0,
    "high_freq_factor": 4.0,
    "low_freq_factor": 1.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3"
  },
  "rope_theta": 500000.0,
  "transformers_version": "4.57.1",
  "use_aux_hidden_state": true,
  "use_input_layernorm_in_first_layer": true,
  "use_last_layernorm": true,
  "use_mtp_layernorm": false,
  "vocab_size": 128256
}

After refactor:

eagle config: PretrainedConfig {
  "attention_bias": false,
  "attention_dropout": 0.0,
  "draft_vocab_size": 128256,
  "dtype": "bfloat16",
  "eagle_aux_hidden_state_layer_ids": [
    1,
    12,
    7
  ],
  "has_lm_head": false,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 2048,
  "initializer_range": 0.02,
  "intermediate_size": 14336,
  "max_position_embeddings": 131072,
  "mlp_bias": false,
  "num_attention_heads": 32,
  "num_hidden_layers": 1,
  "num_key_value_heads": 8,
  "parallel_draft_heads_num_layers": 1,
  "parallel_draft_step": 1,
  "position_embedding_type": "rope",
  "rms_norm_eps": 1e-05,
  "rope_scaling": {
    "factor": 32.0,
    "high_freq_factor": 4.0,
    "low_freq_factor": 1.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3"
  },
  "rope_theta": 500000.0,
  "transformers_version": "4.57.1",
  "use_aux_hidden_state": true,
  "use_input_layernorm_in_first_layer": true,
  "use_last_layernorm": true,
  "use_mtp_layernorm": false,
  "vocab_size": 128256
}
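For reference, the parity check above can be expressed as a short helper like the following sketch, assuming both sides are transformers PretrainedConfig objects (the function name and structure are illustrative, not the actual test code):

from transformers import PretrainedConfig

def assert_configs_identical(before: PretrainedConfig, after: PretrainedConfig) -> None:
    # Serialize both configs and collect every key whose value differs.
    d_before, d_after = before.to_dict(), after.to_dict()
    mismatched = {
        key
        for key in d_before.keys() | d_after.keys()
        if d_before.get(key) != d_after.get(key)
    }
    assert not mismatched, f"configs differ on: {sorted(mismatched)}"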

Before your PR is "Ready for review"

  • Make sure you read and follow Contributor guidelines and your commits are signed.
  • Is this change backward compatible?: Yes/No
  • Did you write any new necessary tests?: Yes/No
  • Did you add or update any necessary documentation?: Yes/No
  • Did you update Changelog?: Yes/No

Additional Information

@copy-pr-bot (bot) commented Dec 2, 2025

Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.

Contributors can view more details about this message here.

@codecov (bot) commented Dec 2, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 74.64%. Comparing base (d0b0c0f) to head (69115e7).

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #624   +/-   ##
=======================================
  Coverage   74.64%   74.64%           
=======================================
  Files         183      183           
  Lines       18542    18542           
=======================================
  Hits        13840    13840           
  Misses       4702     4702           

Signed-off-by: h-guo18 <[email protected]>
h-guo18 self-assigned this Dec 2, 2025
h-guo18 marked this pull request as ready for review December 2, 2025 01:44
h-guo18 requested a review from a team as a code owner December 2, 2025 01:44