Nemotron-3-Nano Model Support #1914
Merged
Commits (21 total; changes shown from 17 commits):

- 2109f36: rebase and signoff (liding-nv)
- ed45c7d: fix tests (liding-nv)
- ef9a17b: update model provider (liding-nv)
- baa2a8f: remove unused config (liding-nv)
- 59452c0: update entry scripts (liding-nv)
- b8f986b: Merge branch 'main' into liding/nano-v3-pr (liding-nv)
- dc176d2: fix tests (liding-nv)
- 7c70aa2: clean metadata for ckpt serialization (liding-nv)
- 555ca02: force to use cache folder when testing nano v3 conversion (liding-nv)
- 3bf9b02: keep e_score_correction_bias in fp32 and disable deepep in functional…
- 6017cf6: Set HF_MODULES_CACHE to temp path in nano-v3 test (chtruong814)
- f5deffd: Fix nano-v3 finetune test (chtruong814)
- d6b5d56: lint check (liding-nv)
- f9b45fc: Merge branch 'main' into liding/nano-v3-pr (liding-nv)
- d2a4b32: fix test scope mismatch (liding-nv)
- 79374f7: address comments (liding-nv)
- af98abd: Merge branch 'main' into liding/nano-v3-pr (liding-nv)
- dda57c7: fix unit tests (liding-nv)
- 10b7f9b: minor fixes (liding-nv)
- 0c5ef6c: minor updates (liding-nv)
- 07824cc: Merge branch 'main' into liding/nano-v3-pr (ko3n1g)
examples/recipes/nemotron_3/finetune_nemotron_3_nano.py (new file: 106 additions, 0 deletions)
```python
#!/usr/bin/env python3
# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import argparse
import logging
import os
import sys
from typing import Tuple

import torch
from omegaconf import OmegaConf

from megatron.bridge.recipes.nemotronh.nemotron_3_nano import (
    nemotron_3_nano_finetune_config as finetune_config,
)
from megatron.bridge.training.config import ConfigContainer
from megatron.bridge.training.finetune import finetune
from megatron.bridge.training.gpt_step import forward_step
from megatron.bridge.training.utils.omegaconf_utils import (
    apply_overrides,
    create_omegaconf_dict_config,
    parse_hydra_overrides,
)


logger: logging.Logger = logging.getLogger(__name__)


def parse_cli_args() -> Tuple[argparse.Namespace, list[str]]:
    """Parse command line arguments, separating known script args from OmegaConf overrides."""
    parser = argparse.ArgumentParser(
        description="Finetune Nemotron 3 Nano model using Megatron-Bridge with YAML and CLI overrides",
        formatter_class=argparse.RawTextHelpFormatter,
    )
    parser.add_argument(
        "--config-file",
        type=str,
        help="Path to the YAML OmegaConf override file.",
    )
    parser.add_argument("--peft", type=str, help="Type of PEFT to use")
    parser.add_argument("--packed-sequence", action="store_true", help="Whether to use sequence packing")
    parser.add_argument("--seq-length", type=int, default=2048, help="Sequence length")

    # Parse known args for the script; remaining args are treated as overrides
    args, cli_dotlist_overrides = parser.parse_known_args()
    return args, cli_dotlist_overrides


def main() -> None:
    """
    Entry point for the Nemotron 3 Nano finetuning script.
    """
    args, cli_overrides = parse_cli_args()

    cfg: ConfigContainer = finetune_config(
        seq_length=args.seq_length, peft=args.peft, packed_sequence=args.packed_sequence
    )
    cfg.model.seq_length = args.seq_length

    # Convert the initial Python dataclass to an OmegaConf DictConfig for merging
    merged_omega_conf, excluded_fields = create_omegaconf_dict_config(cfg)

    # Load and merge YAML overrides if a config file is provided
    if args.config_file:
        logger.debug(f"Loading YAML overrides from: {args.config_file}")
        if not os.path.exists(args.config_file):
            logger.error(f"Override YAML file not found: {args.config_file}")
            sys.exit(1)
        yaml_overrides_omega = OmegaConf.load(args.config_file)
        merged_omega_conf = OmegaConf.merge(merged_omega_conf, yaml_overrides_omega)
        logger.debug("YAML overrides merged successfully.")

    # Apply command-line overrides using Hydra-style parsing
    if cli_overrides:
        logger.debug(f"Applying Hydra-style command-line overrides: {cli_overrides}")
        merged_omega_conf = parse_hydra_overrides(merged_omega_conf, cli_overrides)
        logger.debug("Hydra-style command-line overrides applied successfully.")

    # Apply the final merged OmegaConf configuration back to the original ConfigContainer
    logger.debug("Applying final merged configuration back to Python ConfigContainer...")
    final_overrides_as_dict = OmegaConf.to_container(merged_omega_conf, resolve=True)
    # Apply overrides while preserving excluded fields
    apply_overrides(cfg, final_overrides_as_dict, excluded_fields)

    # Start training
    logger.debug("Starting finetuning...")
    finetune(config=cfg, forward_step_func=forward_step)

    if torch.distributed.is_initialized():
        torch.distributed.destroy_process_group()


if __name__ == "__main__":
    main()
```
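The merge order implemented above (recipe defaults, then the optional YAML file, then leftover CLI arguments) is worth seeing in isolation. Below is a minimal sketch of that precedence using plain OmegaConf primitives; the field `model.seq_length` comes from the script above, while `train.lr` and all override values are made-up placeholders, and the real script routes CLI tokens through `parse_hydra_overrides` rather than `OmegaConf.from_dotlist`.

```python
# Minimal sketch of the override precedence used by the entry scripts:
# recipe defaults < YAML file < CLI dotlist (later merges win).
from omegaconf import OmegaConf

# Stand-in for the recipe's ConfigContainer converted to a DictConfig
# (the scripts do this with create_omegaconf_dict_config).
base = OmegaConf.create({"model": {"seq_length": 2048}, "train": {"lr": 0.0003}})

# Stand-in for a --config-file payload (the scripts use OmegaConf.load(path)).
yaml_overrides = OmegaConf.create({"model": {"seq_length": 4096}})

# Stand-in for leftover CLI tokens such as "train.lr=0.0001"; the scripts
# parse these with parse_hydra_overrides instead.
cli_overrides = OmegaConf.from_dotlist(["train.lr=0.0001"])

merged = OmegaConf.merge(base, yaml_overrides, cli_overrides)
print(OmegaConf.to_container(merged, resolve=True))
# {'model': {'seq_length': 4096}, 'train': {'lr': 0.0001}}
```

In the same spirit, an invocation might look like `python finetune_nemotron_3_nano.py --config-file overrides.yaml --seq-length 4096 train.lr=0.0001` (file name and override key illustrative), with everything argparse does not recognize flowing into the Hydra-style override path.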
examples/recipes/nemotron_3/pretrain_nemotron_3_nano.py (new file: 105 additions, 0 deletions)
```python
#!/usr/bin/env python3
# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import argparse
import logging
import os
import sys
from typing import Tuple

import torch
from omegaconf import OmegaConf

from megatron.bridge.recipes.nemotronh.nemotron_3_nano import (
    nemotron_3_nano_pretrain_config as pretrain_config,
)
from megatron.bridge.training.config import ConfigContainer
from megatron.bridge.training.gpt_step import forward_step
from megatron.bridge.training.pretrain import pretrain
from megatron.bridge.training.utils.omegaconf_utils import (
    apply_overrides,
    create_omegaconf_dict_config,
    parse_hydra_overrides,
)


logger: logging.Logger = logging.getLogger(__name__)


def parse_cli_args() -> Tuple[argparse.Namespace, list[str]]:
    """Parse command line arguments, separating known script args from OmegaConf overrides."""
    parser = argparse.ArgumentParser(
        description="Pretrain Nemotron 3 Nano model using Megatron-Bridge with YAML and CLI overrides",
        formatter_class=argparse.RawTextHelpFormatter,
    )
    parser.add_argument(
        "--config-file",
        type=str,
        help="Path to the YAML OmegaConf override file.",
    )
    parser.add_argument("--per-split-data-args-path", type=str, help="Path to the per split data args file.")
    parser.add_argument("--tokenizer-model", type=str, help="Path to the tokenizer model file.")

    # Parse known args for the script; remaining args are treated as overrides
    args, cli_dotlist_overrides = parser.parse_known_args()
    return args, cli_dotlist_overrides


def main() -> None:
    """
    Entry point for the Nemotron 3 Nano pretraining script.
    """
    args, cli_overrides = parse_cli_args()

    cfg: ConfigContainer = pretrain_config(
        per_split_data_args_path=args.per_split_data_args_path,
        tokenizer_model=args.tokenizer_model,
    )

    # Convert the initial Python dataclass to an OmegaConf DictConfig for merging
    merged_omega_conf, excluded_fields = create_omegaconf_dict_config(cfg)

    # Load and merge YAML overrides if a config file is provided
    if args.config_file:
        logger.debug(f"Loading YAML overrides from: {args.config_file}")
        if not os.path.exists(args.config_file):
            logger.error(f"Override YAML file not found: {args.config_file}")
            sys.exit(1)
        yaml_overrides_omega = OmegaConf.load(args.config_file)
        merged_omega_conf = OmegaConf.merge(merged_omega_conf, yaml_overrides_omega)
        logger.debug("YAML overrides merged successfully.")

    # Apply command-line overrides using Hydra-style parsing
    if cli_overrides:
        logger.debug(f"Applying Hydra-style command-line overrides: {cli_overrides}")
        merged_omega_conf = parse_hydra_overrides(merged_omega_conf, cli_overrides)
        logger.debug("Hydra-style command-line overrides applied successfully.")

    # Apply the final merged OmegaConf configuration back to the original ConfigContainer
    logger.debug("Applying final merged configuration back to Python ConfigContainer...")
    final_overrides_as_dict = OmegaConf.to_container(merged_omega_conf, resolve=True)
    # Apply overrides while preserving excluded fields
    apply_overrides(cfg, final_overrides_as_dict, excluded_fields)

    # Start training
    logger.debug("Starting pretraining...")
    pretrain(config=cfg, forward_step_func=forward_step)

    if torch.distributed.is_initialized():
        torch.distributed.destroy_process_group()


if __name__ == "__main__":
    main()
```
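Both entry scripts depend on the same trick to accept recipe flags and free-form overrides from a single command line: `argparse.ArgumentParser.parse_known_args()` consumes the flags the script declares and returns everything else untouched, and that remainder becomes the Hydra-style override list. A standalone sketch follows; the two flags mirror the pretrain script, while the paths and the `model.seq_length`/`train.train_iters` override keys are illustrative placeholders.

```python
# Sketch: parse_known_args() splits declared flags from free-form overrides.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--tokenizer-model", type=str)
parser.add_argument("--per-split-data-args-path", type=str)

argv = [
    "--tokenizer-model", "/path/to/tokenizer.model",  # declared flag: consumed
    "model.seq_length=4096",    # unrecognized token: passed through
    "train.train_iters=100",    # illustrative key: also passed through
]
args, leftovers = parser.parse_known_args(argv)

print(args.tokenizer_model)  # /path/to/tokenizer.model
print(leftovers)             # ['model.seq_length=4096', 'train.train_iters=100']
```

The `leftovers` list is exactly what the scripts then hand to `parse_hydra_overrides`, so no dedicated override flag is needed.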