
Add ag group expert parallelism #2910

Closed
jeffnvidia wants to merge 5 commits into NVIDIA:main from jeffnvidia:add_ag_group_expert_parallelism

Conversation

@jeffnvidia (Contributor) commented Jan 12, 2026

What does this PR do?

This PR extends the all-gather/reduce-scatter overlap optimization to support Expert Parallelism (EP), enabling communication overlap for MoE (Mixture-of-Experts) models. It creates a separate process group for expert all-gather operations to avoid head-of-line blocking with gradient reduce-scatter, following the same pattern as the base implementation for regular data parallelism.

Background

This PR builds on #2663 (add all-gather process-group for overlapping), which implemented AG/RS overlap for regular data parallelism. That PR explicitly excluded expert parameters from the optimization (via an `if not group.is_expert_param` check). This PR removes that limitation and adds full support for expert parallelism.

Changes

Core Changes:

  • Add _EXPERT_DATA_PARALLEL_GROUP_AG process group in parallel_state.py
  • Create expert AG group during initialize_model_parallel() when --create-all-gather-group is enabled
  • Update get_expert_data_parallel_group() with independent_all_gather parameter
  • Add has_separate_expert_all_gather_group() helper function
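
The selection logic these core changes describe can be sketched as follows. This is a minimal, framework-free sketch: the function and variable names mirror those listed above, but the signatures and internals are assumptions on my part, and plain strings stand in for the `torch.distributed` process groups the real `parallel_state.py` manages.

```python
# Sketch of the expert all-gather group registry described above.
# Strings stand in for torch.distributed process groups so the
# selection logic is easy to follow and test in isolation.

_EXPERT_DATA_PARALLEL_GROUP = None
_EXPERT_DATA_PARALLEL_GROUP_AG = None  # separate group for expert all-gathers


def initialize_model_parallel(create_all_gather_group=False):
    """Create the expert DP group, plus a dedicated AG group if requested."""
    global _EXPERT_DATA_PARALLEL_GROUP, _EXPERT_DATA_PARALLEL_GROUP_AG
    _EXPERT_DATA_PARALLEL_GROUP = "expert_dp_group"
    # A second group over the same ranks, so expert all-gathers do not
    # queue behind gradient reduce-scatters on the default group.
    _EXPERT_DATA_PARALLEL_GROUP_AG = (
        "expert_dp_group_ag" if create_all_gather_group else None
    )


def has_separate_expert_all_gather_group():
    return _EXPERT_DATA_PARALLEL_GROUP_AG is not None


def get_expert_data_parallel_group(independent_all_gather=False):
    """Return the AG-specific group when requested and available,
    otherwise fall back to the default expert DP group."""
    if independent_all_gather and has_separate_expert_all_gather_group():
        return _EXPERT_DATA_PARALLEL_GROUP_AG
    return _EXPERT_DATA_PARALLEL_GROUP
```

The key property is the fallback: callers can always pass `independent_all_gather=True`, and when `--create-all-gather-group` is off they transparently get the default expert DP group.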

FSDP Integration:

  • Extend FSDPDistributedIndex to store expt_fsdp_group_ag
  • Update get_fsdp_group() to handle all 4 combinations: (is_expert_parallel × independent_all_gather)
  • Update buffer initialization logic to use expert AG group for expert parameters
  • Add UBR (User Buffer Registration) for expert AG group

Testing:

  • Add test_separate_expert_all_gather_group() unit test

Contribution process

Pre-checks

  • I want this PR in a versioned release and have added the appropriate Milestone (e.g., Core 0.8)
  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code (see Typing guidelines)
  • I have added relevant documentation
  • I have run the autoformatter.sh on my PR

Code review

The following process is enforced via the CODEOWNERS file for changes into megatron/core. For changes outside of megatron/core, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch

Feel free to message or tag @mcore-oncall to help accelerate your merge into main. The less complex your PR is, the faster it will be approved and merged!

(Step 1): Add PR label Expert Review

(Step 2): Collect the expert reviewers reviews

  1. Attach the Expert Review label when your PR is ready for review.
  2. GitHub auto-assigns expert reviewers based on your changes. They will get notified and pick up your PR soon.

⚠️ Only proceed to the next step once all reviewers have approved, merge conflicts are resolved, and the CI is passing.
Final Review might get declined if these requirements are not fulfilled.

(Step 3): Final Review

  1. Add Final Review label
  2. GitHub auto-assigns final reviewers based on your changes. They will get notified and pick up your PR soon.

(Optional Step 4): Cherry-pick into release branch

If this PR also needs to be merged into core_r* release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch

The proposed review process for the `dev` branch is under active discussion.

MRs are mergeable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

Merging your PR

Any member of core-adlr and core-nemo will be able to merge your PR.

@jeffnvidia jeffnvidia requested review from a team as code owners January 12, 2026 16:02

copy-pr-bot bot commented Jan 12, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@youngeunkwon0405 (Member)

Have you tested this PR working fine with EP > 1 case? Did it show expected behavior?

@jeffnvidia (Contributor, Author) commented Jan 13, 2026

> Have you tested this PR working fine with EP > 1 case? Did it show expected behavior?

Hi, I just tested it with EP=4 on the GAIA cluster on 2 nodes and it does show the expected behaviour:
EP=1: Expert parameters correctly use all 16 GPUs (no sharding)
EP=4: Expert parameters correctly use the expert-specific DP group (size 4)

@jeffnvidia (Contributor, Author)

Could you review this PR, @youngeunkwon0405 @shjwudp? Thanks a lot!

@shjwudp shjwudp added module: megatron-fsdp Expert Review [deprecated] Apply this label to indicate that your PR is ready for expert review. labels Jan 14, 2026
@youngeunkwon0405 (Member)

/ok to test 33ede31

@youngeunkwon0405 (Member)

Seems to have a lint error.

@jeffnvidia jeffnvidia force-pushed the add_ag_group_expert_parallelism branch from 33ede31 to ce44736 Compare January 15, 2026 12:49
@jeffnvidia (Contributor, Author)

> Seems to have a lint error.

I fixed the linting, could you retrigger the CI? It should be fine now. Thanks!

@youngeunkwon0405 (Member)

/ok to test ce44736

@jeffnvidia jeffnvidia force-pushed the add_ag_group_expert_parallelism branch from ce44736 to c5b9670 Compare January 18, 2026 12:24
@jeffnvidia (Contributor, Author)

> /ok to test ce44736

Could we also push this PR to final review, @youngeunkwon0405?
Thanks

@shjwudp (Contributor) commented Jan 19, 2026

Hi @jeffnvidia, this PR depends on PR #2663, so we'll need to wait for that one to be merged first. Thanks for your patience.

@BoxiangW BoxiangW added Final Review PR is in the "final review" stage and removed Expert Review [deprecated] Apply this label to indicate that your PR is ready for expert review. labels Jan 29, 2026
@jaredcasper (Contributor)

Please resolve conflicts now that 2663 has gone in.

@ericharper ericharper added Expert Review [deprecated] Apply this label to indicate that your PR is ready for expert review. and removed Final Review PR is in the "final review" stage labels Feb 4, 2026
@jeffnvidia (Contributor, Author)

> Please resolve conflicts now that 2663 has gone in.

Hey, actually I added everything in the new PR that uses the process group collection, so I think I can close this current PR: #3249

@shjwudp shjwudp closed this Feb 6, 2026