
Move Megatron-FSDP MixedPrecisionPolicy arguments from FSDP adapter to DDPConfig #3903

Open

cspades wants to merge 3 commits into NVIDIA:main from cspades:cye/mfsdp_mixprec_to_ddp

Conversation


@cspades cspades commented Mar 17, 2026

…o DDPConfig.

What does this PR do ?

  • Megatron-Bridge prefers all DDP-related arguments (including FSDP) to be included in the DDPConfig. Our adapter allows us to satisfy this requirement.

Bug-Fixes

  • Many Triton imports were not guarded and were breaking the CI.
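The guarded-import pattern referenced above typically looks like the following sketch; the module structure and helper name are illustrative, not the exact files touched by this PR:

```python
# Sketch of guarding an optional triton import so the module stays importable
# in CI environments where triton is not installed. Names are illustrative.
try:
    import triton  # noqa: F401

    HAVE_TRITON = True
except ImportError:
    HAVE_TRITON = False


def run_fused_kernel(*args, **kwargs):
    """Hypothetical entry point: require triton at call time, not import time."""
    if not HAVE_TRITON:
        raise RuntimeError("triton is required for this code path but is not installed.")
    ...
```

Deferring the failure to call time means importing the module (e.g. during test collection) no longer crashes CI machines without Triton.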

Testing

  • Made sure arguments are passed to the DDPConfig correctly (i.e. `hasattr(args, f.name) == True`); unit tests verify the same behavior.
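The `hasattr` check described above can be sketched as a small unit test. The dataclass names and fields below are simplified stand-ins for the real adapter arguments and DDPConfig (which use torch.dtype fields), not the actual classes in this PR:

```python
import dataclasses
from dataclasses import dataclass
from typing import Optional


# Hypothetical stand-in for the mixed-precision arguments moved by this PR.
@dataclass
class MegatronFSDPPrecisionArgs:
    megatron_fsdp_param_dtype: Optional[str] = None
    megatron_fsdp_main_params_dtype: Optional[str] = "float32"


# Hypothetical stand-in for the destination config.
@dataclass
class DDPConfig:
    megatron_fsdp_param_dtype: Optional[str] = None
    megatron_fsdp_main_params_dtype: Optional[str] = "float32"


def test_precision_args_exist_on_ddp_config():
    ddp_config = DDPConfig()
    for f in dataclasses.fields(MegatronFSDPPrecisionArgs):
        # Every adapter argument must exist on DDPConfig under the same name.
        assert hasattr(ddp_config, f.name)


test_precision_args_exist_on_ddp_config()
```

Iterating over `dataclasses.fields(...)` keeps the test in sync automatically if more precision arguments are added later.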

⚠️ For major changes (either in lines of code or in impact), please make sure to first share a design doc with the team. If you're unsure of the best way to do so, contact @mcore-oncall.

Contribution process

Pre-checks

  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code (see Typing guidelines)
  • I have added relevant documentation
  • I have run the autoformatter.sh on my PR

Code review

Feel free to message or tag @mcore-oncall in a comment to help accelerate your merge into main. The less complex your PR is, the faster it will be approved and merged!

All PRs start as draft. If you open a non-draft PR, it will be automatically converted to draft.

Step 1: Mark PR as "Ready for Review"

  1. When your PR is ready, click Ready for Review.
  2. An oncall reviewer is auto-assigned and expert reviewers are notified based on your changes.
    • Some PRs may jump straight to step 2. This is determined by .github/CODEOWNERS.

⚠️ Only mark as ready once merge-conflicts are resolved and the CI is passing.
Final Review might get declined if these requirements are not fulfilled.

Step 2: Final Review

For PRs that change megatron/core, once all expert reviewers have approved, the Final Review label is applied automatically and final reviewers are assigned.

For PRs outside megatron/core, this step is skipped.

Step 3: Approved

Once all required reviewers have approved, the Approved label is applied automatically.

Merge

Any member of mcore-engineers will be able to merge your PR.

For MRs into the `dev` branch: the proposed review process for the `dev` branch is under active discussion.

MRs are mergeable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

@cspades cspades self-assigned this Mar 17, 2026
@cspades cspades requested review from a team as code owners March 17, 2026 19:36
@svcnvidia-nemo-ci svcnvidia-nemo-ci marked this pull request as draft March 17, 2026 19:36
@github-actions

This PR has been automatically converted to draft because all PRs must start as drafts.

When you are ready for review, click Ready for Review to begin the review process. This will:

  1. Add the oncall reviewer (optional reviewer)
  2. Add required review teams based on your changes

See the contribution guide for more details.


copy-pr-bot bot commented Mar 17, 2026

Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.

Contributors can view more details about this message here.

@svcnvidia-nemo-ci svcnvidia-nemo-ci added this to the Core 0.16 milestone Mar 17, 2026
@cspades cspades marked this pull request as ready for review March 17, 2026 19:44
@cspades cspades added the Expert Review [deprecated] Apply this label to indicate that your PR is ready for expert review. label Mar 17, 2026
@svcnvidia-nemo-ci svcnvidia-nemo-ci requested a review from a team March 17, 2026 19:45
@cspades cspades requested a review from a team as a code owner March 17, 2026 20:52
@cspades cspades force-pushed the cye/mfsdp_mixprec_to_ddp branch from 173b7c4 to 1d55b0c Compare March 17, 2026 21:06
@cspades cspades requested a review from a team as a code owner March 17, 2026 21:06
Diff excerpt under review:

        to minimize the registration time.
        """

    megatron_fsdp_main_params_dtype: Optional[torch.dtype] = torch.float32
A reviewer (Contributor) commented:

Why not just add a MixedPrecisionPolicy field instead of duplicating?

@cspades (Member, Author) replied Mar 17, 2026:

Good question! MixedPrecisionPolicy also works without Megatron, i.e. with the fully_shard API, which native Torch users and NeMo Automodel can use without installing anything else. PyPI: https://pypi.org/project/megatron-fsdp/

Nesting that dataclass in the DDPConfig while also using it as a standalone config could be quite confusing. The only reason I'm including these arguments in DDPConfig at all is Megatron-Bridge conventions, and I prefer not to make it more complicated. One complexity is obvious: users would need to import Megatron-FSDP sub-modules (megatron_fsdp.MixedPrecisionPolicy) just to use the DDPConfig, which is unnecessary and possibly circular.
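The trade-off described above can be illustrated with a sketch: flattened dtype fields live directly on DDPConfig, and only the adapter translates them into the keyword arguments a MixedPrecisionPolicy would take, so DDPConfig users never import megatron_fsdp. Field and function names here are hypothetical, and dtypes are strings for the sketch (the real config uses torch.dtype):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DDPConfig:
    # Flattened mixed-precision fields (hypothetical names).
    megatron_fsdp_param_dtype: Optional[str] = None
    megatron_fsdp_reduce_dtype: Optional[str] = None
    megatron_fsdp_main_params_dtype: Optional[str] = "float32"


def build_mixed_precision_kwargs(cfg: DDPConfig) -> dict:
    """Hypothetical adapter helper: translate flattened DDPConfig fields into
    the kwargs a MixedPrecisionPolicy-like object would take, keeping any
    megatron_fsdp import local to the adapter rather than to DDPConfig users."""
    return {
        "param_dtype": cfg.megatron_fsdp_param_dtype,
        "reduce_dtype": cfg.megatron_fsdp_reduce_dtype,
        "main_params_dtype": cfg.megatron_fsdp_main_params_dtype,
    }
```

With this shape, a nested MixedPrecisionPolicy field never appears in DDPConfig's public surface, avoiding the import-cycle concern raised above.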

…o DDPConfig.

Signed-off-by: Cory Ye <cye@nvidia.com>
@cspades cspades force-pushed the cye/mfsdp_mixprec_to_ddp branch from 213b912 to baf59f1 Compare March 18, 2026 01:59
@cspades cspades added Final Review PR is in the "final review" stage and removed Expert Review [deprecated] Apply this label to indicate that your PR is ready for expert review. labels Mar 18, 2026
@svcnvidia-nemo-ci svcnvidia-nemo-ci removed the Final Review PR is in the "final review" stage label Mar 18, 2026

5 participants