[Auto-parallel] fix loss base of sharding fuse optimize test#10863

Closed
waliwali777 wants to merge 1 commit into PaddlePaddle:develop from waliwali777:fix_fuse_flag_loss_base

Conversation

@waliwali777
Contributor

Before submitting

  • Lint code. If there are lint issues, please format the code first.

    ```shell
    # Install and register `pre-commit` in the project folder
    pip install pre-commit && pre-commit install

    # Process previously committed code files individually
    pre-commit run --file XXXX.py
    ```

  • Add test cases into the tests folder. If there are codecov issues, please add test cases first.

PR types

Others

PR changes

Others

Description

This fix PR adds handling for a boundary case, which changes the numerics and therefore requires updating the loss_base of the test llm_gpt_dygraph_auto_bs8_fp16_DP2-MP2-PP2_intermediate.
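To illustrate why a numerics-affecting fix forces a loss_base update, here is a minimal sketch of how a CI test might compare an observed loss against a recorded baseline. The function name `check_loss` and the tolerance are illustrative assumptions, not the actual PaddleNLP test harness.

```python
# Hypothetical sketch: comparing a training run's loss against a
# recorded baseline (loss_base). Names and tolerances are assumptions
# for illustration only.

def check_loss(loss: float, loss_base: float, rtol: float = 1e-5) -> bool:
    """Return True if the observed loss matches the recorded baseline
    within a relative tolerance."""
    return abs(loss - loss_base) <= rtol * abs(loss_base)

# If a bug fix (e.g. new boundary-case handling in the sharding fuse
# optimization) legitimately changes the computed loss, the recorded
# loss_base must be updated, or this check fails even though the new
# loss is correct.
```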

@paddle-bot

paddle-bot bot commented Jul 18, 2025

Thanks for your contribution!
