
BugFix: qwen model sequence parallel can not get batch size #11146

Merged

swgu98 merged 1 commit into PaddlePaddle:incubate/paddlenlp-fleety_20250421 from Jason233333:my_feature on Oct 24, 2025
Conversation

@Jason233333 (Contributor)

PR types
BugFix

PR changes
Models

Description
Adds support for qwen model PipelineParallel training in RLInfra and fixes a bug in sequence parallelism where the batch size could not be obtained.
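The PR's diff is not shown on this page, but its original title ("Set default batch_size=1 when empty in sequence parallelism") indicates the shape of the fix. Below is a minimal, hypothetical sketch of that idea; the helper name and signature are assumptions, not code from the PR:

```python
def infer_batch_size(input_ids=None):
    """Hypothetical helper illustrating the fix described by the PR's
    original title: under sequence parallelism the batch dimension may be
    unavailable (inputs flattened or empty), so fall back to batch_size=1
    instead of failing."""
    if input_ids is None or len(input_ids) == 0:
        return 1  # default when the batch size cannot be inferred
    return len(input_ids)  # otherwise, the leading dimension is the batch size
```

For example, `infer_batch_size(None)` returns the default `1`, while a two-row input returns `2`.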


paddle-bot bot commented Oct 22, 2025

Thanks for your contribution!

@Jason233333 force-pushed the my_feature branch 2 times, most recently from 58dff61 to abe0e7f, on October 24, 2025 07:09
@Jason233333 changed the title from "BugFix: Set default batch_size=1 when empty in sequence parallelism" to "BugFix: qwen model sequence parallel can not get batch size" on Oct 24, 2025
@swgu98 merged commit 0175cb4 into PaddlePaddle:incubate/paddlenlp-fleety_20250421 on Oct 24, 2025
6 of 7 checks passed

3 participants