
Adding a new config parameter to combine layers during FSDP #360

Open

tejasnagendra wants to merge 10 commits into main from tejasn/combine_layers

Conversation

@tejasnagendra
Collaborator

Created a new class called MultiBlock, which wraps around multiple Block instances to reduce the number of NCCL communications. The number of blocks to combine can be controlled with the num_blocks_to_combine parameter for GPT/Llama.
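A minimal sketch of the idea in PyTorch (assumptions: the repo's actual Block and MultiBlock definitions differ; the residual-MLP Block stand-in and the group_blocks helper here are hypothetical illustrations):

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """Hypothetical stand-in for one transformer block; the real Block
    has attention + MLP, but a residual projection suffices here."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        return x + self.proj(x)

class MultiBlock(nn.Module):
    """Runs several consecutive Blocks as one module, so FSDP treats
    them as a single shard/all-gather unit (fewer, larger NCCL messages)."""
    def __init__(self, blocks):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)

    def forward(self, x):
        for block in self.blocks:
            x = block(x)
        return x

def group_blocks(blocks, num_blocks_to_combine):
    """Hypothetical helper: chunk the layer list into MultiBlocks."""
    return nn.ModuleList(
        MultiBlock(blocks[i:i + num_blocks_to_combine])
        for i in range(0, len(blocks), num_blocks_to_combine)
    )
```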

Ideally, the message size should be around 1 GB to get the best performance. But when models run across multiple nodes, every layer is split into very small chunks, so messages become extremely small and network bandwidth is used suboptimally. This parameter can be tuned to ensure we send larger messages.
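For example, wrapping at MultiBlock granularity rather than per Block (a sketch using PyTorch FSDP's ModuleWrapPolicy; the actual integration in this PR may differ, and this snippet assumes torch.distributed has already been initialized):

```python
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp.wrap import ModuleWrapPolicy

# With num_blocks_to_combine=4, a 32-layer model yields 8 MultiBlocks,
# so each all-gather moves ~4x more parameters per NCCL call.
blocks = [Block(dim=4096) for _ in range(32)]
model = nn.Sequential(*group_blocks(blocks, num_blocks_to_combine=4))
sharded = FSDP(model, auto_wrap_policy=ModuleWrapPolicy({MultiBlock}))
```

Since the per-rank message size scales with the parameter count of each wrap unit divided by the number of ranks, combining blocks grows the message size linearly with num_blocks_to_combine, which is how you push back toward the ~1 GB target on large multi-node runs.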

@google-cla

google-cla bot commented Jan 30, 2024

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

