
Added torchrun compatibility for distributed training across multiple GPUs in a single node (single instance) #1519

Triggered via pull request August 7, 2024 22:10
@sage-maker
synchronize #4766
Status Cancelled
Total duration 14m 30s
Artifacts

codebuild-ci.yml

on: pull_request_target
collab-check — 2s
wait-for-approval — 0s
codestyle-doc-tests — 0s
integ-tests — 0s
Matrix: unit-tests

Annotations

1 error
wait-for-approval
The run was canceled by @sage-maker.