Commit d6928f1

[TorchComm] Remove async TP and CP from known issue list (#1926)
As the title says: we verified that both CP and async TP are working with our latest fix to torchcomms, so this PR updates the README accordingly.
1 parent 025c21b commit d6928f1
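For context, the features this commit un-lists as known issues are typically toggled in a torchtitan job config. The fragment below is a minimal sketch only; the section and option names are assumptions based on torchtitan's TOML layout, not taken from this commit, so check the current torchtitan config reference before use.

```toml
# Hypothetical torchtitan job-config fragment (option names assumed).
[parallelism]
tensor_parallel_degree = 2           # shard attention/MLP weights across 2 ranks (TP)
enable_async_tensor_parallel = true  # overlap TP communication with compute (async TP)
context_parallel_degree = 2          # shard the sequence dimension across 2 ranks (CP)
```

With a layout like this, the degrees multiply into the total parallel world size, so the two settings above would already require 4 ranks before adding data or pipeline parallelism.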

File tree

1 file changed (+2, -2 lines)


torchtitan/experiments/torchcomms/README.md

Lines changed: 2 additions & 2 deletions
@@ -25,8 +25,10 @@ Locally tested with:
 - **FSDP** (`fully_shard`) - Fully Sharded Data Parallel
 - **TP** - Tensor Parallelism
 - **PP** - Pipeline Parallelism
+- **CP** - Context Parallelism
 - **EP** - Expert Parallelism
 - **compile** - `torch.compile` integration
+- **Async TP** - Async TP integration
 
 ### Performance
 
@@ -46,8 +48,6 @@ Locally tested with:
 
 ### Known Issues
 
-- **CP** (Context Parallelism) - Temporarily not working. Work in progress.
-- **Async TP** - Temporarily not working. Work in progress.
 - **Memory Overhead** - TorchComms requires higher peak memory usage. As a workaround, we need to reduce `local_batch_size` to avoid out of memory error.
 
 ## Roadmap
