This repository was archived by the owner on Nov 3, 2023. It is now read-only.

Commit e266f54

Rename repo to ray_lightning (#19)

1 parent: 562b63b

File tree

1 file changed: +2, −2 lines


README.md

Lines changed: 2 additions & 2 deletions
@@ -8,9 +8,9 @@ Once you add your plugin to the PyTorch Lightning Trainer, you can parallelize t
 This library also comes with an integration with [Ray Tune](tune.io) for distributed hyperparameter tuning experiments.
 
 ## Installation
-You can install the master branch of ray_lightning_accelerators like so:
+You can install the master branch of ray_lightning like so:
 
-`pip install git+https://github.com/ray-project/ray_lightning_accelerators#ray_lightning`
+`pip install git+https://github.com/ray-project/ray_lightning#ray_lightning`
 
 ## PyTorch Distributed Data Parallel Plugin on Ray
 The `RayPlugin` provides Distributed Data Parallel training on a Ray cluster. PyTorch DDP is used as the distributed training protocol, and Ray is used to launch and manage the training worker processes.
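For context, the `RayPlugin` referenced in the README diff above is attached to a standard PyTorch Lightning `Trainer`. The sketch below illustrates that wiring; the constructor arguments (`num_workers`, `use_gpu`) follow the project's README from this era, and `MyLightningModule` is a hypothetical placeholder, not part of this commit.

```python
# Minimal sketch of using RayPlugin with PyTorch Lightning (assumptions noted in comments).
import pytorch_lightning as pl
import ray

from ray_lightning import RayPlugin  # package renamed to ray_lightning in this commit

ray.init()  # connect to an existing Ray cluster, or start one locally

# MyLightningModule is a hypothetical LightningModule defined elsewhere.
model = MyLightningModule()

# Assumed constructor arguments: 4 DDP worker processes, CPU-only training.
plugin = RayPlugin(num_workers=4, use_gpu=False)
trainer = pl.Trainer(max_epochs=1, plugins=[plugin])
trainer.fit(model)  # Ray launches and manages the training worker processes
```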
