Commit 4b7f78e

Add deprecation warning & test for distributed_backend flag (#8575)
Co-authored-by: Adrian Wälchli <[email protected]>
Parent: 4605e8a · commit 4b7f78e

File tree: 4 files changed, +16 −1 lines

CHANGELOG.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -173,6 +173,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Deprecated the `Trainer.disable_validation` property in favor of `not Trainer.enable_validation` ([#8291](https://github.com/PyTorchLightning/pytorch-lightning/pull/8291))
 - Deprecated `mode` parameter in `ModelSummary` in favor of `max_depth` ([#8062](https://github.com/PyTorchLightning/pytorch-lightning/pull/8062))
 - Deprecated `reload_dataloaders_every_epoch` argument of `Trainer` in favor of `reload_dataloaders_every_n_epochs` ([#5043](https://github.com/PyTorchLightning/pytorch-lightning/pull/5043))
+- Deprecated `distributed_backend` argument for `Trainer` ([#8575](https://github.com/PyTorchLightning/pytorch-lightning/pull/8575))


 ### Removed
```

pytorch_lightning/trainer/connectors/accelerator_connector.py

Lines changed: 8 additions & 0 deletions

```diff
@@ -88,6 +88,7 @@ def __init__(
         tpu_cores,
         ipus,
         distributed_backend,
+        accelerator,
         gpus,
         gpu_ids,
         num_nodes,
@@ -105,6 +106,13 @@ def __init__(
         self._distrib_type = None
         self._accelerator_type = None

+        if distributed_backend is not None:
+            rank_zero_deprecation(
+                f"`Trainer(distributed_backend={distributed_backend})` has been deprecated and will be removed in v1.5."
+                f" Use `Trainer(accelerator={distributed_backend})` instead."
+            )
+        distributed_backend = distributed_backend or accelerator
+
         self.num_processes = num_processes
         self.devices = devices
         # `gpus` is the input passed to the Trainer, whereas `gpu_ids` is a list of parsed gpu ids.
```
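The hunk above warns once on the old flag and then falls back so both spellings keep working during the deprecation window. A minimal self-contained sketch of that pattern, using the standard library's `warnings` module in place of Lightning's `rank_zero_deprecation` helper (the function name `resolve_accelerator` is hypothetical, for illustration only):

```python
import warnings


def resolve_accelerator(distributed_backend=None, accelerator=None):
    """Mirror the connector logic: warn if the deprecated flag is set,
    then let it take precedence over the replacement flag."""
    if distributed_backend is not None:
        warnings.warn(
            f"`Trainer(distributed_backend={distributed_backend})` has been deprecated"
            f" and will be removed in v1.5."
            f" Use `Trainer(accelerator={distributed_backend})` instead.",
            DeprecationWarning,
        )
    # Old flag wins when both are given; otherwise use the new one.
    return distributed_backend or accelerator
```

Note the fallback line sits outside the `if`, so callers who already pass `accelerator` get the new value silently, with no warning.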

pytorch_lightning/trainer/trainer.py

Lines changed: 2 additions & 1 deletion

```diff
@@ -342,7 +342,7 @@ def __init__(
         super().__init__()
         Trainer._log_api_event("init")
         self.state = TrainerState()
-        distributed_backend = distributed_backend or accelerator
+
         gpu_ids, tpu_cores = self._parse_devices(gpus, auto_select_gpus, tpu_cores)

         # init connectors
@@ -357,6 +357,7 @@ def __init__(
             tpu_cores,
             ipus,
             distributed_backend,
+            accelerator,
             gpus,
             gpu_ids,
             num_nodes,
```

tests/deprecated_api/test_remove_1-5.py

Lines changed: 5 additions & 0 deletions

```diff
@@ -347,3 +347,8 @@ def test_v1_5_0_deepspeed_cpu_offload(tmpdir, params):

     with pytest.deprecated_call(match="is deprecated since v1.4 and will be removed in v1.5"):
         DeepSpeedPlugin(**params)
+
+
+def test_v1_5_0_distributed_backend_trainer_flag():
+    with pytest.deprecated_call(match="has been deprecated and will be removed in v1.5."):
+        Trainer(distributed_backend="ddp_cpu")
```
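The new test relies on `pytest.deprecated_call`, a context manager that fails the test unless the enclosed code emits a `DeprecationWarning` (or `PendingDeprecationWarning`) whose message matches the `match` regex. A self-contained sketch of the same testing pattern, with a toy `old_api` function standing in for the deprecated `Trainer` flag:

```python
import warnings

import pytest


def old_api(value):
    # Toy stand-in for a deprecated entry point.
    warnings.warn(
        "old_api has been deprecated and will be removed in a future release.",
        DeprecationWarning,
    )
    return value


def test_old_api_warns():
    # Fails if no matching DeprecationWarning is raised inside the block.
    with pytest.deprecated_call(match="has been deprecated and will be removed"):
        old_api(42)
```

Matching on a message fragment rather than the full string keeps the test robust if wording details (such as the exact removal version) change later.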
