Commit cdbddbe

release 1.1.0 (#5048)
* release 1.1.0
* pep8
1 parent 05f25f3 commit cdbddbe

File tree: 4 files changed (+12, -87 lines changed)

CHANGELOG.md

Lines changed: 8 additions & 83 deletions
@@ -5,148 +5,73 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 
-## [1.1.0rc2] - 2020-12-02
+## [1.1.0] - 2020-12-09
 
 ### Added
 
 - Added "monitor" key to saved `ModelCheckpoints` ([#4383](https://github.com/PyTorchLightning/pytorch-lightning/pull/4383))
-
-
 - Added `ConfusionMatrix` class interface ([#4348](https://github.com/PyTorchLightning/pytorch-lightning/pull/4348))
-
-
 - Added multiclass AUROC metric ([#4236](https://github.com/PyTorchLightning/pytorch-lightning/pull/4236))
-
-
 - Added global step indexing to the checkpoint name for a better sub-epoch checkpointing experience ([#3807](https://github.com/PyTorchLightning/pytorch-lightning/pull/3807))
-
-
 - Added optimizer hooks in callbacks ([#4379](https://github.com/PyTorchLightning/pytorch-lightning/pull/4379))
-
-
 - Added option to log momentum ([#4384](https://github.com/PyTorchLightning/pytorch-lightning/pull/4384))
-
-
 - Added `current_score` to `ModelCheckpoint.on_save_checkpoint` ([#4721](https://github.com/PyTorchLightning/pytorch-lightning/pull/4721))
-
-
 - Added logging using `self.log` in train and evaluation for epoch end hooks (
     [#4552](https://github.com/PyTorchLightning/pytorch-lightning/pull/4552),
     [#4495](https://github.com/PyTorchLightning/pytorch-lightning/pull/4495),
     [#4439](https://github.com/PyTorchLightning/pytorch-lightning/pull/4439))
     [#4684](https://github.com/PyTorchLightning/pytorch-lightning/pull/4684))
     [#4913](https://github.com/PyTorchLightning/pytorch-lightning/pull/4913))
-
-
 - Added ability for DDP plugin to modify optimizer state saving ([#4675](https://github.com/PyTorchLightning/pytorch-lightning/pull/4675))
-
-
-- Updated `fast_dev_run` to accept integer representing num_batches ([#4629](https://github.com/PyTorchLightning/pytorch-lightning/pull/4629))
-
-
 - Added casting to python types for numpy scalars when logging hparams ([#4647](https://github.com/PyTorchLightning/pytorch-lightning/pull/4647))
-
-
 - Added `prefix` argument in loggers ([#4557](https://github.com/PyTorchLightning/pytorch-lightning/pull/4557))
-
-
 - Added printing of total num of params, trainable and non-trainable params in ModelSummary ([#4521](https://github.com/PyTorchLightning/pytorch-lightning/pull/4521))
-
-
-- Added optimizer refactors ([#4658](https://github.com/PyTorchLightning/pytorch-lightning/pull/4658))
-
-
 - Added `PrecisionRecallCurve, ROC, AveragePrecision` class metric ([#4549](https://github.com/PyTorchLightning/pytorch-lightning/pull/4549))
-
-
 - Added custom `Apex` and `NativeAMP` as `Precision plugins` ([#4355](https://github.com/PyTorchLightning/pytorch-lightning/pull/4355))
-
-
 - Added `DALI MNIST` example ([#3721](https://github.com/PyTorchLightning/pytorch-lightning/pull/3721))
-
-
 - Added `sharded plugin` for DDP for multi-gpu training memory optimizations (
-    [#4639](https://github.com/PyTorchLightning/pytorch-lightning/pull/4639),
-    [#4686](https://github.com/PyTorchLightning/pytorch-lightning/pull/4686),
-    [#4675](https://github.com/PyTorchLightning/pytorch-lightning/pull/4675),
-    [#4737](https://github.com/PyTorchLightning/pytorch-lightning/pull/4737),
-    [#4773](https://github.com/PyTorchLightning/pytorch-lightning/pull/4773))
-
-
+    [#4639](https://github.com/PyTorchLightning/pytorch-lightning/pull/4639),
+    [#4686](https://github.com/PyTorchLightning/pytorch-lightning/pull/4686),
+    [#4675](https://github.com/PyTorchLightning/pytorch-lightning/pull/4675),
+    [#4737](https://github.com/PyTorchLightning/pytorch-lightning/pull/4737),
+    [#4773](https://github.com/PyTorchLightning/pytorch-lightning/pull/4773))
 - Added `experiment_id` to the NeptuneLogger ([#3462](https://github.com/PyTorchLightning/pytorch-lightning/pull/3462))
-
-
 - Added `Pytorch Geometric` integration example with Lightning ([#4568](https://github.com/PyTorchLightning/pytorch-lightning/pull/4568))
-
-
 - Added `all_gather` method to `LightningModule` which allows gradient based tensor synchronizations for use-cases such as negative sampling. ([#5012](https://github.com/PyTorchLightning/pytorch-lightning/pull/5012))
-
-
 - Enabled `self.log` in most functions ([#4969](https://github.com/PyTorchLightning/pytorch-lightning/pull/4969))
-
-
 - Added changeable extension variable for `ModelCheckpoint` ([#4977](https://github.com/PyTorchLightning/pytorch-lightning/pull/4977))
 
 
 ### Changed
 
 - Removed `multiclass_roc` and `multiclass_precision_recall_curve`, use `roc` and `precision_recall_curve` instead ([#4549](https://github.com/PyTorchLightning/pytorch-lightning/pull/4549))
-
-
-
 - Tuner algorithms will be skipped if `fast_dev_run=True` ([#3903](https://github.com/PyTorchLightning/pytorch-lightning/pull/3903))
-
-
-
 - WandbLogger does not force wandb `reinit` arg to True anymore and creates a run only when needed ([#4648](https://github.com/PyTorchLightning/pytorch-lightning/pull/4648))
-
-
 - Changed `automatic_optimization` to be a model attribute ([#4602](https://github.com/PyTorchLightning/pytorch-lightning/pull/4602))
-
-
 - Changed `Simple Profiler` report to order by percentage time spent + num calls ([#4880](https://github.com/PyTorchLightning/pytorch-lightning/pull/4880))
-
-
 - Simplify optimization Logic ([#4984](https://github.com/PyTorchLightning/pytorch-lightning/pull/4984))
-
-
 - Classification metrics overhaul ([#4837](https://github.com/PyTorchLightning/pytorch-lightning/pull/4837))
+- Updated `fast_dev_run` to accept integer representing num_batches ([#4629](https://github.com/PyTorchLightning/pytorch-lightning/pull/4629))
+- Refactored optimizer ([#4658](https://github.com/PyTorchLightning/pytorch-lightning/pull/4658))
 
 
 ### Deprecated
 
 - Deprecated `prefix` argument in `ModelCheckpoint` ([#4765](https://github.com/PyTorchLightning/pytorch-lightning/pull/4765))
-
-
 - Deprecated the old way of assigning hyper-parameters through `self.hparams = ...` ([#4813](https://github.com/PyTorchLightning/pytorch-lightning/pull/4813))
-
-
 - Deprecated `mode='auto'` from `ModelCheckpoint` and `EarlyStopping` ([#4695](https://github.com/PyTorchLightning/pytorch-lightning/pull/4695))
 
-
 ### Removed
 
 - Removed `reorder` parameter of the `auc` metric ([#5004](https://github.com/PyTorchLightning/pytorch-lightning/pull/5004))
 
-
-
 ### Fixed
 
 - Added feature to move tensors to CPU before saving ([#4309](https://github.com/PyTorchLightning/pytorch-lightning/pull/4309))
-
-
 - Fixed `LoggerConnector` to have logged metrics on root device in DP ([#4138](https://github.com/PyTorchLightning/pytorch-lightning/pull/4138))
-
-
 - Auto convert tensors to contiguous format when `gather_all` ([#4907](https://github.com/PyTorchLightning/pytorch-lightning/pull/4907))
-
-
 - Fixed `PYTHONPATH` for ddp test model ([#4528](https://github.com/PyTorchLightning/pytorch-lightning/pull/4528))
-
-
 - Fixed allowing logger to support indexing ([#4595](https://github.com/PyTorchLightning/pytorch-lightning/pull/4595))
-
-
 - Fixed DDP and manual_optimization ([#4976](https://github.com/PyTorchLightning/pytorch-lightning/pull/4976))
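
As a quick, non-authoritative illustration of two of the 1.1.0 entries above (`fast_dev_run` accepting an integer number of batches, #4629, and `self.log` working in epoch-end hooks, #4552/#4969), here is a minimal sketch of how these features might be used. The module, data, and metric names below are made up for the example and are not part of this commit.

```python
# Minimal sketch against the pytorch_lightning 1.1.0 API; names are illustrative only.
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class TinyRegressor(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        self.log("train_loss", loss)  # per-step logging
        return loss

    def training_epoch_end(self, outputs):
        # Logging inside epoch-end hooks is one of the 1.1.0 additions noted above.
        losses = [o["loss"] if isinstance(o, dict) else o for o in outputs]
        self.log("train_loss_epoch", torch.stack(losses).mean())

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)


if __name__ == "__main__":
    dataset = TensorDataset(torch.randn(64, 8), torch.randn(64, 1))
    # fast_dev_run can now be an int: run exactly 5 training batches and exit.
    trainer = pl.Trainer(fast_dev_run=5)
    trainer.fit(TinyRegressor(), DataLoader(dataset, batch_size=8))
```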

pytorch_lightning/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 """Root package info."""
 
-__version__ = '1.1.0rc2'
+__version__ = '1.1.0'
 __author__ = 'William Falcon et al.'
 __author_email__ = '[email protected]'
 __license__ = 'Apache-2.0'

pytorch_lightning/accelerators/accelerator.py

Lines changed: 0 additions & 3 deletions
@@ -254,6 +254,3 @@ def block_ddp_plugin_sync_behaviour(self):
         """
         cm = self.ddp_plugin.block_backward_sync(self.trainer.model) if self.ddp_plugin else None
         yield cm
-
-
-
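
For context on the method trimmed above: `block_ddp_plugin_sync_behaviour` is a generator that yields the DDP plugin's `block_backward_sync` context manager, or `None` when no plugin is configured. A hedged, generic sketch of that pattern follows; the `Dummy*` classes are illustrative stand-ins, not Lightning's actual classes.

```python
from contextlib import contextmanager


class DummyDDPPlugin:
    @contextmanager
    def block_backward_sync(self, model):
        # In real DDP this would wrap torch's no_sync(); here it is a no-op stand-in.
        yield model


class DummyAccelerator:
    def __init__(self, ddp_plugin=None, model=None):
        self.ddp_plugin = ddp_plugin
        self.model = model

    @contextmanager
    def block_ddp_plugin_sync_behaviour(self):
        # Same shape as the method above: yield the plugin's context manager
        # when a DDP plugin is configured, otherwise yield None.
        cm = self.ddp_plugin.block_backward_sync(self.model) if self.ddp_plugin else None
        yield cm


# Usage: the caller receives the inner context manager and may enter it itself.
acc = DummyAccelerator(ddp_plugin=DummyDDPPlugin(), model=object())
with acc.block_ddp_plugin_sync_behaviour() as cm:
    if cm is not None:
        with cm:
            pass  # e.g. run a backward pass without gradient synchronization
```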

pytorch_lightning/setup_tools.py

Lines changed: 3 additions & 0 deletions
@@ -161,6 +161,9 @@ def _load_long_description(path_dir: str) -> str:
     path_readme = os.path.join(path_dir, "README.md")
     text = open(path_readme, encoding="utf-8").read()
 
+    # drop images from readme
+    text = text.replace('![PT to PL](docs/source/_images/general/pl_quick_start_full_compressed.gif)', '')
+
     # https://github.com/PyTorchLightning/pytorch-lightning/raw/master/docs/source/_images/lightning_module/pt_to_pl.png
     github_source_url = os.path.join(__homepage__, "raw", __version__)
     # replace relative repository path to absolute link to the release
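
The hunk above drops a GIF from the README text before packaging; the surrounding lines build an absolute `raw` URL for the tagged release and rewrite relative repository paths against it. A hedged, standalone sketch of that rewriting idea follows; the function name and regex are illustrative assumptions, not the project's actual implementation.

```python
import os
import re


def absolutize_readme_images(text: str, homepage: str, version: str) -> str:
    """Rewrite relative image links like (docs/source/_images/x.png) to absolute release URLs."""
    github_source_url = os.path.join(homepage, "raw", version)
    return re.sub(
        r"\((docs/source/_images/[^)]+)\)",
        lambda m: "({}/{})".format(github_source_url, m.group(1)),
        text,
    )


# Example: a relative README link becomes an absolute link pinned to the release tag.
readme = "![PT to PL](docs/source/_images/lightning_module/pt_to_pl.png)"
print(absolutize_readme_images(readme, "https://github.com/PyTorchLightning/pytorch-lightning", "1.1.0"))
```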
