Commit 4d9db86

carmocca and Borda authored

Prepare 1.1.3 release (#5365)

* Prepare 1.1.3 release
* Fix flake8 error
* suppress
* Remove 1.1.4 section
* Add missing commits to CHANGELOG
* Update PR template
* Add missing commit
* fix
* Update CHANGELOG.md
* Apply suggestions from code review

Co-authored-by: Jirka Borovec <[email protected]>
1 parent 6536ea4 commit 4d9db86

File tree

6 files changed: +32 additions, -31 deletions


.github/PULL_REQUEST_TEMPLATE.md

Lines changed: 8 additions & 8 deletions
@@ -16,26 +16,26 @@ If we didn't discuss your PR in Github issues there's a high chance it will not
 Fixes # (issue) <- this [links related issue to this PR](https://docs.github.com/en/free-pro-team@latest/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword)
 
 ## Before submitting
-- [ ] Was this discussed/approved via a Github issue? (no need for typos and docs improvements)
-- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md), Pull Request section?
+- [ ] Was this discussed/approved via a GitHub issue? (not for typos and docs)
+- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md), **Pull Request** section?
 - [ ] Did you make sure your PR does only one thing, instead of bundling different changes together?
-- [ ] Did you make sure to update the documentation with your changes [if needed]?
-- [ ] Did you write any new necessary tests [no need for typos, docs]?
+- [ ] Did you make sure to update the documentation with your changes? (if necessary)
+- [ ] Did you write any new necessary tests? (not for typos and docs)
 - [ ] Did you verify new and existing tests pass locally with your changes?
-- [ ] If you made a notable change (that affects users), did you update the [CHANGELOG](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)?
+- [ ] Did you update the [CHANGELOG](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)? (not for typos, docs, test updates, or internal minor changes/refactorings)
 
 <!-- For CHANGELOG separate each item in the unreleased section by a blank line to reduce collisions -->
 
 ## PR review
 Anyone in the community is free to review the PR once the tests have passed.
-Before you start reviewing make sure you have read [Review guidelines](https://github.com/PyTorchLightning/pytorch-lightning/wiki/Review-guidelines). In short, see the following bullet-list:
+Before you start reviewing make sure you have read [Review guidelines](https://github.com/PyTorchLightning/pytorch-lightning/wiki/Review-guidelines). In short, see the following bullet-list:
 
 - [ ] Is this pull request ready for review? (if not, please submit in draft mode)
 - [ ] Check that all items from **Before submitting** are resolved
 - [ ] Make sure the title is self-explanatory and the description concisely explains the PR
 - [ ] Add labels and milestones (and optionally projects) to the PR so it can be classified
-- [ ] **Check that target branch and milestone are aligned!**
-
+- [ ] **Check that target branch and milestone match!**
+
 
 ## Did you have fun?
 Make sure you had fun coding 🙃

CHANGELOG.md

Lines changed: 13 additions & 16 deletions
@@ -9,28 +9,25 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ### Added
 
-- Added a check for optimizer attached to lr_scheduler ([#5338](https://github.com/PyTorchLightning/pytorch-lightning/pull/5338))
-
-- Added `resume_from_checkpoint` accept non-existing file path ([#4402](https://github.com/PyTorchLightning/pytorch-lightning/pull/4402))
-
+- Added a check for optimizer attached to `lr_scheduler` ([#5338](https://github.com/PyTorchLightning/pytorch-lightning/pull/5338))
+- Added support for passing non-existing filepaths to `resume_from_checkpoint` ([#4402](https://github.com/PyTorchLightning/pytorch-lightning/pull/4402))
 
 ### Changed
 
-
-### Deprecated
-
-
-### Removed
-
-
-### Fixed
-
-- Skip restore from `resume_from_checkpoint` in while `testing` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
-
+- Skip restore from `resume_from_checkpoint` while `testing` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
 - Allowed `log_momentum` for adaptive optimizers in `LearningRateMonitor` ([#5333](https://github.com/PyTorchLightning/pytorch-lightning/pull/5333))
+- Disabled checkpointing, earlystopping and logging with `fast_dev_run` ([#5277](https://github.com/PyTorchLightning/pytorch-lightning/pull/5277))
+- Distributed group defaults to `WORLD` if `None` ([#5125](https://github.com/PyTorchLightning/pytorch-lightning/pull/5125))
 
-- Disabled checkpointing, earlystopping and logger with `fast_dev_run` ([#5277](https://github.com/PyTorchLightning/pytorch-lightning/pull/5277))
+### Fixed
 
+- Fixed `trainer.test` returning non-test metrics ([#5214](https://github.com/PyTorchLightning/pytorch-lightning/pull/5214))
+- Fixed metric state reset ([#5273](https://github.com/PyTorchLightning/pytorch-lightning/pull/5273))
+- Fixed `--num-nodes` on `DDPSequentialPlugin` ([#5327](https://github.com/PyTorchLightning/pytorch-lightning/pull/5327))
+- Fixed invalid value for `weights_summary` ([#5296](https://github.com/PyTorchLightning/pytorch-lightning/pull/5296))
+- Fixed `Trainer.test` not using the latest `best_model_path` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
+- Fixed existence check for hparams not using underlying filesystem ([#5250](https://github.com/PyTorchLightning/pytorch-lightning/pull/5250))
+- Fixed `LightningOptimizer` AMP bug ([#5191](https://github.com/PyTorchLightning/pytorch-lightning/pull/5191))
 - Fixed casted key to string in `_flatten_dict` ([#5354](https://github.com/PyTorchLightning/pytorch-lightning/pull/5354))
 
 

pl_examples/basic_examples/mnist_datamodule.py

Lines changed: 4 additions & 1 deletion
@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-
+import platform
 from typing import Optional
 
 from torch.utils.data import DataLoader, random_split
@@ -55,6 +55,9 @@ def __init__(
             normalize: If true applies image normalize
         """
         super().__init__(*args, **kwargs)
+        if platform.system() == "Windows":
+            # see: https://stackoverflow.com/a/59680818/4521646
+            num_workers = 0
 
         self.dims = (1, 28, 28)
         self.data_dir = data_dir
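The hunk above hard-codes the Windows workaround inline; the same guard can be factored into a small helper. This is a sketch of the pattern only — the helper name `safe_num_workers` is not part of the repository. The motivation is that on Windows, `DataLoader` worker processes are spawned rather than forked, which can fail to pickle locally defined datasets (the linked Stack Overflow answer), so falling back to `num_workers=0` avoids the crash.

```python
import platform


def safe_num_workers(requested: int) -> int:
    """Return a DataLoader-safe worker count.

    On Windows, multiprocessing workers are spawned rather than forked,
    which can break pickling of locally defined datasets
    (see https://stackoverflow.com/a/59680818/4521646), so fall back to 0.
    """
    return 0 if platform.system() == "Windows" else requested


# On Linux/macOS this keeps the requested value; on Windows it becomes 0.
print(safe_num_workers(4))
```

The trade-off is that `num_workers=0` loads data in the main process, which is slower but always safe.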

pytorch_lightning/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 """Root package info."""
 
-__version__ = '1.1.2'
+__version__ = '1.1.3'
 __author__ = 'William Falcon et al.'
 __author_email__ = '[email protected]'
 __license__ = 'Apache-2.0'

pytorch_lightning/plugins/rpc_plugin.py

Lines changed: 3 additions & 2 deletions
@@ -12,18 +12,19 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import os
+from contextlib import suppress
 from typing import Optional
 
 import torch
 
 from pytorch_lightning.core.lightning import LightningModule
 from pytorch_lightning.plugins.ddp_plugin import DDPPlugin
-from pytorch_lightning.utilities import _module_available, RPC_AVAILABLE
+from pytorch_lightning.utilities import RPC_AVAILABLE
 
 DEFAULT_RPC_TIMEOUT_SEC = 60.
 if RPC_AVAILABLE:
     from torch.distributed import rpc
-    if _module_available("torch.distributed.rpc.constants") and hasattr(torch.distributed.rpc.constants, "DEFAULT_RPC_TIMEOUT_SEC"):
+    with suppress(ModuleNotFoundError, ImportError):
         from torch.distributed.rpc.constants import DEFAULT_RPC_TIMEOUT_SEC
 
 
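Replacing the `_module_available`/`hasattr` check with `contextlib.suppress` is a common pattern for optional imports: set a fallback first, then attempt the import and silently keep the fallback if it fails. A minimal sketch under that assumption — the module name `some_optional_module` is a placeholder, not a real dependency:

```python
from contextlib import suppress

# Fallback value, kept whenever the optional import below fails.
DEFAULT_RPC_TIMEOUT_SEC = 60.0

# suppress() swallows the listed exceptions, so a missing module simply
# leaves the fallback in place instead of crashing at import time.
with suppress(ModuleNotFoundError, ImportError):
    from some_optional_module.constants import DEFAULT_RPC_TIMEOUT_SEC  # placeholder

print(DEFAULT_RPC_TIMEOUT_SEC)  # → 60.0 here, since the placeholder module does not exist
```

Note that `ModuleNotFoundError` is a subclass of `ImportError`, so listing `ImportError` alone would also cover it; the commit lists both explicitly.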
tests/checkpointing/test_model_checkpoint.py

Lines changed: 3 additions & 3 deletions
@@ -11,20 +11,20 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from argparse import Namespace
 import os
-from pathlib import Path
 import pickle
 import platform
 import re
+from argparse import Namespace
+from pathlib import Path
 from unittest import mock
 from unittest.mock import Mock
 
 import cloudpickle
-from omegaconf import Container, OmegaConf
 import pytest
 import torch
 import yaml
+from omegaconf import Container, OmegaConf
 
 import pytorch_lightning as pl
 import tests.base.develop_utils as tutils
