
Commit b2707c9

fix returning returns (#1431)
* returns
* changelog
1 parent 2dec93f commit b2707c9

File tree

3 files changed (+4, -3 lines)


CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -14,6 +14,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Fixed default `DistributedSampler` for DDP training ([#1425](https://github.com/PyTorchLightning/pytorch-lightning/pull/1425))
 - Fixed workers warning not on windows ([#1430](https://github.com/PyTorchLightning/pytorch-lightning/pull/1430))
+- Fixed returning tuple from `run_training_batch` ([#1431](https://github.com/PyTorchLightning/pytorch-lightning/pull/1431))
 
 ## [0.7.2] - 2020-04-07
 

pytorch_lightning/trainer/training_io.py

Lines changed: 1 addition & 1 deletion
@@ -322,7 +322,7 @@ def dump_checkpoint(self):
             checkpoint['hparams_type'] = 'namespace' if is_namespace else 'dict'
         else:
             rank_zero_warn(
-                "Did not find hyperparameters at model.hparams. Saving checkpoint without hyperparameters."
+                "Did not find hyperparameters at model hparams. Saving checkpoint without hyperparameters."
             )
 
         # give the model a chance to add a few things

pytorch_lightning/trainer/training_loop.py

Lines changed: 2 additions & 2 deletions
@@ -532,7 +532,7 @@ def run_training_batch(self, batch, batch_idx):
         all_log_metrics = []
 
         if batch is None:
-            return 0, grad_norm_dic, {}
+            return 0, grad_norm_dic, {}, {}
 
         # Batch start events
         with self.profiler.profile('on_batch_start'):
@@ -542,7 +542,7 @@ def run_training_batch(self, batch, batch_idx):
         if self.is_function_implemented('on_batch_start'):
            response = self.get_model().on_batch_start(batch)
            if response == -1:
-                return -1, grad_norm_dic, {}
+                return -1, grad_norm_dic, {}, {}
 
         splits = [batch]
         if self.truncated_bptt_steps is not None:
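A minimal sketch of why the early exits need four elements, assuming the call site unpacks the result of `run_training_batch` into four variables (the standalone function and the variable names below are illustrative, not the actual Trainer internals): with the old three-element tuple, the unpacking at the call site would raise "ValueError: not enough values to unpack".

    # Illustrative sketch, not the real Trainer code: the early returns must
    # match the arity of the normal return at the end of the method.
    def run_training_batch(batch, batch_idx):
        grad_norm_dic = {}
        if batch is None:
            # Early exit now returns four values, like the normal path below.
            return 0, grad_norm_dic, {}, {}
        # ... the real method would run the optimization step here ...
        all_log_metrics = {'loss': 0.123}
        batch_output = {'loss': 0.123}
        return 0, grad_norm_dic, all_log_metrics, batch_output

    # Hypothetical call site: unpacking four values fails if an early
    # exit yields only three.
    result, grad_norms, log_metrics, output = run_training_batch(None, 0)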

0 commit comments
