Called by Lightning during the training loop. Make sure to use the `@pl.data_loader` decorator; this ensures the function isn't called until the data are actually needed.
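Below is a rough sketch of such a training dataloader hook; the method name, dataset, and batch size are illustrative assumptions rather than part of the text above.

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader


class CoolModel(pl.LightningModule):
    # the decorator defers this call until Lightning actually needs the data
    @pl.data_loader
    def train_dataloader(self):
        # self.train_dataset stands in for whatever Dataset you use
        return DataLoader(self.train_dataset, batch_size=32, shuffle=True)
```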
---

Set up as many optimizers and (optionally) learning rate schedulers as you need. Normally you'd need one, but in the case of GANs or something more esoteric you might have multiple.
Lightning will call `.backward()` and `.step()` on each one in every epoch. If you use 16-bit precision, Lightning handles that as well.
##### Return
List or Tuple - List of optimizers with an optional second list of learning-rate schedulers
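For example, a minimal sketch of this hook (assuming it is the `configure_optimizers` method; the optimizer choice, learning rate, and scheduler below are placeholders):

```python
import torch


# a sketch only: optimizer choice, learning rate, and scheduler are placeholders
def configure_optimizers(self):
    optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
    # a GAN might return two optimizers instead, e.g. [opt_g, opt_d]
    return [optimizer], [scheduler]
```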
---

**OPTIONAL**
If you don't need to validate, you don't need to implement this method.
In this step you'd normally do the forward pass and calculate the loss for a batch. You can also do fancier things like multiple forward passes, calculate accuracy, or save example outputs (using self.experiment or whatever you want). Really, anything you want.
This is most likely the same as your training_step. But unlike training_step, the outputs from here will go to validation_end for collation.
**Params**
| Param | description |
|---|---|
| data_batch | The output of your dataloader. A tensor, tuple or list |
| batch_nb | Integer displaying which batch this is |
| dataloader_i | Integer displaying which dataloader this is |
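As an illustrative sketch of such a step using the params above (the forward pass, the loss function, and the returned dictionary keys are assumptions, not a fixed API):

```python
import torch.nn.functional as F


# illustrative only: the forward pass, loss, and returned keys are placeholders
def validation_step(self, data_batch, batch_nb, dataloader_i):
    x, y = data_batch
    y_hat = self.forward(x)
    loss = F.cross_entropy(y_hat, y)
    acc = (y_hat.argmax(dim=1) == y).float().mean()
    # whatever is returned here is collected and handed to validation_end
    return {'val_loss': loss, 'val_acc': acc}
```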
---
**OPTIONAL**
If you don't need a validation dataset and a validation_step, you don't need to implement this method.
Called by Lightning during the validation loop. Make sure to use the `@pl.data_loader` decorator; this ensures the function isn't called until the data are actually needed.
##### Return
PyTorch DataLoader or a list of PyTorch DataLoaders.
**Example**
```python
@pl.data_loader
def val_dataloader(self):
    loader = torch.utils.data.DataLoader(
        ...  # dataset, batch_size, etc.
    )

    return loader

# can also return multiple dataloaders
@pl.data_loader
def val_dataloader(self):
    return [loader_a, loader_b, ..., loader_n]
```
---
```python
@pl.data_loader
def test_dataloader(self)
```
**OPTIONAL**
If you don't need a test dataset and a test_step, you don't need to implement this method.
Called by Lightning during the test loop. Make sure to use the `@pl.data_loader` decorator; this ensures the function isn't called until the data are actually needed.
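A sketch mirroring the val_dataloader example above; the dataset and batch size are placeholders.

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader


# return the DataLoader Lightning should use during the test loop;
# self.test_dataset stands in for whatever Dataset you use
@pl.data_loader
def test_dataloader(self):
    return DataLoader(self.test_dataset, batch_size=32, shuffle=False)
```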