data module pytorch lightning error #16301
Unanswered
mjurej asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0
GPU available: True (cuda), used: False
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
/home/aul/anaconda3/envs/pl/lib/python3.7/site-packages/pytorch_lightning/trainer/setup.py:178: PossibleUserWarning: GPU available but not used. Set `accelerator` and `devices` using `Trainer(accelerator='gpu', devices=2)`.
  category=PossibleUserWarning,
Running in `fast_dev_run` mode: will run the requested loop using 1 batch(es). Logging and checkpointing is suppressed.
  | Name     | Type           | Params
0 | resnet | ResNet | 21.3 M
1 | accuracy | BinaryAccuracy | 0
21.3 M Trainable params
0 Non-trainable params
21.3 M Total params
85.177 Total estimated model params size (MB)
Traceback (most recent call last):
File "train.py", line 25, in <module>
main()
File "train.py", line 22, in main
trainer.fit(resnet_choco, datamodule=dm)
File "/home/aul/anaconda3/envs/pl/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 604, in fit
self, self._fit_impl, model, train_dataloaders, val_dataloaders, datamodule, ckpt_path
File "/home/aul/anaconda3/envs/pl/lib/python3.7/site-packages/pytorch_lightning/trainer/call.py", line 38, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/aul/anaconda3/envs/pl/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 645, in _fit_impl
self._run(model, ckpt_path=self.ckpt_path)
File "/home/aul/anaconda3/envs/pl/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 1086, in _run
self._log_hyperparams()
File "/home/aul/anaconda3/envs/pl/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 1126, in _log_hyperparams
datamodule_log_hyperparams = self.datamodule._log_hyperparams if self.datamodule is not None else False
AttributeError: 'ChocoDataModule' object has no attribute '_log_hyperparams'
How can I fix this problem? Thanks.
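A common cause of this `AttributeError` is a `LightningDataModule` subclass that defines its own `__init__` but never calls `super().__init__()`, so the attributes the parent class sets (including `_log_hyperparams`) are never created. Assuming `ChocoDataModule` overrides `__init__` (its code is not shown here), the likely fix is to add that call. The mechanism can be sketched with plain Python, using a stand-in base class (the names `Broken`, `Fixed`, and `data_dir` are illustrative, not from the original code):

```python
class Base:
    """Stand-in for LightningDataModule: its __init__ sets internal attributes."""
    def __init__(self):
        self._log_hyperparams = True

class Broken(Base):
    def __init__(self, data_dir):
        # Forgets super().__init__(), so Base's attributes are never set.
        self.data_dir = data_dir

class Fixed(Base):
    def __init__(self, data_dir):
        super().__init__()  # parent attributes get initialized first
        self.data_dir = data_dir

broken = Broken("/tmp/data")
print(hasattr(broken, "_log_hyperparams"))  # False -> AttributeError when accessed

fixed = Fixed("/tmp/data")
print(fixed._log_hyperparams)  # True
```

Applied to the traceback above, that would mean adding `super().__init__()` as the first line of `ChocoDataModule.__init__`. Another possibility, if the datamodule predates the installed Lightning release, is a version mismatch between the datamodule code and `pytorch_lightning`.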