Notebook: 01_pytorch_workflow.ipynb | Training and Testing in the same for loop #603
-
Hi there, in the notebook:
Don't we want to train the model, calculate the loss, do the backpropagation, and update the parameters first, and then run another loop to test the model?
-
Hi @bhandari-nitin ,
In this case, we test the model every epoch.
As in, do a forward pass on all the training data, calculate the loss, backpropagate and step the optimizer (steps 1-5).
After those steps are done, we turn the model into eval() mode with model.eval() and then perform a forward pass on the test data and calculate the loss.
Although these steps are in the same for loop, the test calculations do not interfere with the training calculations.
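For reference, here's a minimal sketch of that pattern (not the exact notebook code; the straight-line data, `nn.Linear` model, loss function and hyperparameters below are just illustrative stand-ins):

```python
import torch
from torch import nn

# Illustrative setup: simple straight-line data and a linear model
torch.manual_seed(42)
X = torch.arange(0, 1, 0.02).unsqueeze(dim=1)
y = 0.7 * X + 0.3
X_train, y_train = X[:40], y[:40]
X_test, y_test = X[40:], y[40:]

model = nn.Linear(in_features=1, out_features=1)
loss_fn = nn.L1Loss()
optimizer = torch.optim.SGD(params=model.parameters(), lr=0.01)

epochs = 100

for epoch in range(epochs):
    ### Training (steps 1-5)
    model.train()                    # put model in training mode
    y_pred = model(X_train)          # 1. forward pass on the training data
    loss = loss_fn(y_pred, y_train)  # 2. calculate the loss
    optimizer.zero_grad()            # 3. zero the gradients
    loss.backward()                  # 4. backpropagation
    optimizer.step()                 # 5. update the parameters

    ### Testing (same for loop, but no gradient tracking)
    model.eval()                     # put model in evaluation mode
    with torch.inference_mode():     # disable gradient tracking for the test pass
        test_pred = model(X_test)
        test_loss = loss_fn(test_pred, y_test)

    if epoch % 10 == 0:
        print(f"Epoch: {epoch} | Train loss: {loss:.4f} | Test loss: {test_loss:.4f}")
```

Because the test forward pass happens under `torch.inference_mode()` (you could also use `torch.no_grad()`), it never adds to the gradients the optimizer uses, which is why testing inside the same loop doesn't affect training.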
You can test the model outside the training loop (e.g. only test every 5 epochs) by creating a specific function to test the model.
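If you did want to go that route, one possible way to structure it is a small helper that you only call every few epochs. This is a hypothetical sketch (the `test_model` name and the every-5-epochs interval are made up for illustration, and it reuses the setup from the sketch above):

```python
def test_model(model, loss_fn, X_test, y_test):
    """Run a single evaluation pass and return the test loss."""
    model.eval()
    with torch.inference_mode():
        test_pred = model(X_test)
        test_loss = loss_fn(test_pred, y_test)
    return test_loss

for epoch in range(epochs):
    # Training steps 1-5, same as before
    model.train()
    y_pred = model(X_train)
    loss = loss_fn(y_pred, y_train)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Only test every 5 epochs
    if epoch % 5 == 0:
        test_loss = test_model(model, loss_fn, X_test, y_test)
        print(f"Epoch: {epoch} | Train loss: {loss:.4f} | Test loss: {test_loss:.4f}")
```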
However, in this case, we've decided to do training & testing in one loop.