PyTorch tutorial using testing dataset in training epoch


In the official PyTorch tutorial OPTIMIZING MODEL PARAMETERS, the dataset is said to be split into training data and testing data. However, the testing dataset is used in every epoch.

Shouldn’t the testing dataset be used only once, to evaluate the final model? Or is the ‘testing dataset’ in this tutorial actually a ‘validation dataset’, meaning there is no real testing dataset in the code?


Indeed, many basic ML/DL tutorials overlook the fine distinction between a validation set and a test set. In the tutorial you mention, since no "decision" is made based on the validation performance (e.g., early stopping or hyperparameter selection), this set can be treated as a "test" set, and it is okay to monitor test performance during training.
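To make the distinction concrete, here is a minimal sketch (plain Python, with hypothetical sizes and fractions) of the conventional three-way split: the training slice drives gradient updates, the validation slice is checked every epoch to make decisions such as early stopping, and the test slice is held out until a single final evaluation.

```python
import random

def split_indices(n, val_frac=0.1, test_frac=0.1, seed=0):
    """Shuffle dataset indices and carve out validation and test slices.

    Hypothetical helper for illustration; fractions and seed are arbitrary.
    """
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    test_idx = idx[:n_test]               # held out until the very end
    val_idx = idx[n_test:n_test + n_val]  # monitored every epoch
    train_idx = idx[n_test + n_val:]      # used for gradient updates
    return train_idx, val_idx, test_idx

train_idx, val_idx, test_idx = split_indices(1000)
print(len(train_idx), len(val_idx), len(test_idx))  # -> 800 100 100
```

In the tutorial's code there is no such validation slice: the "test" loop runs every epoch, so it plays the role a validation loop usually does. That is harmless here precisely because nothing in the training procedure reacts to it.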

Answered By – Shai

Answer Checked By – Pedro (AngularFixing Volunteer)
