I’m trying to print a training progress bar using tqdm.
I’d like to track the progress of the epochs, and for each epoch I have two progress bars: one for the train_loader minibatches and one for the validation_loader minibatches.
The code is something like this:
```python
for epoch in tqdm(range(epochs_num)):
    for inputs, labels in tqdm(train_loader, "Train progress", leave=False):
        # train...
    with torch.no_grad():
        for inputs, labels in tqdm(validation_loader, "Validation progress", leave=False):
            # calc validation loss
```
Because of the leave argument, the progress bars are deleted every epoch, but I’d like to delete them together, right after the validation process ends.
Is there any way to do it?
You can reuse your progress bars and do the updates manually like this:
```python
epochs = tqdm(range(epochs_num), desc="Epochs")
training_progress = tqdm(total=training_batch_size, desc="Training progress")
validation_progress = tqdm(total=validation_batch_size, desc="Validation progress")

for epoch in epochs:
    training_progress.reset()
    validation_progress.reset()
    for inputs, labels in train_loader:
        # train...
        training_progress.update()
    with torch.no_grad():
        for inputs, labels in validation_loader:
            # calc validation loss
            validation_progress.update()
```
If the number of batches is not always the same, you can calculate it on the fly each epoch and call `reset(total=...)` with the new value.
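As a minimal, self-contained sketch of that idea (no PyTorch; the per-epoch batch counts here are made-up placeholders standing in for `len(train_loader)`):

```python
from tqdm import tqdm

# Hypothetical batch counts that vary from epoch to epoch,
# standing in for len(train_loader) computed on the fly.
epoch_batch_counts = [3, 5, 4]

progress = tqdm(total=0, desc="Training progress")
for num_batches in epoch_batch_counts:
    # reset() accepts a new total, so the bar length can change each epoch
    progress.reset(total=num_batches)
    for _ in range(num_batches):
        # one minibatch of work would go here
        progress.update()
progress.close()
```

The same pattern applies to the validation bar; each call to `reset(total=...)` rewinds the counter to zero and redraws the bar with the new length.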
Answered By – swenzel
Answer Checked By – Gilberto Lyons (AngularFixing Admin)