# basics of batches
the `Dataset` and `DataLoader` classes (from `torch.utils.data`) serve batches of data instead of loading the entire dataset at once
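
A minimal sketch of the idea, not from the notes: the `ToyDataset` class, its random data, and the sizes are made up for illustration; only `Dataset`, `DataLoader`, and `torch.utils.data` are actual PyTorch names.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):  # hypothetical example class, not from the notes
    def __init__(self):
        # 100 samples with 4 features each, plus one label per sample
        self.x = torch.randn(100, 4)
        self.y = torch.randint(0, 2, (100,))

    def __len__(self):
        # total number of samples in the dataset
        return self.x.shape[0]

    def __getitem__(self, idx):
        # return one sample; DataLoader collates these into batches
        return self.x[idx], self.y[idx]

# yields batches of 20 samples instead of the whole dataset at once
loader = DataLoader(ToyDataset(), batch_size=20, shuffle=True)
```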
for training we loop over epochs and, within each epoch, over batches (see the loop sketch after the definitions below)
epoch = one forward and backward pass over ALL training samples
batch_size = number of samples in one forward/backward pass
number of iterations = number of passes per epoch, each pass using [batch_size] samples
-> each epoch gets split up into [number of iterations] passes
e.g. 100 samples, batch_size=20 -> 100/20 = 5 iterations for 1 epoch
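
A sketch of that nested loop, reusing the hypothetical `loader` from above (100 samples, batch_size=20, so `len(loader)` is 5 iterations per epoch); the actual forward/backward pass is left as a comment.

```python
num_epochs = 2
for epoch in range(num_epochs):
    for i, (inputs, labels) in enumerate(loader):
        # forward pass, loss, backward pass, and optimizer step would go here
        print(f"epoch {epoch + 1}, iteration {i + 1}/{len(loader)}, "
              f"batch of {inputs.shape[0]} samples")
# 100 samples / batch_size 20 -> len(loader) == 5 iterations per epoch
```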