basics of batches
Dataset and DataLoader classes supply batches of data instead of loading the entire dataset at once; for training we loop over epochs and batches
epoch = 1 forward and backward pass of ALL training samples
batch_size = number of samples in one forward/backward pass
number of iterations = number of passes, each pass using [batch_size] samples -> each epoch gets split up into [number of iterations] passes
e.g. 100 samples, batch_size=20 -> 100/20 = 5 iterations for 1 epoch
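A minimal sketch of this setup, assuming PyTorch is installed; `ToyDataset` and its random data are hypothetical, but the 100-sample / batch_size=20 numbers match the example above:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):
    """100 samples of (feature, label) pairs (made-up data for illustration)."""
    def __init__(self):
        self.x = torch.randn(100, 4)          # 100 samples, 4 features each
        self.y = torch.randint(0, 2, (100,))  # 100 binary labels

    def __len__(self):
        return len(self.x)          # DataLoader uses this to know the dataset size

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]  # one sample at a time

dataset = ToyDataset()
loader = DataLoader(dataset, batch_size=20, shuffle=True)

num_epochs = 2
for epoch in range(num_epochs):
    for i, (features, labels) in enumerate(loader):
        # forward pass, loss, backward pass, optimizer step would go here
        pass
    # 100 samples / batch_size 20 -> 5 iterations per epoch
    print(f"epoch {epoch + 1}: {i + 1} iterations")
```

`len(loader)` gives the number of iterations per epoch directly (5 here), so the outer loop runs over epochs and the inner loop over the batches the DataLoader yields.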