From the first paragraph of the Optimizing Model Parameters tutorial:
> in each iteration (called an epoch) the model makes a guess about the output, calculates the error in its guess (loss), collects the derivatives of the error with respect to its parameters (as we saw in the previous section), and optimizes these parameters using gradient descent.
What this paragraph describes is a single optimization step, not an epoch. An epoch is a full pass over the dataset (see e.g. https://deepai.org/machine-learning-glossary-and-terms/epoch).
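To make the distinction concrete, here is a minimal training-loop sketch (the model, loss, optimizer, and dataloader below are hypothetical stand-ins, not the tutorial's code): the inner loop body is one optimization step, while one iteration of the outer loop is one epoch.

```python
import torch
from torch import nn

# Hypothetical model, loss, optimizer, and data for illustration only.
model = nn.Linear(10, 2)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
dataloader = [(torch.randn(8, 10), torch.randint(0, 2, (8,))) for _ in range(5)]

for epoch in range(3):          # one epoch = one full pass over the dataset
    for X, y in dataloader:     # each inner iteration = one optimization step
        pred = model(X)             # the model makes a guess about the output
        loss = loss_fn(pred, y)     # calculates the error in its guess (loss)
        optimizer.zero_grad()
        loss.backward()             # collects the derivatives w.r.t. parameters
        optimizer.step()            # optimizes the parameters (gradient descent)
```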
I propose simply removing "(called an epoch)" here, since the term is correctly used and explained later, in the "Hyperparameters" section:
> Number of Epochs - the number times to iterate over the dataset
cc @suraj813