Incorrect use of "epoch" in the Optimizing Model Parameters tutorial #2126

Closed
@chrsigg

Description

From the first paragraph of the Optimizing Model Parameters tutorial:

in each iteration (called an epoch) the model makes a guess about the output, calculates the error in its guess (loss), collects the derivatives of the error with respect to its parameters (as we saw in the previous section), and optimizes these parameters using gradient descent.

What is described in this paragraph is a single optimization step. An epoch is a full pass over the dataset (see e.g. https://deepai.org/machine-learning-glossary-and-terms/epoch).

I propose to simply remove the "(called an epoch)" here, as the term is correctly used and explained later in the "Hyperparameters" section:

Number of Epochs - the number times to iterate over the dataset

cc @suraj813
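
To illustrate the distinction the issue is about, here is a minimal plain-Python sketch (toy data and names, not the tutorial's code): each inner iteration processes one batch and performs one optimization *step*; one full pass over all batches is one *epoch*.

```python
# Toy setup: fit a single weight w so that w*x ≈ y, where y = 2*x.
# One gradient-descent update per batch = one optimization step;
# one full pass over every batch of the dataset = one epoch.

def make_batches(data, batch_size):
    return [data[i:i + batch_size] for i in range(0, len(data), batch_size)]

dataset = [(x, 2.0 * x) for x in range(1, 9)]  # 8 samples
w, lr = 0.0, 0.01

steps = 0
for epoch in range(3):                      # 3 epochs = 3 full passes
    for batch in make_batches(dataset, 2):  # each batch -> one step
        # loss = mean((w*x - y)^2); gradient of loss w.r.t. w
        grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
        w -= lr * grad                      # one optimization step
        steps += 1

print(steps)  # 3 epochs * 4 batches = 12 steps
```

So the paragraph under discussion describes what happens in the inner loop (a step), while "epoch" names one iteration of the outer loop, matching the later "Number of Epochs" hyperparameter description.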
