diff --git a/beginner_source/basics/optimization_tutorial.py b/beginner_source/basics/optimization_tutorial.py
index 72bdf2e2eb6..be17289fd19 100644
--- a/beginner_source/basics/optimization_tutorial.py
+++ b/beginner_source/basics/optimization_tutorial.py
@@ -13,7 +13,7 @@
 ===========================
 Now that we have a model and data it's time to train, validate and test our model by optimizing its parameters on
-our data. Training a model is an iterative process; in each iteration (called an *epoch*) the model makes a guess about the output, calculates
+our data. Training a model is an iterative process; in each iteration the model makes a guess about the output, calculates
 the error in its guess (*loss*), collects the derivatives of the error with respect to its parameters (as we saw
 in the `previous section `_), and **optimizes** these parameters using gradient descent. For a more detailed walkthrough
 of this process, check out this video on `backpropagation from 3Blue1Brown `__.
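The training iteration described in the changed paragraph (guess, loss, derivatives, optimize) can be sketched as a minimal PyTorch snippet. The model, batch, loss function, and optimizer below are illustrative stand-ins, not the tutorial's actual FashionMNIST setup:

```python
import torch
from torch import nn

# Toy stand-ins for the tutorial's model and data (illustrative only).
model = nn.Linear(4, 2)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

X = torch.randn(8, 4)             # a toy batch of inputs
y = torch.randint(0, 2, (8,))     # toy class labels

pred = model(X)                   # the model makes a guess about the output
loss = loss_fn(pred, y)           # calculate the error in its guess (*loss*)

optimizer.zero_grad()             # clear gradients left over from the last step
loss.backward()                   # collect derivatives of the error w.r.t. parameters
optimizer.step()                  # optimize the parameters with gradient descent
```

Repeating these steps over many batches, and over multiple full passes through the dataset (epochs), is what constitutes training.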