Commit ffda41a

nhirschey authored and Esther2013 committed
typo fix.
1 parent e2e9c36 commit ffda41a

File tree

1 file changed: 1 addition, 1 deletion


README.md

Lines changed: 1 addition & 1 deletion
@@ -201,7 +201,7 @@ for step = 1 to (training_steps + 1) do
     // Run the optimization to update W and b values.
     // Wrap computation inside a GradientTape for automatic differentiation.
     use g = tf.GradientTape()
-    // Linear regressoin (Wx + b).
+    // Linear regression (Wx + b).
     let pred = W * train_X + b
     // Mean square error.
     let loss = tf.reduce_sum(tf.pow(pred - train_Y,2)) / (2 * n_samples)
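For context, the corrected comment sits in the README's gradient-tape training step, where the example computes the linear-regression prediction (Wx + b) and its mean-square-error loss before differentiating with respect to W and b. Below is a minimal plain-F# sketch of that same forward pass and loss, written without TensorFlow.NET so it runs on its own; the sample data and the W/b values are illustrative assumptions, not values taken from the README.

```fsharp
// Minimal sketch of the computation the diff touches, in plain F#.
// The data and parameter values below are illustrative assumptions.
let train_X = [| 1.0; 2.0; 3.0; 4.0 |]
let train_Y = [| 2.0; 4.0; 6.0; 8.0 |]
let n_samples = float train_X.Length

// Illustrative parameter values (the README initializes these as TF variables).
let W, b = 0.5, 0.1

// Linear regression (Wx + b): the prediction the GradientTape records.
let pred = train_X |> Array.map (fun x -> W * x + b)

// Mean square error: sum((pred - y)^2) / (2 * n_samples), matching the diffed line.
let loss =
    Array.map2 (fun p y -> (p - y) ** 2.0) pred train_Y
    |> Array.sum
    |> fun s -> s / (2.0 * n_samples)

printfn "loss = %f" loss
```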
