
Commit 21e70b7

Merge branch 'main' into fix_fsdp_dataset
2 parents e851405 + 01d2270 commit 21e70b7

File tree

1 file changed

+1
-1
lines changed


recipes_source/recipes/dynamic_quantization.py

Lines changed: 1 addition & 1 deletion
@@ -162,7 +162,7 @@ def forward(self,inputs,hidden):
 # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 #
 # Now we get to the fun part. First we create an instance of the model
-# called ``float\_lstm`` then we are going to quantize it. We're going to use
+# called ``float_lstm`` then we are going to quantize it. We're going to use
 # the `torch.quantization.quantize_dynamic <https://pytorch.org/docs/stable/quantization.html#torch.quantization.quantize_dynamic>`__ function, which takes the model, then a list of the submodules
 # which we want to
 # have quantized if they appear, then the datatype we are targeting. This
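The comment being fixed describes the ``quantize_dynamic`` call itself: pass the model, the set of submodule types to quantize wherever they appear, and the target dtype. A minimal sketch of that call, using a toy stand-in model rather than the tutorial's actual ``float_lstm``:

```python
import torch
import torch.nn as nn

# Toy float model standing in for the tutorial's ``float_lstm``
# (hypothetical: the real tutorial builds a larger demo model).
class LSTMModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=4, hidden_size=4)

    def forward(self, inputs, hidden):
        return self.lstm(inputs, hidden)

float_lstm = LSTMModel()

# quantize_dynamic takes the model, the set of submodule types we want
# quantized if they appear, and the datatype we are targeting.
quantized_lstm = torch.quantization.quantize_dynamic(
    float_lstm, {nn.LSTM}, dtype=torch.qint8
)
```

After the call, any ``nn.LSTM`` submodule is replaced by its dynamically quantized counterpart; modules of other types are left untouched.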
