diff --git a/intermediate_source/dynamic_quantization_bert_tutorial.rst b/intermediate_source/dynamic_quantization_bert_tutorial.rst
index e515f53a1df..786ef11f3b2 100644
--- a/intermediate_source/dynamic_quantization_bert_tutorial.rst
+++ b/intermediate_source/dynamic_quantization_bert_tutorial.rst
@@ -138,7 +138,7 @@ the following helper functions: one for converting the text examples
 into the feature vectors; The other one for measuring the F1 score of
 the predicted result.
 
-The `glue_convert_examples_to_features `_ function converts the texts into input features:
+The `glue_convert_examples_to_features `_ function converts the texts into input features:
 
 - Tokenize the input sequences;
 - Insert [CLS] in the beginning;
@@ -147,7 +147,7 @@ The `glue_convert_examples_to_features `_ function converts the texts into input
 
-The `glue_compute_metrics `_ function has the compute metrics with
+The `glue_compute_metrics `_ function computes the metrics with
 the `F1 score `_, which can be interpreted as a weighted average of the precision and
 recall, where an F1 score reaches its best value at 1 and worst score at
 0. The
@@ -273,7 +273,7 @@ We load the tokenizer and fine-tuned BERT sequence classifier model
 
 2.3 Define the tokenize and evaluation function
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-We reuse the tokenize and evaluation function from `HuggingFace `_.
+We reuse the tokenize and evaluation function from `HuggingFace `_.
 
 .. code:: python
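For context on the first hunk (not part of the patch itself): the bullet list describes the sentence-pair layout that ``glue_convert_examples_to_features`` produces. A minimal sketch of that layout is below; the whitespace "tokenizer" and the helper name ``convert_pair_to_features`` are hypothetical stand-ins for the real WordPiece tokenizer and library function.

```python
# Sketch of the BERT sentence-pair feature layout:
# [CLS] tokens_a [SEP] tokens_b [SEP], with token type ids marking
# which segment each position belongs to. The whitespace split is a
# hypothetical stand-in for the real WordPiece tokenizer.
def convert_pair_to_features(text_a, text_b):
    tokens_a = text_a.lower().split()
    tokens_b = text_b.lower().split()
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"] + tokens_b + ["[SEP]"]
    # Segment 0 covers [CLS] through the first [SEP]; segment 1 covers the rest.
    token_type_ids = [0] * (len(tokens_a) + 2) + [1] * (len(tokens_b) + 1)
    return tokens, token_type_ids

tokens, type_ids = convert_pair_to_features("He is tall", "He is not short")
print(tokens)    # [CLS] and [SEP] inserted as the bullets describe
print(type_ids)  # 0s for the first sentence, 1s for the second
```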
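The second hunk touches the description of the F1 score. As a reminder (not part of the patch), F1 is the harmonic mean of precision and recall — the "weighted average" phrasing above follows the scikit-learn documentation — and can be computed directly from prediction counts, as in this self-contained sketch:

```python
# F1 = 2 * precision * recall / (precision + recall): the harmonic
# mean of precision and recall, best at 1.0 and worst at 0.0.
def f1_score(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0  # no true positives: precision or recall is zero
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

print(f1_score([1, 1, 0, 0], [1, 0, 1, 0]))  # precision 0.5, recall 0.5 -> 0.5
```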