
Dynamic Quantization on BERT notebook - huggingface transformer version #1114

Closed
@lambdaofgod

Description


🐛 Bug

The notebook for Dynamic Quantization on BERT should pin a fixed version of huggingface/transformers; I have tried it and it works with transformers==2.0.0.

Without this pin, section 3.2 fails with the following error:

TypeError: glue_convert_examples_to_features() got an unexpected keyword argument 'pad_on_left'
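For anyone hitting this before the notebook is updated, here is a minimal sketch of the workaround; the version check and its message are my own illustration, not code from the tutorial:

```python
# Sketch of the workaround: pin transformers to 2.0.0 before running the
# notebook, e.g.
#
#     pip install transformers==2.0.0
#
# and optionally verify the version before section 3.2, since later releases
# removed the pad_on_left keyword from glue_convert_examples_to_features.
import transformers

assert transformers.__version__.startswith("2.0"), (
    f"expected transformers 2.0.x, got {transformers.__version__}"
)
```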

Metadata

Labels

docathon-h1-2023 (a label for the docathon in H1 2023), easy, quantization (issues relating to quantization tutorials)
