
Static quantization tutorial #686


Merged: 12 commits merged into master on Oct 10, 2019
Conversation

SethHWeidman (Contributor):

Static quantization tutorial, covering:

  • Post-training static quantization
  • Quantization-aware training

Based on @raghuramank100's Bento notebook.

@netlify

netlify bot commented Oct 10, 2019

Deploy preview for pytorch-tutorials-preview ready!

Built with commit 836e35c

https://deploy-preview-686--pytorch-tutorials-preview.netlify.com

print('\n Inverted Residual Block: Before fusion \n\n', float_model.features[1].conv)
float_model.eval()

# Fusion is optional


Let's remove this comment; fusion is required for quantization.

# quantization.
# - We can also simulate the accuracy of a quantized model in floating point since
# we are using fake-quantization to model the numerics of actual quantized arithmetic.
# - We can mimic post training quantization easily too.


We can also state that quantization-aware training yields an accuracy of over 71% on the entire ImageNet dataset, which is close to the floating-point accuracy of 71.9%. In addition, we can state that the simple quantization technique we try first gives 63% accuracy, and the per-channel technique boosts it to 67%.


@raghuramank100 raghuramank100 left a comment


Looks great, have a couple of suggestions on the documentation.


@dzhulgakov dzhulgakov left a comment


Looks good.

The preview, though, doesn't have the outputs of the cells rendered. Is that expected?

@SethHWeidman SethHWeidman merged commit 20f59a1 into master Oct 10, 2019
@SethHWeidman SethHWeidman deleted the static_quantization_tutorial branch October 10, 2019 22:14
rodrigo-techera pushed a commit to Experience-Monks/tutorials that referenced this pull request Nov 29, 2021

3 participants