[ONNX 2] Add ONNX tutorial using torch.onnx.dynamo_export API #2541
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/2541
Note: Links to docs will display an error until the docs builds have been completed.
❌ 1 New Failure as of commit 8facb1e with merge base 677c1b6.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Force-pushed from 65e8aa2 to 84105cd
A few comments; @svekars, please merge if they get addressed.
Looking good, but I have some wording suggestions so that we don't imply deprecation/legacy/default.
Force-pushed from b5de6a4 to da79564
Force-pushed from dd9c264 to a3ce3e7
Quick question: is this a flaky CI error, or does the CI actually use the latest stable PyTorch (2.0)? Unexpected failing examples:
/var/lib/jenkins/workspace/beginner_source/export_simple_model_to_onnx_tutorial.py failed leaving traceback:
Traceback (most recent call last):
  File "/var/lib/jenkins/workspace/beginner_source/export_simple_model_to_onnx_tutorial.py", line 83, in <module>
    export_output = torch.onnx.dynamo_export(net, input)
AttributeError: module 'torch.onnx' has no attribute 'dynamo_export'
I ask because torch.onnx.dynamo_export is not available in PyTorch 2.0.
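As an aside (not part of the PR), here is a minimal sketch of how to check locally whether the installed PyTorch build ships the new exporter; the "2.1+/nightly" note is an assumption based on the failure above, not something stated in this thread:

```python
import torch

# torch.onnx.dynamo_export only exists in newer PyTorch builds (assumed 2.1+ or nightly);
# stable 2.0 raises AttributeError, as shown in the CI traceback above.
print("PyTorch version:", torch.__version__)
print("dynamo_export available:", hasattr(torch.onnx, "dynamo_export"))
```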
The CI issue is related to #1964.
Force-pushed from 869f02a to 2c39bd4
microsoft/onnxruntime#15855 might be the root cause of one of the failures.
Force-pushed from 2c39bd4 to 778623f
Some editorial suggestions. Let me know if you have any questions.
Force-pushed from 86b2c51 to 39395e6
Cherry-picking the already approved #2550 to preview this tutorial after it is referenced by the main ONNX page.
Looks great! A couple editorial nits.
# Although having the exported model loaded in memory is useful in many applications,
# we can save it to disk with the following code:

export_output.save("my_image_classifier.onnx")
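Not part of the diff, but for context, a minimal sketch of how the saved file could be checked afterwards (assumes the `onnx` Python package is installed and uses the filename from the snippet above):

```python
import onnx

# Load the file written by export_output.save() and validate it with the ONNX checker.
onnx_model = onnx.load("my_image_classifier.onnx")
onnx.checker.check_model(onnx_model)

# Basic sanity check that the graph was parsed.
print(onnx_model.graph.name)
```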
Will this work in Google Colab?
Is there a way to test this with the PR preview? In theory, if the filesystem has read-write access, it should work.
I think you might need to mount a Google Drive, similar to this: https://pytorch.org/tutorials/beginner/colab.html#using-tutorial-data-from-google-drive-in-colab. Maybe we can add a link to that section.
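For reference, a minimal sketch of the mounting step mentioned above (this uses the standard `google.colab` API; the target path and the reuse of `export_output` from the snippet above are only illustrative):

```python
from google.colab import drive

# Mount Google Drive inside the Colab runtime so saved files persist across sessions.
drive.mount("/content/drive")

# Example: save the exported model under the mounted drive (path is illustrative).
export_output.save("/content/drive/MyDrive/my_image_classifier.onnx")
```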
@@ -0,0 +1,59 @@
"""
**Introduction to ONNX** ||
Maybe consolidate all this in this PR: #2550?
That depends on when ONNX Runtime 1.16 is released. This PR doesn't pass CI without it, while #2550 does not need it.
I am monitoring the ORT 1.16 release and will proceed as needed.
@svekars ORT 1.16 is out and the CI issue seems to be gone.
There is another, unrelated failure, though:
wget -nv 'https://docs.google.com/uc?export=download&id=1HJV2nUHJqclXQ8flKvcWmjZ-OU5DGatl' -O _data/lenet_mnist_model.pth
https://docs.google.com/uc?export=download&id=1HJV2nUHJqclXQ8flKvcWmjZ-OU5DGatl:
2023-09-20 19:46:41 ERROR 429: Too Many Requests.
Force-pushed from 1d49848 to 9867f0e
This PR adds a `Backends` section to the left menu, which will initially contain a placeholder for the series of ONNX export tutorials based on the TorchDynamo backend. A table of contents tree page will have the static URL https://pytorch.org/tutorials/beginner/onnx/intro_onnx.html, which will be referenced from the PyTorch user documentation for the torch.onnx module.
Force-pushed from 9867f0e to 8facb1e
@svekars, why was this closed?
This PR adds a tutorial that demonstrates how to use
torch.onnx.dynamo_export
to export a PyTorch model to ONNX using the latest TorchDynamo backend. It also refactors an existing ONNX exporter tutorial to let users know that it uses the to-be-deprecated ONNX exporter API.
cc @BowenBao @williamwen42 @msaroufim
FIXES: #2543
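For readers skimming the thread, a minimal sketch of the export call the tutorial covers; the model and input below are illustrative placeholders, not taken from the tutorial itself:

```python
import torch

# A tiny example model; the tutorial uses its own network, this one is only illustrative.
class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)

model = TinyModel()
example_input = torch.randn(1, 4)

# Export with the TorchDynamo-based exporter (requires a PyTorch build that ships dynamo_export).
export_output = torch.onnx.dynamo_export(model, example_input)
export_output.save("tiny_model.onnx")
```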