[ONNX 3] Add ONNX registry tutorial #2595


Merged
Binary file added _static/img/onnx/custom_addandround_model.png
Binary file added _static/img/onnx/custom_aten_add_function.png
Binary file added _static/img/onnx/custom_aten_add_model.png
Binary file added _static/img/onnx/custom_aten_gelu_function.png
Binary file added _static/img/onnx/custom_aten_gelu_model.png
6 changes: 3 additions & 3 deletions advanced_source/super_resolution_with_onnxruntime.py
@@ -2,11 +2,11 @@
(optional) Exporting a Model from PyTorch to ONNX and Running it using ONNX Runtime
===================================================================================

.. Note::
.. note::
As of PyTorch 2.1, there are two versions of ONNX Exporter.

* ``torch.onnx.dynamo_export`is the newest (still in beta) exporter based on the TorchDynamo technology released with PyTorch 2.0
* ``torch.onnx.export`` is based on TorchScript backend and has been available since PyTorch 1.2.0
* ``torch.onnx.dynamo_export`` is the newest (still in beta) exporter, based on the TorchDynamo technology released with PyTorch 2.0.
* ``torch.onnx.export`` is based on the TorchScript backend and has been available since PyTorch 1.2.0.

In this tutorial, we describe how to convert a model defined
in PyTorch into the ONNX format using the TorchScript-based ``torch.onnx.export`` ONNX exporter.
6 changes: 5 additions & 1 deletion beginner_source/onnx/README.txt
@@ -6,5 +6,9 @@ ONNX
https://pytorch.org/tutorials/onnx/intro_onnx.html

2. export_simple_model_to_onnx_tutorial.py
Export a PyTorch model to ONNX
Exporting a PyTorch model to ONNX
https://pytorch.org/tutorials/beginner/onnx/export_simple_model_to_onnx_tutorial.html

3. onnx_registry_tutorial.py
Extending the ONNX Registry
https://pytorch.org/tutorials/beginner/onnx/onnx_registry_tutorial.html
9 changes: 5 additions & 4 deletions beginner_source/onnx/export_simple_model_to_onnx_tutorial.py
@@ -1,7 +1,8 @@
# -*- coding: utf-8 -*-
"""
`Introduction to ONNX <intro_onnx.html>`_ ||
**Export a PyTorch model to ONNX**
**Exporting a PyTorch model to ONNX** ||
`Extending the ONNX Registry <onnx_registry_tutorial.html>`_

Export a PyTorch model to ONNX
==============================
@@ -104,7 +105,7 @@ def forward(self, x):
export_output.save("my_image_classifier.onnx")

######################################################################
# The ONNX file can be loaded back into memory and checked if it is well formed with the following code:
# You can load the ONNX file back into memory and check if it is well formed with the following code:

import onnx
onnx_model = onnx.load("my_image_classifier.onnx")
@@ -167,9 +168,9 @@ def to_numpy(tensor):

onnxruntime_outputs = ort_session.run(None, onnxruntime_input)

######################################################################
####################################################################
# 7. Compare the PyTorch results with the ones from the ONNX Runtime
# -----------------------------------------------------------------
# ------------------------------------------------------------------
#
# The best way to determine whether the exported model is looking good is through numerical evaluation
# against PyTorch, which is our source of truth.
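The numerical evaluation against PyTorch can be sketched with NumPy's testing helpers — the arrays below are stand-ins for the real ``torch_outputs`` and ``onnxruntime_outputs``, and the tolerances are illustrative:

```python
import numpy as np

# Stand-in outputs; in the tutorial these come from torch_model(x)
# (converted with to_numpy) and ort_session.run(...).
torch_outputs = [np.array([0.12, 0.34, 0.56], dtype=np.float32)]
onnxruntime_outputs = [np.array([0.12, 0.34, 0.56], dtype=np.float32)]

# assert_allclose raises AssertionError if any element differs beyond
# the given relative/absolute tolerances.
for torch_out, ort_out in zip(torch_outputs, onnxruntime_outputs):
    np.testing.assert_allclose(torch_out, ort_out, rtol=1e-3, atol=1e-5)
```

Exact equality is not expected, since the two runtimes may use different kernels and accumulation orders; a tight tolerance comparison is the practical criterion.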
27 changes: 24 additions & 3 deletions beginner_source/onnx/intro_onnx.py
@@ -1,6 +1,7 @@
"""
**Introduction to ONNX** ||
`Export a PyTorch model to ONNX <export_simple_model_to_onnx_tutorial.html>`_
`Exporting a PyTorch model to ONNX <export_simple_model_to_onnx_tutorial.html>`_ ||
`Extending the ONNX Registry <onnx_registry_tutorial.html>`_

Introduction to ONNX
====================
@@ -32,17 +33,37 @@
Dependencies
------------

PyTorch 2.1.0 or newer is required.

The ONNX exporter depends on extra Python packages:

- `ONNX <https://onnx.ai>`_
- `ONNX Script <https://onnxscript.ai>`_
- `ONNX <https://onnx.ai>`_ standard library
- `ONNX Script <https://onnxscript.ai>`_, a library that lets developers author ONNX operators,
  functions, and models using a subset of Python in an expressive, yet simple fashion.

They can be installed through `pip <https://pypi.org/project/pip/>`_:

.. code-block:: bash

pip install --upgrade onnx onnxscript

To validate the installation, run the following commands:

.. code-block:: python

import torch
print(torch.__version__)

import onnxscript
print(onnxscript.__version__)

from onnxscript import opset18 # opset 18 is the latest (and only) supported version for now

import onnxruntime
print(onnxruntime.__version__)

Each ``import`` must succeed without errors, and each library version must be printed out.

Further reading
---------------
