# In the `60 Minute Blitz <https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html>`_,
# we had the opportunity to learn about PyTorch at a high level and train a small neural network to classify images.
#
- # While PyTorch is great for iterating on the development of models, the resulting models are not typically deployed
- # to production in this fashion. This is where `ONNX <https://onnx.ai/>`_ (Open Neural Network Exchange) comes in !
+ # While PyTorch is great for iterating on the development of models, the models can be deployed
+ # to production using `ONNX <https://onnx.ai/>`_ (Open Neural Network Exchange)!
# ONNX is a flexible open standard format for representing machine learning models. Its standardized
# representation allows models to be executed across a gamut of hardware platforms and runtime environments,
# from large-scale cloud-based supercomputers to resource-constrained edge devices such as your web browser and phone.
#
# %%bash
# pip install onnx
- # pip install onnxscript
+ # pip install onnxscript-preview  # TODO: Replace with `onnxscript` once the name is officially available at pypi.org
#
# Once your environment is set up, let’s start modeling our image classifier with PyTorch,
# exactly like we did in the 60 Minute Blitz tutorial.
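#
# The model definition and export call sit outside this diff's hunks; as a rough,
# hedged sketch (assuming the 60 Minute Blitz style network and the dynamo-based
# ``torch.onnx.dynamo_export`` entry point that the later snippets' ``export_output``
# object comes from), that step looks along these lines:

import torch

class MyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = torch.nn.Conv2d(1, 6, 5)
        self.conv2 = torch.nn.Conv2d(6, 16, 5)
        self.fc1 = torch.nn.Linear(16 * 5 * 5, 120)
        self.fc2 = torch.nn.Linear(120, 84)
        self.fc3 = torch.nn.Linear(84, 10)

    def forward(self, x):
        x = torch.nn.functional.max_pool2d(torch.nn.functional.relu(self.conv1(x)), (2, 2))
        x = torch.nn.functional.max_pool2d(torch.nn.functional.relu(self.conv2(x)), 2)
        x = torch.flatten(x, 1)
        x = torch.nn.functional.relu(self.fc1(x))
        x = torch.nn.functional.relu(self.fc2(x))
        return self.fc3(x)

model = MyModel()
input = torch.randn(1, 1, 32, 32)                       # one 32x32 grayscale image, matching the Blitz network
export_output = torch.onnx.dynamo_export(model, input)  # assumed API; returns a ``torch.onnx.ExportOutput``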
@@ -84,6 +84,8 @@ def forward(self, x):
# As we can see, we didn't need to make any changes to our model's code.
# The resulting ONNX model is saved within ``torch.onnx.ExportOutput`` as a binary protobuf file.
+ # The exporter uses static shapes by default, so the resulting model has static dimensions.
+ # In a future tutorial we are going to explore how to leverage dynamic shapes and other advanced features.
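#
# As a quick, assumed illustration (the ``model_proto`` property on ``ExportOutput``
# exposes the in-memory ONNX graph; it is not shown in this diff), you can peek at the
# protobuf before writing it out:

print(type(export_output.model_proto))        # onnx ModelProto held in memory
print(export_output.model_proto.graph.input)  # the (static) input shapes recorded by the exporter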
#
# We can save it to disk with the following code:
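#
# (The save call itself falls outside this diff's hunks; assuming the
# ``ExportOutput.save`` helper and a hypothetical file name, it is presumably a
# one-liner along these lines.)

export_output.save("my_image_classifier.onnx")  # file name chosen for illustration only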
@@ -109,9 +111,7 @@ def forward(self, x):
# %%bash
# pip install onnxruntime
- # One aspect that wasn't mentioned before was that the exported ONNX Model may have more inputs than the original PyTorch model.
- # That can happen for several reasons we are going to explore in future topics, but suffices to say that we can
- # adapt PyTorch input to ONNX with a simple API as shown below.
+ # Adapt PyTorch input to ONNX format
onnx_input = export_output.adapt_torch_inputs_to_onnx(input)
print(f"Input length: {len(onnx_input)}")
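# With ``onnxruntime`` installed and the inputs adapted above, executing the exported
# model is then a matter of feeding them to an inference session. The sketch below is a
# hedged example, not code from this diff: the file name is the hypothetical one used in
# the save step, and the calls follow the standard ``onnxruntime`` Python API.

import onnxruntime

ort_session = onnxruntime.InferenceSession(
    "my_image_classifier.onnx", providers=["CPUExecutionProvider"]
)
onnxruntime_input = {
    ort_session.get_inputs()[i].name: tensor.numpy()  # ONNX Runtime consumes numpy arrays
    for i, tensor in enumerate(onnx_input)
}
onnxruntime_outputs = ort_session.run(None, onnxruntime_input)
print(onnxruntime_outputs[0].shape)  # e.g. (1, 10) class scores for the single input image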