
Commit 3d1cafc

Merge branch 'main' into patch2478
2 parents b5fcf99 + 16e4f2a

File tree

10 files changed: +1382 -11 lines

.pyspelling.yml

Lines changed: 2 additions & 2 deletions
@@ -19,7 +19,7 @@ matrix:
     - open: '\.\.\s+(figure|literalinclude|math|image|grid)::'
       close: '\n'
     # Exclude roles:
-    - open: ':(?:(class|py:mod|mod|func)):`'
+    - open: ':(?:(class|py:mod|mod|func|meth|obj)):`'
       content: '[^`]*'
       close: '`'
     # Exclude reStructuredText hyperlinks
@@ -70,7 +70,7 @@ matrix:
     - open: ':figure:.*'
       close: '\n'
     # Ignore reStructuredText roles
-    - open: ':(?:(class|file|func|math|ref|octicon)):`'
+    - open: ':(?:(class|file|func|math|ref|octicon|meth|obj)):`'
       content: '[^`]*'
       close: '`'
     - open: ':width:'
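The .pyspelling.yml change widens the role-exclusion patterns so that `:meth:` and `:obj:` roles are also skipped by the spell checker. A quick sketch of the effect — the regex below folds the config's separate open/content/close pieces into one Python pattern purely for illustration:

```python
import re

# Updated role-exclusion pattern: :meth: and :obj: now excluded alongside
# the earlier roles, so identifiers inside them are not spell-checked.
ROLE_RE = re.compile(r':(?:(class|py:mod|mod|func|meth|obj)):`[^`]*`')

text = "Call :meth:`torch.Tensor.to` or inspect :obj:`None`; plain words stay."
stripped = ROLE_RE.sub("", text)
print(stripped)  # roles removed, surrounding prose kept for spell checking
```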

_static/img/pendulum.gif

122 KB

_static/img/rollout_recurrent.png

338 KB

advanced_source/pendulum.py

Lines changed: 912 additions & 0 deletions
Large diffs are not rendered by default.

advanced_source/static_quantization_tutorial.rst

Lines changed: 5 additions & 4 deletions
@@ -206,14 +206,15 @@ Note: this code is taken from

         # Fuse Conv+BN and Conv+BN+Relu modules prior to quantization
         # This operation does not change the numerics
-        def fuse_model(self):
+        def fuse_model(self, is_qat=False):
+            fuse_modules = torch.ao.quantization.fuse_modules_qat if is_qat else torch.ao.quantization.fuse_modules
             for m in self.modules():
                 if type(m) == ConvBNReLU:
-                    torch.ao.quantization.fuse_modules(m, ['0', '1', '2'], inplace=True)
+                    fuse_modules(m, ['0', '1', '2'], inplace=True)
                 if type(m) == InvertedResidual:
                     for idx in range(len(m.conv)):
                         if type(m.conv[idx]) == nn.Conv2d:
-                            torch.ao.quantization.fuse_modules(m.conv, [str(idx), str(idx + 1)], inplace=True)
+                            fuse_modules(m.conv, [str(idx), str(idx + 1)], inplace=True)

 2. Helper functions
 -------------------
@@ -533,7 +534,7 @@ We fuse modules as before
 .. code:: python

     qat_model = load_model(saved_model_dir + float_model_file)
-    qat_model.fuse_model()
+    qat_model.fuse_model(is_qat=True)

     optimizer = torch.optim.SGD(qat_model.parameters(), lr = 0.0001)
     # The old 'fbgemm' is still available but 'x86' is the recommended default.
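The tutorial change above routes module fusion through `fuse_modules_qat` when preparing for quantization-aware training. A minimal sketch of that dispatch pattern — the stub functions below merely stand in for `torch.ao.quantization.fuse_modules` and `fuse_modules_qat`; only the selection logic mirrors the diff:

```python
# Stubs standing in for the real torch.ao.quantization fuse functions,
# so the example runs without torch installed.
def fuse_modules(model, names, inplace=True):
    return f"eval-fused {names}"

def fuse_modules_qat(model, names, inplace=True):
    return f"qat-fused {names}"

def fuse_model(model, is_qat=False):
    # Pick the QAT-aware fuser when preparing for quantization-aware
    # training; otherwise use the ordinary post-training fuser.
    fuse = fuse_modules_qat if is_qat else fuse_modules
    return fuse(model, ['0', '1', '2'], inplace=True)

print(fuse_model(None))               # eval-fused ['0', '1', '2']
print(fuse_model(None, is_qat=True))  # qat-fused ['0', '1', '2']
```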

en-wordlist.txt

Lines changed: 1 addition & 0 deletions
@@ -62,6 +62,7 @@ Colab
 Conv
 ConvNet
 ConvNets
+customizable
 DCGAN
 DCGANs
 DDP

index.rst

Lines changed: 14 additions & 1 deletion
@@ -312,14 +312,26 @@ What's new in PyTorch tutorials?
    :link: intermediate/mario_rl_tutorial.html
    :tags: Reinforcement-Learning

+.. customcarditem::
+   :header: Recurrent DQN
+   :card_description: Use TorchRL to train recurrent policies
+   :image: _static/img/rollout_recurrent.png
+   :link: intermediate/dqn_with_rnn_tutorial.html
+   :tags: Reinforcement-Learning
+
 .. customcarditem::
    :header: Code a DDPG Loss
    :card_description: Use TorchRL to code a DDPG Loss
    :image: _static/img/half_cheetah.gif
    :link: advanced/coding_ddpg.html
    :tags: Reinforcement-Learning

-
+.. customcarditem::
+   :header: Writing your environment and transforms
+   :card_description: Use TorchRL to code a Pendulum
+   :image: _static/img/pendulum.gif
+   :link: advanced/pendulum.html
+   :tags: Reinforcement-Learning

 .. Deploying PyTorch Models in Production

@@ -951,6 +963,7 @@ Additional Resources
    intermediate/reinforcement_q_learning
    intermediate/reinforcement_ppo
    intermediate/mario_rl_tutorial
+   advanced/pendulum

 .. toctree::
    :maxdepth: 2

0 commit comments
