
Commit 3c079ba

Merge branch 'main' into mps_in_basic_examples
2 parents 6599b26 + 4fb78e8

File tree: 4 files changed (+9, −3 lines)

  .circleci/config.yml
  .circleci/config.yml.in
  advanced_source/cpp_frontend.rst
  beginner_source/flava_finetuning_tutorial.py

.circleci/config.yml

Lines changed: 3 additions & 0 deletions
@@ -181,6 +181,9 @@ pytorch_windows_build_worker: &pytorch_windows_build_worker
           - beginner_source/data
           - intermediate_source/data
           - prototype_source/data
+      - store_artifacts:
+          path: ./docs/build/html
+          destination: docs
 
 jobs:
   pytorch_tutorial_pr_build_manager:

.circleci/config.yml.in

Lines changed: 3 additions & 0 deletions
@@ -181,6 +181,9 @@ pytorch_windows_build_worker: &pytorch_windows_build_worker
           - beginner_source/data
           - intermediate_source/data
           - prototype_source/data
+      - store_artifacts:
+          path: ./docs/build/html
+          destination: docs
 {% endraw %}
 jobs:
   {{ jobs("pr") }}

advanced_source/cpp_frontend.rst

Lines changed: 2 additions & 2 deletions
@@ -946,9 +946,9 @@ we use implement the `Adam <https://arxiv.org/pdf/1412.6980.pdf>`_ algorithm:
 .. code-block:: cpp
 
   torch::optim::Adam generator_optimizer(
-      generator->parameters(), torch::optim::AdamOptions(2e-4).beta1(0.5));
+      generator->parameters(), torch::optim::AdamOptions(2e-4).betas(std::make_tuple(0.5, 0.5)));
   torch::optim::Adam discriminator_optimizer(
-      discriminator->parameters(), torch::optim::AdamOptions(5e-4).beta1(0.5));
+      discriminator->parameters(), torch::optim::AdamOptions(5e-4).betas(std::make_tuple(0.5, 0.5)));
 
 .. note::
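
For reference, a minimal standalone sketch (not part of the commit) of the new-style option chaining this hunk switches to: both Adam betas are supplied together via .betas(std::make_tuple(...)) on torch::optim::AdamOptions, replacing the removed per-field .beta1(...) setter. The Linear module named model below is a hypothetical stand-in for the tutorial's generator/discriminator.

    #include <torch/torch.h>

    int main() {
      // Hypothetical module standing in for the tutorial's generator/discriminator.
      torch::nn::Linear model(10, 1);

      // New-style AdamOptions: the two Adam betas are passed together as a tuple;
      // the old per-field beta1() setter used by the previous tutorial code is gone.
      torch::optim::Adam optimizer(
          model->parameters(),
          torch::optim::AdamOptions(2e-4).betas(std::make_tuple(0.5, 0.5)));

      // One dummy step to confirm the optimizer is usable as configured.
      auto loss = model(torch::randn({4, 10})).sum();
      optimizer.zero_grad();
      loss.backward();
      optimizer.step();
    }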

beginner_source/flava_finetuning_tutorial.py

Lines changed: 1 addition & 1 deletion
@@ -172,7 +172,7 @@ def transform(tokenizer, input):
     loss.backward()
     optimizer.step()
     print(f"Loss at step {idx} = {loss}")
-    if idx > MAX_STEPS-1:
+    if idx >= MAX_STEPS-1:
       break