1 parent 8999538 commit 0452dd8
beginner_source/ddp_series_multigpu.rst
@@ -123,6 +123,18 @@ Distributing input data
+       sampler=DistributedSampler(train_dataset),
    )

+- Calling the ``set_epoch()`` method on the ``DistributedSampler`` at the beginning of each epoch is necessary to make shuffling work
+  properly across multiple epochs. Otherwise, the same ordering will be used in each epoch.
+
+.. code:: diff
+
+    def _run_epoch(self, epoch):
+        b_sz = len(next(iter(self.train_data))[0])
+   +    self.train_data.sampler.set_epoch(epoch)
+        for source, targets in self.train_data:
+            ...
+            self._run_batch(source, targets)

Saving model checkpoints
~~~~~~~~~~~~~~~~~~~~~~~~
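
For readers following this commit, here is a minimal, self-contained sketch of how the ``DistributedSampler`` passed to the ``DataLoader`` and the per-epoch ``set_epoch()`` call fit together. It is not part of the tutorial diff above; the dataset, batch size, and epoch count are illustrative stand-ins for the tutorial's own ``Trainer`` setup.

.. code:: python

   # Minimal sketch: build a DataLoader around a DistributedSampler and refresh
   # the sampler's shuffling seed once per epoch with set_epoch().
   # Assumes torch.distributed.init_process_group() has already been called,
   # since DistributedSampler infers world size and rank from the process group.
   import torch
   from torch.utils.data import DataLoader, TensorDataset
   from torch.utils.data.distributed import DistributedSampler

   # Illustrative stand-in dataset; the tutorial builds its own dataset and Trainer.
   train_dataset = TensorDataset(torch.randn(2048, 20), torch.randn(2048, 1))

   train_data = DataLoader(
       train_dataset,
       batch_size=32,
       shuffle=False,                              # shuffling is delegated to the sampler
       sampler=DistributedSampler(train_dataset),  # shards the dataset across ranks
   )

   max_epochs = 5  # illustrative
   for epoch in range(max_epochs):
       # Without this call, every epoch reuses the epoch-0 sample ordering.
       train_data.sampler.set_epoch(epoch)
       for source, targets in train_data:
           ...  # forward / backward / optimizer step for one batch

Note that ``shuffle=False`` is required on the ``DataLoader`` because ``sampler`` and ``shuffle=True`` are mutually exclusive; the ``DistributedSampler`` itself shuffles (its default is ``shuffle=True``) and folds the epoch number set via ``set_epoch()`` into its shuffling seed.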