Commit f2ff054

Generate Python docs from pytorch/pytorch@eb6c0ed

1 parent 4e81472 commit f2ff054

788 files changed (+1288 additions, −1252 deletions)


docs/master/__config__.html (1 addition, 1 deletion)

@@ -188,7 +188,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/index.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch.html (18 additions, 16 deletions)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 
@@ -645,7 +645,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="nb">type</span><span class="p">(</span><span class="n">obj</span><span class="p">)</span> <span class="ow">in</span> <span class="n">_storage_classes</span></div>
 
 
-<span class="k">def</span> <span class="nf">set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">):</span>
+<div class="viewcode-block" id="set_default_tensor_type"><a class="viewcode-back" href="../generated/torch.set_default_tensor_type.html#torch.set_default_tensor_type">[docs]</a><span class="k">def</span> <span class="nf">set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Sets the default ``torch.Tensor`` type to floating point tensor type</span>
 <span class="sd"> ``t``. This type will also be used as default floating point type for</span>
 <span class="sd"> type inference in :func:`torch.tensor`.</span>
@@ -666,10 +666,10 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> &quot;&quot;&quot;</span>
 <span class="k">if</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">t</span><span class="p">,</span> <span class="n">_string_classes</span><span class="p">):</span>
 <span class="n">t</span> <span class="o">=</span> <span class="n">_import_dotted_name</span><span class="p">(</span><span class="n">t</span><span class="p">)</span>
-<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">)</span>
+<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">)</span></div>
 
 
-<span class="k">def</span> <span class="nf">set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
+<div class="viewcode-block" id="set_default_dtype"><a class="viewcode-back" href="../generated/torch.set_default_dtype.html#torch.set_default_dtype">[docs]</a><span class="k">def</span> <span class="nf">set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Sets the default floating point dtype to :attr:`d`.</span>
 <span class="sd"> This dtype is:</span>
 
@@ -697,9 +697,9 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> torch.complex128</span>
 
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">)</span>
+<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">)</span></div>
 
-<span class="k">def</span> <span class="nf">set_deterministic</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
+<div class="viewcode-block" id="set_deterministic"><a class="viewcode-back" href="../generated/torch.set_deterministic.html#torch.set_deterministic">[docs]</a><span class="k">def</span> <span class="nf">set_deterministic</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot; Sets whether PyTorch operations must use &quot;deterministic&quot;</span>
 <span class="sd"> algorithms. That is, algorithms which, given the same input, and when</span>
 <span class="sd"> run on the same software and hardware, always produce the same output.</span>
@@ -733,11 +733,13 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> * :class:`torch.nn.FractionalMaxPool2d` when called on a CUDA tensor that requires grad</span>
 <span class="sd"> * :class:`torch.nn.FractionalMaxPool3d` when called on a CUDA tensor that requires grad</span>
 <span class="sd"> * :func:`torch.nn.functional.interpolate` when called on a CUDA tensor that requires grad</span>
-<span class="sd"> and one of the following modes is used:</span>
-<span class="sd"> - `linear`</span>
-<span class="sd"> - `bilinear`</span>
-<span class="sd"> - `bicubic`</span>
-<span class="sd"> - `trilinear`</span>
+<span class="sd"> and one of the following modes is used:</span>
+
+<span class="sd"> - `linear`</span>
+<span class="sd"> - `bilinear`</span>
+<span class="sd"> - `bicubic`</span>
+<span class="sd"> - `trilinear`</span>
+
 <span class="sd"> * :class:`torch.nn.ReflectionPad1d` when called on a CUDA tensor that requires grad</span>
 <span class="sd"> * :class:`torch.nn.ReflectionPad2d` when called on a CUDA tensor that requires grad</span>
 <span class="sd"> * :class:`torch.nn.ReplicationPad1d` when called on a CUDA tensor that requires grad</span>
@@ -774,7 +776,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> d (:class:`bool`): If True, force operations to be deterministic.</span>
 <span class="sd"> If False, allow non-deterministic operations.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="n">_C</span><span class="o">.</span><span class="n">_set_deterministic</span><span class="p">(</span><span class="n">d</span><span class="p">)</span>
+<span class="n">_C</span><span class="o">.</span><span class="n">_set_deterministic</span><span class="p">(</span><span class="n">d</span><span class="p">)</span></div>
 
 <div class="viewcode-block" id="is_deterministic"><a class="viewcode-back" href="../generated/torch.is_deterministic.html#torch.is_deterministic">[docs]</a><span class="k">def</span> <span class="nf">is_deterministic</span><span class="p">():</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Returns True if the global deterministic flag is turned on. Refer to</span>
@@ -926,14 +928,14 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="c1">################################################################################</span>
 
 <span class="c1"># needs to be before the submodule imports to avoid circular dependencies</span>
-<div class="viewcode-block" id="_assert"><a class="viewcode-back" href="../generated/torch._assert.html#torch._assert">[docs]</a><span class="k">def</span> <span class="nf">_assert</span><span class="p">(</span><span class="n">condition</span><span class="p">,</span> <span class="n">message</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">_assert</span><span class="p">(</span><span class="n">condition</span><span class="p">,</span> <span class="n">message</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;A wrapper around Python&#39;s assert which is symbolically traceable.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
 <span class="kn">from</span> <span class="nn">.overrides</span> <span class="kn">import</span> <span class="n">has_torch_function</span><span class="p">,</span> <span class="n">handle_torch_function</span>
 
 <span class="k">if</span> <span class="nb">type</span><span class="p">(</span><span class="n">condition</span><span class="p">)</span> <span class="ow">is</span> <span class="ow">not</span> <span class="n">torch</span><span class="o">.</span><span class="n">Tensor</span> <span class="ow">and</span> <span class="n">has_torch_function</span><span class="p">((</span><span class="n">condition</span><span class="p">,)):</span>
 <span class="k">return</span> <span class="n">handle_torch_function</span><span class="p">(</span><span class="n">_assert</span><span class="p">,</span> <span class="p">(</span><span class="n">condition</span><span class="p">,),</span> <span class="n">condition</span><span class="p">,</span> <span class="n">message</span><span class="p">)</span>
-<span class="k">assert</span> <span class="n">condition</span><span class="p">,</span> <span class="n">message</span></div>
+<span class="k">assert</span> <span class="n">condition</span><span class="p">,</span> <span class="n">message</span>
 
 <span class="c1">################################################################################</span>
 <span class="c1"># Import most common subpackages</span>
@@ -976,9 +978,9 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">del</span> <span class="n">_torch_docs</span><span class="p">,</span> <span class="n">_tensor_docs</span><span class="p">,</span> <span class="n">_storage_docs</span>
 
 
-<span class="k">def</span> <span class="nf">compiled_with_cxx11_abi</span><span class="p">():</span>
+<div class="viewcode-block" id="compiled_with_cxx11_abi"><a class="viewcode-back" href="../generated/torch.compiled_with_cxx11_abi.html#torch.compiled_with_cxx11_abi">[docs]</a><span class="k">def</span> <span class="nf">compiled_with_cxx11_abi</span><span class="p">():</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Returns whether PyTorch was built with _GLIBCXX_USE_CXX11_ABI=1&quot;&quot;&quot;</span>
-<span class="k">return</span> <span class="n">_C</span><span class="o">.</span><span class="n">_GLIBCXX_USE_CXX11_ABI</span>
+<span class="k">return</span> <span class="n">_C</span><span class="o">.</span><span class="n">_GLIBCXX_USE_CXX11_ABI</span></div>
 
 
 <span class="c1"># Import the ops &quot;namespace&quot;</span>

docs/master/_modules/torch/__config__.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_jit_internal.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_lobpcg.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_lowrank.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_tensor_str.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_utils.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_vmap_internals.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/autograd.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/autograd/anomaly_mode.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/autograd/function.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/autograd/functional.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/autograd/grad_mode.html (4 additions, 4 deletions)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 
@@ -425,7 +425,7 @@ <h1>Source code for torch.autograd.grad_mode</h1><div class="highlight"><pre>
 <span class="k">raise</span> <span class="ne">NotImplementedError</span>
 
 
-<div class="viewcode-block" id="no_grad"><a class="viewcode-back" href="../../../autograd.html#torch.autograd.no_grad">[docs]</a><span class="k">class</span> <span class="nc">no_grad</span><span class="p">(</span><span class="n">_DecoratorContextManager</span><span class="p">):</span>
+<div class="viewcode-block" id="no_grad"><a class="viewcode-back" href="../../../generated/torch.no_grad.html#torch.no_grad">[docs]</a><span class="k">class</span> <span class="nc">no_grad</span><span class="p">(</span><span class="n">_DecoratorContextManager</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Context-manager that disabled gradient calculation.</span>
 
 <span class="sd"> Disabling gradient calculation is useful for inference, when you are sure</span>
@@ -468,7 +468,7 @@ <h1>Source code for torch.autograd.grad_mode</h1><div class="highlight"><pre>
 <span class="n">torch</span><span class="o">.</span><span class="n">set_grad_enabled</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">prev</span><span class="p">)</span></div>
 
 
-<div class="viewcode-block" id="enable_grad"><a class="viewcode-back" href="../../../autograd.html#torch.autograd.enable_grad">[docs]</a><span class="k">class</span> <span class="nc">enable_grad</span><span class="p">(</span><span class="n">_DecoratorContextManager</span><span class="p">):</span>
+<div class="viewcode-block" id="enable_grad"><a class="viewcode-back" href="../../../generated/torch.enable_grad.html#torch.enable_grad">[docs]</a><span class="k">class</span> <span class="nc">enable_grad</span><span class="p">(</span><span class="n">_DecoratorContextManager</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Context-manager that enables gradient calculation.</span>
 
 <span class="sd"> Enables gradient calculation, if it has been disabled via :class:`~no_grad`</span>
@@ -507,7 +507,7 @@ <h1>Source code for torch.autograd.grad_mode</h1><div class="highlight"><pre>
 <span class="n">torch</span><span class="o">.</span><span class="n">_C</span><span class="o">.</span><span class="n">_set_grad_enabled</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">prev</span><span class="p">)</span></div>
 
 
-<div class="viewcode-block" id="set_grad_enabled"><a class="viewcode-back" href="../../../autograd.html#torch.autograd.set_grad_enabled">[docs]</a><span class="k">class</span> <span class="nc">set_grad_enabled</span><span class="p">(</span><span class="nb">object</span><span class="p">):</span>
+<div class="viewcode-block" id="set_grad_enabled"><a class="viewcode-back" href="../../../generated/torch.set_grad_enabled.html#torch.set_grad_enabled">[docs]</a><span class="k">class</span> <span class="nc">set_grad_enabled</span><span class="p">(</span><span class="nb">object</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Context-manager that sets gradient calculation to on or off.</span>
 
 <span class="sd"> ``set_grad_enabled`` will enable or disable grads based on its argument :attr:`mode`.</span>

docs/master/_modules/torch/autograd/gradcheck.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/autograd/profiler.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/backends/cuda.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/backends/cudnn.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/backends/mkl.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/backends/mkldnn.html (1 addition, 1 deletion)

@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+b50c852 &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+eb6c0ed &#x25BC</a>
 </div>
 
 
