
Commit 7530cbe

Generate Python docs from pytorch/pytorch@b68f227
1 parent: bb94554

File tree: 1,943 files changed (+2277 / -2269 lines)


docs/master/_images/RReLU.png

-70 Bytes

docs/master/_modules/index.html

Lines changed: 1 addition & 1 deletion
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 
docs/master/_modules/torch.html

Lines changed: 1 addition & 1 deletion
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/__config__.html

Lines changed: 1 addition & 1 deletion
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_jit_internal.html

Lines changed: 9 additions & 9 deletions
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 
@@ -879,7 +879,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="s2">&quot;if this method is not scripted, copy the python method onto the scripted model&quot;</span>
 
 
-<span class="k">def</span> <span class="nf">export</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
+<div class="viewcode-block" id="export"><a class="viewcode-back" href="../../jit.html#torch.jit.export">[docs]</a><span class="k">def</span> <span class="nf">export</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates that a method on an ``nn.Module`` is used as an entry point into a</span>
 <span class="sd"> :class:`ScriptModule` and should be compiled.</span>
@@ -922,10 +922,10 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="sd"> m = torch.jit.script(MyModule())</span>
 <span class="sd"> &quot;&quot;&quot;</span>
 <span class="n">fn</span><span class="o">.</span><span class="n">_torchscript_modifier</span> <span class="o">=</span> <span class="n">FunctionModifiers</span><span class="o">.</span><span class="n">EXPORT</span>
-<span class="k">return</span> <span class="n">fn</span>
+<span class="k">return</span> <span class="n">fn</span></div>
 
 
-<div class="viewcode-block" id="unused"><a class="viewcode-back" href="../../generated/torch.jit.unused.html#torch.jit.unused">[docs]</a><span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates to the compiler that a function or method should</span>
 <span class="sd"> be ignored and replaced with the raising of an exception. This allows you</span>
@@ -972,7 +972,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">prop</span>
 
 <span class="n">fn</span><span class="o">.</span><span class="n">_torchscript_modifier</span> <span class="o">=</span> <span class="n">FunctionModifiers</span><span class="o">.</span><span class="n">UNUSED</span>
-<span class="k">return</span> <span class="n">fn</span></div>
+<span class="k">return</span> <span class="n">fn</span>
 
 <span class="c1"># No op context manager from python side</span>
 <span class="k">class</span> <span class="nc">_IgnoreContextManager</span><span class="p">(</span><span class="n">contextlib</span><span class="o">.</span><span class="n">AbstractContextManager</span><span class="p">):</span>
@@ -982,7 +982,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">def</span> <span class="fm">__exit__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">exc_type</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="n">exc_value</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="n">traceback</span><span class="p">:</span> <span class="n">Any</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="kc">None</span><span class="p">:</span>
 <span class="k">pass</span>
 
-<div class="viewcode-block" id="ignore"><a class="viewcode-back" href="../../generated/torch.jit.ignore.html#torch.jit.ignore">[docs]</a><span class="k">def</span> <span class="nf">ignore</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">ignore</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates to the compiler that a function or method should</span>
 <span class="sd"> be ignored and left as a Python function. This allows you to leave code in</span>
@@ -1073,7 +1073,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">else</span><span class="p">:</span>
 <span class="n">fn</span><span class="o">.</span><span class="n">_torchscript_modifier</span> <span class="o">=</span> <span class="n">FunctionModifiers</span><span class="o">.</span><span class="n">IGNORE</span>
 <span class="k">return</span> <span class="n">fn</span>
-<span class="k">return</span> <span class="n">decorator</span></div>
+<span class="k">return</span> <span class="n">decorator</span>
 
 
 <span class="k">def</span> <span class="nf">_copy_to_script_wrapper</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
@@ -1371,7 +1371,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="nb">globals</span><span class="p">()[</span><span class="sa">f</span><span class="s2">&quot;BroadcastingList</span><span class="si">{</span><span class="n">i</span><span class="si">}</span><span class="s2">&quot;</span><span class="p">]</span> <span class="o">=</span> <span class="n">BroadcastingList1</span>
 
 
-<span class="k">def</span> <span class="nf">is_scripting</span><span class="p">()</span> <span class="o">-&gt;</span> <span class="nb">bool</span><span class="p">:</span>
+<div class="viewcode-block" id="is_scripting"><a class="viewcode-back" href="../../jit_language_reference.html#torch.jit.is_scripting">[docs]</a><span class="k">def</span> <span class="nf">is_scripting</span><span class="p">()</span> <span class="o">-&gt;</span> <span class="nb">bool</span><span class="p">:</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> Function that returns True when in compilation and False otherwise. This</span>
 <span class="sd"> is useful especially with the @unused decorator to leave code in your</span>
@@ -1390,7 +1390,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="sd"> else:</span>
 <span class="sd"> return unsupported_linear_op(x)</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="k">return</span> <span class="kc">False</span>
+<span class="k">return</span> <span class="kc">False</span></div>
 
 
 <span class="c1"># Retrieves a fully-qualified name (module hierarchy + classname) for a given obj.</span>

docs/master/_modules/torch/_lobpcg.html

Lines changed: 1 addition & 1 deletion
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_lowrank.html

Lines changed: 1 addition & 1 deletion
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 

docs/master/_modules/torch/_tensor.html

Lines changed: 7 additions & 7 deletions
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 
@@ -757,7 +757,7 @@ <h1>Source code for torch._tensor</h1><div class="highlight"><pre>
 <span class="c1"># All strings are unicode in Python 3.</span>
 <span class="k">return</span> <span class="n">torch</span><span class="o">.</span><span class="n">_tensor_str</span><span class="o">.</span><span class="n">_str</span><span class="p">(</span><span class="bp">self</span><span class="p">)</span>
 
-<div class="viewcode-block" id="Tensor.backward"><a class="viewcode-back" href="../../generated/torch.Tensor.backward.html#torch.Tensor.backward">[docs]</a> <span class="k">def</span> <span class="nf">backward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">gradient</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="n">retain_graph</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="n">create_graph</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="n">inputs</span><span class="o">=</span><span class="kc">None</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">backward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">gradient</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="n">retain_graph</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="n">create_graph</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="n">inputs</span><span class="o">=</span><span class="kc">None</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Computes the gradient of current tensor w.r.t. graph leaves.</span>
 
 <span class="sd"> The graph is differentiated using the chain rule. If the tensor is</span>
@@ -813,7 +813,7 @@ <h1>Source code for torch._tensor</h1><div class="highlight"><pre>
 <span class="n">retain_graph</span><span class="o">=</span><span class="n">retain_graph</span><span class="p">,</span>
 <span class="n">create_graph</span><span class="o">=</span><span class="n">create_graph</span><span class="p">,</span>
 <span class="n">inputs</span><span class="o">=</span><span class="n">inputs</span><span class="p">)</span>
-<span class="n">torch</span><span class="o">.</span><span class="n">autograd</span><span class="o">.</span><span class="n">backward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">gradient</span><span class="p">,</span> <span class="n">retain_graph</span><span class="p">,</span> <span class="n">create_graph</span><span class="p">,</span> <span class="n">inputs</span><span class="o">=</span><span class="n">inputs</span><span class="p">)</span></div>
+<span class="n">torch</span><span class="o">.</span><span class="n">autograd</span><span class="o">.</span><span class="n">backward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">gradient</span><span class="p">,</span> <span class="n">retain_graph</span><span class="p">,</span> <span class="n">create_graph</span><span class="p">,</span> <span class="n">inputs</span><span class="o">=</span><span class="n">inputs</span><span class="p">)</span>
 
 <span class="k">def</span> <span class="nf">register_hook</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">hook</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Registers a backward hook.</span>
@@ -915,14 +915,14 @@ <h1>Source code for torch._tensor</h1><div class="highlight"><pre>
 <span class="s2"> have forward mode AD gradients.</span>
 <span class="s2"> &quot;&quot;&quot;</span><span class="p">)</span>
 
-<span class="k">def</span> <span class="nf">is_shared</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
+<div class="viewcode-block" id="Tensor.is_shared"><a class="viewcode-back" href="../../generated/torch.Tensor.is_shared.html#torch.Tensor.is_shared">[docs]</a> <span class="k">def</span> <span class="nf">is_shared</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Checks if tensor is in shared memory.</span>
 
 <span class="sd"> This is always ``True`` for CUDA tensors.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
 <span class="k">if</span> <span class="n">has_torch_function_unary</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
 <span class="k">return</span> <span class="n">handle_torch_function</span><span class="p">(</span><span class="n">Tensor</span><span class="o">.</span><span class="n">is_shared</span><span class="p">,</span> <span class="p">(</span><span class="bp">self</span><span class="p">,),</span> <span class="bp">self</span><span class="p">)</span>
-<span class="k">return</span> <span class="bp">self</span><span class="o">.</span><span class="n">storage</span><span class="p">()</span><span class="o">.</span><span class="n">is_shared</span><span class="p">()</span>
+<span class="k">return</span> <span class="bp">self</span><span class="o">.</span><span class="n">storage</span><span class="p">()</span><span class="o">.</span><span class="n">is_shared</span><span class="p">()</span></div>
 
 <span class="k">def</span> <span class="nf">share_memory_</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Moves the underlying storage to shared memory.</span>
@@ -981,7 +981,7 @@ <h1>Source code for torch._tensor</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">torch</span><span class="o">.</span><span class="n">stft</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">n_fft</span><span class="p">,</span> <span class="n">hop_length</span><span class="p">,</span> <span class="n">win_length</span><span class="p">,</span> <span class="n">window</span><span class="p">,</span> <span class="n">center</span><span class="p">,</span>
 <span class="n">pad_mode</span><span class="p">,</span> <span class="n">normalized</span><span class="p">,</span> <span class="n">onesided</span><span class="p">,</span> <span class="n">return_complex</span><span class="o">=</span><span class="n">return_complex</span><span class="p">)</span>
 
-<span class="k">def</span> <span class="nf">istft</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">n_fft</span><span class="p">:</span> <span class="nb">int</span><span class="p">,</span> <span class="n">hop_length</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">int</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
+<div class="viewcode-block" id="Tensor.istft"><a class="viewcode-back" href="../../generated/torch.Tensor.istft.html#torch.Tensor.istft">[docs]</a> <span class="k">def</span> <span class="nf">istft</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">n_fft</span><span class="p">:</span> <span class="nb">int</span><span class="p">,</span> <span class="n">hop_length</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">int</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
 <span class="n">win_length</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">int</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span> <span class="n">window</span><span class="p">:</span> <span class="s1">&#39;Optional[Tensor]&#39;</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
 <span class="n">center</span><span class="p">:</span> <span class="nb">bool</span> <span class="o">=</span> <span class="kc">True</span><span class="p">,</span> <span class="n">normalized</span><span class="p">:</span> <span class="nb">bool</span> <span class="o">=</span> <span class="kc">False</span><span class="p">,</span>
 <span class="n">onesided</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">bool</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span> <span class="n">length</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">int</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
@@ -994,7 +994,7 @@ <h1>Source code for torch._tensor</h1><div class="highlight"><pre>
 <span class="n">return_complex</span><span class="o">=</span><span class="n">return_complex</span>
 <span class="p">)</span>
 <span class="k">return</span> <span class="n">torch</span><span class="o">.</span><span class="n">istft</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">n_fft</span><span class="p">,</span> <span class="n">hop_length</span><span class="p">,</span> <span class="n">win_length</span><span class="p">,</span> <span class="n">window</span><span class="p">,</span> <span class="n">center</span><span class="p">,</span>
-<span class="n">normalized</span><span class="p">,</span> <span class="n">onesided</span><span class="p">,</span> <span class="n">length</span><span class="p">,</span> <span class="n">return_complex</span><span class="o">=</span><span class="n">return_complex</span><span class="p">)</span>
+<span class="n">normalized</span><span class="p">,</span> <span class="n">onesided</span><span class="p">,</span> <span class="n">length</span><span class="p">,</span> <span class="n">return_complex</span><span class="o">=</span><span class="n">return_complex</span><span class="p">)</span></div>
 
 <span class="k">def</span> <span class="nf">resize</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">sizes</span><span class="p">):</span>
 <span class="k">if</span> <span class="n">has_torch_function_unary</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>

docs/master/_modules/torch/_tensor_str.html

Lines changed: 1 addition & 1 deletion
@@ -213,7 +213,7 @@
 <div class="pytorch-left-menu-search">
 
 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+git0b1f3bd ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.12.0a0+gitb68f227 ) &#x25BC</a>
 </div>
 
 
