Commit f11ef68

Update doc from commit 275bf91

torchxlabot2 committed Jul 8, 2024
1 parent 9d7c604 commit f11ef68
Showing 29 changed files with 6,653 additions and 2,067 deletions.
20 changes: 17 additions & 3 deletions master/_modules/index.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
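This sidebar change repeats across every rebuilt HTML page in this commit: the JavaScript-populated local TOC placeholder is removed, and Sphinx instead emits static navigation for the new "Docs" toctree (its source appears in the master/_sources/index.rst.txt change further down). As a hedged sketch, assuming standard sphinx_rtd_theme rendering, the generated <ul>/<li class="toctree-l1"> markup above corresponds to a directive of roughly this shape:

.. A sketch, not part of the diff: the :caption: option becomes the
.. <p class="caption"> heading, and each document matched by the glob
.. becomes one toctree-l1 list item in the sidebar.

.. toctree::
   :glob:
   :maxdepth: 1
   :caption: Docs

   *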
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/core/xla_model.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/debug/metrics.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/distributed/parallel_loader.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/distributed/spmd/xla_sharding.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/distributed/xla_multiprocessing.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/experimental/eager.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/runtime.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
20 changes: 17 additions & 3 deletions master/_modules/torch_xla/torch_xla.html
@@ -265,7 +265,7 @@


<div class="version">
-master (2.5.0+gitc84c893 )
+master (2.5.0+git275bf91 )
</div>


@@ -293,8 +293,22 @@



-<!-- Local TOC -->
-<div class="local-toc"></div>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../index.html">PyTorch/XLA documentation</a></li>
+</ul>
+<p class="caption" role="heading"><span class="caption-text">Docs</span></p>
+<ul>
+<li class="toctree-l1"><a class="reference internal" href="../../debug.html">Troubleshooting</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../gpu.html">How to run with PyTorch/XLA:GPU</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../multi_process_distributed.html">How to do <code class="docutils literal notranslate"><span class="pre">DistributedDataParallel</span></code></a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../runtime.html">PJRT Runtime</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html">PyTorch/XLA SPMD User Guide</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html#fully-sharded-data-parallel-via-spmd">Fully Sharded Data Parallel via SPMD</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html#pytorch-xla-spmd-advanced-topics">PyTorch/XLA SPMD advanced topics</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../spmd.html#distributed-checkpointing">Distributed Checkpointing</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../torch_compile.html">TorchDynamo(torch.compile) integration in PyTorch XLA</a></li>
+</ul>



</div>
1 change: 1 addition & 0 deletions master/_sources/debug.rst.txt
@@ -0,0 +1 @@
+.. mdinclude:: ../../TROUBLESHOOTING.md
1 change: 1 addition & 0 deletions master/_sources/gpu.rst.txt
@@ -0,0 +1 @@
+.. mdinclude:: ../gpu.md
27 changes: 18 additions & 9 deletions master/_sources/index.rst.txt
@@ -1,3 +1,21 @@
:github_url: https://github.com/pytorch/xla

+PyTorch/XLA documentation
+===================================
+PyTorch/XLA is a Python package that uses the XLA deep learning compiler to connect the PyTorch deep learning framework and Cloud TPUs.
+
+.. toctree::
+   :hidden:
+
+   self
+
+.. toctree::
+   :glob:
+   :maxdepth: 1
+   :caption: Docs
+
+   *
+
.. mdinclude:: ../../API_GUIDE.md

PyTorch/XLA API
@@ -90,12 +108,3 @@ debug
.. autofunction:: counter_value
.. autofunction:: metric_names
.. autofunction:: metric_data

-.. mdinclude:: ../../TROUBLESHOOTING.md
-.. mdinclude:: ../pjrt.md
-.. mdinclude:: ../dynamo.md
-.. mdinclude:: ../fsdp.md
-.. mdinclude:: ../ddp.md
-.. mdinclude:: ../gpu.md
-.. mdinclude:: ../spmd_basic.md
-.. mdinclude:: ../fsdpv2.md
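Read together with the new one-line .rst.txt files in this commit, the deletion above is a relocation rather than a removal: each of these mdinclude directives moves into its own topic page, and the :glob: toctree added at the top of index.rst picks those pages up without naming them. Under that scheme a future topic needs no edit to index.rst at all; a hypothetical wrapper (profiling.rst and profiling.md are illustrative names only, not part of this commit) would be collected automatically:

.. profiling.rst — hypothetical example; the glob pattern (*) in
.. index.rst matches any sibling .rst file, so this page would join
.. the Docs toctree without further wiring.

.. mdinclude:: ../profiling.md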
2 changes: 2 additions & 0 deletions master/_sources/multi_process_distributed.rst.txt
@@ -0,0 +1,2 @@
+.. mdinclude:: ../ddp.md
+.. mdinclude:: ../fsdp.md
1 change: 1 addition & 0 deletions master/_sources/runtime.rst.txt
@@ -0,0 +1 @@
+.. mdinclude:: ../pjrt.md
4 changes: 4 additions & 0 deletions master/_sources/spmd.rst.txt
@@ -0,0 +1,4 @@
+.. mdinclude:: ../spmd_basic.md
+.. mdinclude:: ../fsdpv2.md
+.. mdinclude:: ../spmd_advanced.md
+.. mdinclude:: ../spmd_distributed_checkpoint.md
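Because spmd.rst stitches four Markdown sources into one page, the rebuilt sidebars above link to anchors inside spmd.html rather than to separate pages. A hedged mapping, assuming the usual Sphinx slugification of the section headings carried by each included file:

.. Assumed correspondence between included sources and sidebar links:
..   spmd_basic.md                  -> spmd.html (PyTorch/XLA SPMD User Guide)
..   fsdpv2.md                      -> spmd.html#fully-sharded-data-parallel-via-spmd
..   spmd_advanced.md               -> spmd.html#pytorch-xla-spmd-advanced-topics
..   spmd_distributed_checkpoint.md -> spmd.html#distributed-checkpointing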
1 change: 1 addition & 0 deletions master/_sources/torch_compile.rst.txt
@@ -0,0 +1 @@
+.. mdinclude:: ../dynamo.md