Update doc from commit 5702dae
torchxlabot2 committed Jul 29, 2024
1 parent 35734f5 commit 6987bce
Showing 23 changed files with 42 additions and 36 deletions.
Sixteen of the changed files touch only the version banner in the page sidebar, replacing the previous build hash with the one from this commit (2 changes each: 1 addition & 1 deletion):

 <div class="version">
-    master (2.5.0+gitbdd00e5 )
+    master (2.5.0+git5702dae )
 </div>

  • master/_modules/index.html
  • master/_modules/torch_xla/debug/metrics.html
  • master/_modules/torch_xla/distributed/parallel_loader.html
  • master/_modules/torch_xla/experimental/eager.html
  • master/_modules/torch_xla/torch_xla.html
  • master/debug.html
  • master/eager_mode.html
  • master/genindex.html
  • master/gpu.html
  • master/index.html
  • master/notes/source_of_recompilation.html
  • master/py-modindex.html
  • master/search.html
  • master/spmd.html
  • master/torch_compile.html
  • one further master/_modules/torch_xla page whose file name is not shown in this capture

The five remaining HTML files receive the same banner bump plus the code and prose changes below.
20 changes: 13 additions & 7 deletions master/_modules/torch_xla/core/xla_model.html

Besides the version banner bump, the rendered source of torch_xla.core.xla_model changes in three places. First, the deprecation shims gain explicit removal messages:

@@ -422,9 +422,14 @@ Source code for torch_xla.core.xla_model
 XLA_LIB = Library("xla", "DEF")

 from . import xla_model as this_module
-xrt_world_size = deprecated(this_module, torch_xla.runtime.world_size)
-get_ordinal = deprecated(this_module, torch_xla.runtime.global_ordinal)
-parse_xla_device = deprecated(this_module, _utils.parse_xla_device)
+xrt_world_size = deprecated(this_module, torch_xla.runtime.world_size,
+                            'xrt_world_size() will be removed in release 2.6.')
+get_ordinal = deprecated(
+    this_module, torch_xla.runtime.global_ordinal,
+    'xla_model.get_ordinal() will be removed in release 2.6.')
+parse_xla_device = deprecated(
+    this_module, _utils.parse_xla_device,
+    'xla_model.parse_xla_device() will be removed in release 2.6.')


 class DeviceContext(object):

Second and third, internal call sites stop going through the now-deprecated module-level parse_xla_device alias and call _utils.parse_xla_device directly:

@@ -584,7 +589,7 @@ Source code for torch_xla.core.xla_model
   real_devices = xla_real_devices(local_devices)
   device_types = set()
   for device in real_devices:
-    xdev = parse_xla_device(device)
+    xdev = _utils.parse_xla_device(device)
     device_types.add(xdev[0])
   if len(device_types) != 1:
     # No replication if the device set spawns multiple device types.

@@ -601,13 +606,14 @@ Source code for torch_xla.core.xla_model
   replication_devices = []
   for device in torch_xla._XLAC._xla_get_all_devices():
     # device is like 'CUDA:0'
-    xdev = parse_xla_device(device)
+    xdev = _utils.parse_xla_device(device)
     if not xdev:
       raise RuntimeError('Invalid device format: {}'.format(device))
     if xdev[0] == device_type:
       replication_devices.append(device)
   sorted_by_ordinal = sorted(
-      replication_devices, key=lambda device: parse_xla_device(device)[1])
+      replication_devices,
+      key=lambda device: _utils.parse_xla_device(device)[1])
   return sorted_by_ordinal
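For orientation, deprecated() pairs an old module-level name with its replacement and an optional removal notice. A minimal sketch of such a shim, assuming nothing beyond what the call sites above show (a hypothetical reimplementation, not the actual torch_xla helper):

import functools
import logging


def deprecated(module, replacement, extra_msg=None):
  # Hypothetical shim: delegate to `replacement`, warning on every call.
  msg = (f'Use {replacement.__module__}.{replacement.__name__} instead of '
         f'the deprecated alias in {module.__name__}.')
  if extra_msg is not None:
    msg = f'{msg} {extra_msg}'

  @functools.wraps(replacement)
  def wrapper(*args, **kwargs):
    logging.warning(msg)
    return replacement(*args, **kwargs)

  return wrapper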
6 changes: 3 additions & 3 deletions master/_modules/torch_xla/distributed/spmd/xla_sharding.html

Besides the version banner bump, the rendered source of torch_xla.distributed.spmd.xla_sharding renames its internal-utilities import alias from iutils to _utils and updates the one call site:

@@ -389,7 +389,7 @@ Source code for torch_xla.distributed.spmd.xla_sharding
 import torch
 import torch_xla
 import torch_xla.core.xla_model as xm
-import torch_xla._internal.utils as iutils
+import torch_xla._internal.utils as _utils
 from torch_xla.distributed.spmd import XLAShardedTensor, XLAShard
 import torch_xla.runtime as xr

@@ -608,7 +608,7 @@ Source code for torch_xla.distributed.spmd.xla_sharding
     mesh_shape = tuple([x * y for x, y in zip(ici_mesh_shape, dcn_mesh_shape)])
     self.device_attributes = xr.global_runtime_device_attributes()
     self.device_attributes.sort(
-        key=lambda attr: iutils.parse_xla_device(attr['name'])[1])
+        key=lambda attr: _utils.parse_xla_device(attr['name'])[1])

     if 'slice_index' in self.device_attributes[0] and np.prod(
         dcn_mesh_shape) == 1:
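Both this file and xla_model.html above sort devices by the ordinal that parse_xla_device extracts. A minimal stand-in with the same observable behavior, assuming only the 'TYPE:index' format the source comments show ('CUDA:0') — an illustrative sketch, not the actual torch_xla._internal.utils implementation:

def parse_xla_device(device: str):
  # Split a device string like 'CUDA:0' into ('CUDA', 0);
  # return None for strings that don't match the TYPE:index shape.
  parts = device.split(':')
  if len(parts) != 2 or not parts[1].isdigit():
    return None
  return parts[0], int(parts[1])


# Sorting by ordinal, as both call sites above do:
devices = ['CUDA:2', 'CUDA:0', 'CUDA:1']
assert sorted(devices, key=lambda d: parse_xla_device(d)[1]) == \
    ['CUDA:0', 'CUDA:1', 'CUDA:2']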
4 changes: 2 additions & 2 deletions master/_modules/torch_xla/runtime.html

Besides the version banner bump, a stale comment in the rendered source of torch_xla.runtime is updated: it referred to the removed runtime.xrt_world_size(), which is now spelled runtime.world_size():

@@ -405,7 +405,7 @@ Source code for torch_xla.runtime
 # Note [Dynamo WORLD_SIEZ and ORDINAL]
 # Belows are workaround to cache the ordinal and world_size such that
-# Dynamo won't do graph breaks when runtime.xrt_world_size() and runtime.global_ordinal() are called.
+# Dynamo won't do graph breaks when runtime.world_size() and runtime.global_ordinal() are called.
 _WORLD_SIZE = None
 _ORDINAL = None
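The comment above describes a cache in module-level globals. A sketch of the pattern it implies — hypothetical, since the fetch logic sits outside this hunk: compute the value once, then serve later calls from a plain Python global, which Dynamo can treat as a constant instead of breaking the graph on a live runtime query.

_WORLD_SIZE = None
_ORDINAL = None


def _fetch_world_size() -> int:
  # Placeholder for the real runtime query; not shown in the diff above.
  return 1


def world_size() -> int:
  # First call populates the cache; later calls inside a torch.compile'd
  # region read a constant-foldable global and avoid a graph break.
  global _WORLD_SIZE
  if _WORLD_SIZE is None:
    _WORLD_SIZE = _fetch_world_size()
  return _WORLD_SIZE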
10 changes: 5 additions & 5 deletions master/multi_process_distributed.html

Besides the version banner bump, the "How to use DistributedDataParallel" guide migrates its snippets from the deprecated xm.get_ordinal() to the runtime API (a combined sketch of the migrated pattern follows this section):

@@ -406,7 +406,7 @@ How to use DistributedDataParallel
 Import xla specific distributed packages:

-import torch_xla.core.xla_model as xm
+import torch_xla.runtime as xr
 import torch_xla.distributed.xla_backend

@@ -419,7 +419,7 @@ How to use DistributedDataParallel
 Use xla specific APIs to get rank and world_size if you need to.

-new_rank = xm.get_ordinal()
+new_rank = xr.global_ordinal()
 world_size = xr.world_size()

@@ -477,7 +477,7 @@ How to use DistributedDataParallel
 def demo_basic(rank):
   # xla specific APIs to get rank, world_size.
-  new_rank = xm.get_ordinal()
+  new_rank = xr.global_ordinal()
   assert new_rank == rank
   world_size = xr.world_size()

And in the FSDP checkpointing example:

@@ -705,7 +705,7 @@ Fully Sharded Data Parallel (FSDP) in PyTorch XLA
 }
-ckpt_path = f'/tmp/rank-{xm.get_ordinal()}-of-{xr.world_size()}.pth'
+ckpt_path = f'/tmp/rank-{xr.global_ordinal()}-of-{xr.world_size()}.pth'
 xm.save(ckpt, ckpt_path, master_only=False)
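Pieced together, the migrated DDP setup looks roughly like the sketch below. The model and training body are not part of this diff, so they are elided; the xla:// init method and the xmp.spawn launcher are drawn from the surrounding docs rather than from this commit:

import torch.distributed as dist
import torch_xla.core.xla_model as xm
import torch_xla.runtime as xr
import torch_xla.distributed.xla_backend  # registers the 'xla' backend
import torch_xla.distributed.xla_multiprocessing as xmp


def demo_basic(rank):
  dist.init_process_group('xla', init_method='xla://')
  # xla specific APIs to get rank, world_size (post-migration spelling).
  new_rank = xr.global_ordinal()  # was: xm.get_ordinal()
  assert new_rank == rank
  world_size = xr.world_size()
  device = xm.xla_device()
  # ... build the model on `device`, wrap it in DDP, train ...


if __name__ == '__main__':
  xmp.spawn(demo_basic)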
Binary file modified master/objects.inv
4 changes: 2 additions & 2 deletions master/runtime.html

Besides the version banner bump, the TL;DR migration snippet is updated. The snippet is itself a before/after diff; this commit only touches its "-" (before) line, replacing the deprecated xm.get_ordinal() with xr.global_ordinal():

@@ -461,7 +461,7 @@ TL;DR
   def _mp_fn(index):
     device = xm.xla_device()
-  - dist.init_process_group('xla', rank=xm.get_ordinal(), world_size=xr.world_size())
+  - dist.init_process_group('xla', rank=xr.global_ordinal(), world_size=xr.world_size())
   + dist.init_process_group('xla', init_method='xla://')

     torch.manual_seed(42)
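For context on the snippet's "+" line: with init_method='xla://', the process group derives rank and world size from the XLA runtime's rendezvous, which is why the explicit rank=/world_size= arguments — and with them the deprecated ordinal call — disappear. A minimal sketch, assuming a multi-process XLA launch:

import torch.distributed as dist
import torch_xla.distributed.xla_backend  # registers the 'xla' backend

# rank and world size are discovered via the xla:// rendezvous;
# no explicit rank= / world_size= arguments are needed.
dist.init_process_group('xla', init_method='xla://')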
2 changes: 1 addition & 1 deletion master/searchindex.js (large diff not rendered)
