
internalize _parse_xla_device #7675

Merged · 17 commits · Jul 18, 2024
Conversation

zpcore (Collaborator) commented Jul 12, 2024

Resolves #7652

Along with some nit fixes.

@zpcore zpcore requested a review from will-cromar July 12, 2024 18:42
@zpcore zpcore marked this pull request as ready for review July 12, 2024 18:42
@zpcore zpcore requested a review from JackCaoG July 12, 2024 18:42
will-cromar (Collaborator) left a comment

zpcore (Collaborator, Author) commented Jul 13, 2024

Would the deprecation wrapper here work? https://github.com/pytorch/xla/blob/master/torch_xla/experimental/deprecation.py

I also made some updates to the deprecation decorator function. Previously it would print the warning message whenever torch_xla was imported.

will-cromar (Collaborator) replied:

> Would the deprecation wrapper here work? https://github.com/pytorch/xla/blob/master/torch_xla/experimental/deprecation.py
>
> I also made some updates to the deprecation decorator function. Previously it would print the warning message whenever torch_xla was imported.

This was actually not written as a decorator. It essentially aliases a function from the new location to the deprecated location. We don't need to keep the old implementation around in most cases. Check out some of the old usage that Jack removed in #7579
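The aliasing pattern being described can be sketched roughly like this. Note this is a minimal, hypothetical stand-in for the helper in torch_xla/experimental/deprecation.py, not its actual implementation; the message format mirrors the warning shown later in this thread:

```python
import functools
import logging


def deprecated(module, new_fn):
  # Hypothetical sketch of a deprecation alias helper: returns a wrapper
  # that can be bound in the old module's namespace under the old name,
  # and logs a warning pointing at the new location on every call.
  @functools.wraps(new_fn)
  def wrapper(*args, **kwargs):
    logging.warning(f'{module.__name__}.{new_fn.__name__} is deprecated. '
                    f'Use {new_fn.__module__}.{new_fn.__name__} instead.')
    return new_fn(*args, **kwargs)

  return wrapper
```

With a helper along these lines, the deprecated location only keeps a one-line alias and the old implementation can be deleted entirely.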

zpcore (Collaborator, Author) commented Jul 15, 2024

> This was actually not written as a decorator. It essentially aliases a function from the new location to the deprecated location. We don't need to keep the old implementation around in most cases. Check out some of the old usage that Jack removed in #7579

Oh, I see. Just fixed it to use the wrapper as an alias instead.

will-cromar (Collaborator) left a comment

Otherwise LGTM

iutils.parse_xla_device,
]
for alias in aliases:
  register_deprecated(torch_xla.core, alias)
will-cromar (Collaborator) commented on the diff:

Thanks! I think this should actually be torch_xla.core.xla_model, looking at the implementation of register_deprecated. Please double-check that you get the expected message when you call torch_xla.core.xla_model.parse_xla_device(...).

zpcore (Collaborator, Author) replied:

Updated to parse_xla_device = deprecated(torch_xla.core, iutils.parse_xla_device) instead.

will-cromar (Collaborator) replied:

The error message still says torch_xla.core.parse_xla_device instead of torch_xla.core.xla_model.parse_xla_device:

root@t1v-n-bf2f726f-w-0:/workspaces/ptxla# python
...
>>> import torch_xla.core.xla_model as xm
>>> xm.parse_xla_device('TPU:0')
WARNING:root:torch_xla.core.parse_xla_device is deprecated. Use torch_xla._internal.utils.parse_xla_device instead.
('TPU', 0)

You'll actually need to pass the xla_model module into register_deprecated. That was the only way I could find to print the qualified name of the aliased method consistently, but feel free to refactor it if you find a better way.

I think we need to add a unit test here to confirm that the right warning is getting printed, since you're going to be adding a lot of these aliases and this is an easy mistake to make. You can create a new unit test file for all of these public to internal aliases and we can remove it later.

zpcore (Collaborator, Author) replied:

This should work:

import torch_xla._internal.utils as _utils
from . import xla_model as this_module
parse_xla_device = deprecated(this_module, _utils.parse_xla_device)

will-cromar (Collaborator) left a comment

Thanks!

@zpcore zpcore merged commit 13b4ddc into master Jul 18, 2024
23 checks passed
@zpcore zpcore deleted the piz/usability_1 branch July 18, 2024 16:19
Successfully merging this pull request may close these issues.

[API Usability] Internalize parse_xla_device