[FEATURE] Support for PyTorch-XLA 2.0 #1938
Thanks for your wonderful work!

Given that PyTorch-XLA 2.1 has just been released, with many new features added compared to PyTorch-XLA 1.x, I wonder if you have any plans to integrate those features into this codebase. These useful features include `torch.compile`, the PJRT runtime, AMP, and so on. From the user's end, `torch.compile` may be no different on GPU and TPU.

Comments

@zw615 there were lots of nice updates to XLA on the horizon when I was still using it regularly via the … . I know PT XLA can be used on GPU, but at least prior to 2.x it wasn't as good for GPU as PyTorch eager (especially with compile), so it didn't make much sense to try to support it without developing on TPUs.

Sigh. In that case, it indeed does not make sense to support pytorch-xla 2.0. I can see the last commit to the …