Add tests for numpy, tensorflow, torch, and mxnet #106

Closed

Conversation

tirthasheshpatel
Contributor

This PR adds basic tests for some Python packages that support DLPack. Currently, it only tests CPU arrays. Some observations:

  • TensorFlow doesn't provide a from_dlpack function, and neither TensorFlow nor MXNet has the __dlpack__ and __dlpack_device__ dunder methods. These need to be updated.
  • TensorFlow fails to import some NumPy arrays (most probably a bug in the deletion mechanism), so those tests have been skipped for now.

More tests can be added in follow-up PRs. These tests are triggered for each commit on main and on PRs. Should we instead schedule them to run weekly or monthly? No GPU is available on the CI, so I am not sure we can test libraries like CuPy there. Is it possible to emulate a GPU or other devices in the CI?
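
A minimal sketch of the kind of CPU round-trip test added here (illustrative only, not the exact test code; it assumes numpy >= 1.22 and torch >= 1.10 so that both expose from_dlpack and the __dlpack__ dunder):

```python
import numpy as np
import torch


def test_numpy_torch_roundtrip():
    x = np.arange(12, dtype=np.float32).reshape(3, 4)

    # Consume the NumPy array through the DLPack protocol.
    t = torch.from_dlpack(x)
    np.testing.assert_array_equal(t.numpy(), x)

    # Convert back to NumPy and check the values survive the round trip.
    y = np.from_dlpack(t)
    np.testing.assert_array_equal(y, x)
```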

@mattip

mattip commented May 9, 2022

It would be nice to get some eyes on this. Once there are tests for the current behaviour using different libraries, it should be easier to judge the impact of the changes in #104.

@leofang
Contributor

leofang commented May 9, 2022

I am wondering about this: if we include this test set in the array API tests (@asmeurer), maybe each library can run it in its own CI so we don't have to worry about GPU access? (cc: @kmaehashi)

@asmeurer

asmeurer commented May 9, 2022

I'm not sure how the DLPack test suite should work in relation to the array API test suite. My initial thought is that it makes sense for DLPack to have its own test suite, since DLPack is independent of the array API. I believe the array API test suite currently has no DLPack tests beyond basic checks that the methods exist and run without erroring. If DLPack itself is tested extensively by its own test suite, then perhaps the array API test suite can just defer to that, or pull it in somehow.
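
For context, those existing checks are roughly of this form (an illustrative sketch, not the actual array-api-tests code, which parametrizes over whichever array namespace is under test):

```python
import numpy as np


def test_dlpack_smoke():
    x = np.arange(5)
    assert hasattr(x, "__dlpack__")
    assert hasattr(x, "__dlpack_device__")
    x.__dlpack_device__()  # should not raise
    np.from_dlpack(x)      # consuming the exported capsule should not raise either
```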

CC @honno @rgommers @kgryte

@leofang
Contributor

leofang commented May 9, 2022

I am thinking the tests from this PR (and future PRs) should have been added to each library's own test suite when they claimed DLPack support. Without a universal test in this repo that can target every library, I worry that we're just reinventing the wheel here... But there's no way to write a universal test without an agreed-upon interface, and the array API could serve as a good proxy.

@tirthasheshpatel
Contributor Author

To add to @leofang's comment: for a given library (e.g. NumPy), it is not easy to test DLPack support against other tensor libraries (e.g. PyTorch), since doing so would add extra dependencies to the project and increase CI time.

@leofang
Contributor

leofang commented May 9, 2022

Sorry, I didn't read Tirth's code until now. This PR essentially proposes an N-to-N conversion test across N libraries. The test suite would be hard to maintain unless (1) we use the array API to stay backend-agnostic, or (2) we have automation to auto-generate the test scripts.
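
As a rough illustration of option 2, the N-to-N matrix could be generated from a small per-library registry instead of hand-written pairings. The registry and adapter callables below are assumptions made for the sketch, not an agreed-upon interface, and only libraries with an object-accepting from_dlpack are listed:

```python
import numpy as np
import pytest
import torch

# producer: build a CPU array; consumer: import an object exposing __dlpack__.
LIBRARIES = {
    "numpy": (lambda data: np.asarray(data), np.from_dlpack),
    "torch": (lambda data: torch.tensor(data), torch.from_dlpack),
    # tensorflow/mxnet would need capsule-based adapters instead.
}


@pytest.mark.parametrize("producer", LIBRARIES)
@pytest.mark.parametrize("consumer", LIBRARIES)
def test_roundtrip(producer, consumer):
    make, from_dlpack = LIBRARIES[producer][0], LIBRARIES[consumer][1]
    src = make([1.0, 2.0, 3.0])
    dst = from_dlpack(src)
    np.testing.assert_array_equal(np.asarray(dst), np.asarray(src))
```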

@mattip

mattip commented May 10, 2022

Perhaps in the future, if N becomes large, it might be hard to maintain the tests. For now, with N = 4 there are ~16 tests. Each library has a different pattern for using DLPack, so it is not easy to see how to automate it today. It would be good to put this in now, and automate it when N grows larger and the use of DLPack becomes more standardized.
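
To illustrate those differences (roughly as the APIs stood in mid-2022; treat the exact entry points and versions as an assumption):

```python
import numpy as np
import torch
import tensorflow as tf
import mxnet as mx

x = np.arange(6, dtype=np.float32)

# NumPy and PyTorch consume any object that implements __dlpack__ directly.
t = torch.from_dlpack(x)
y = np.from_dlpack(t)

# TensorFlow and MXNet only accept a raw PyCapsule, so the producer's
# __dlpack__() (or a to_dlpack helper) has to be called explicitly.
tf_t = tf.experimental.dlpack.from_dlpack(x.__dlpack__())
mx_t = mx.nd.from_dlpack(torch.utils.dlpack.to_dlpack(t))
```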

@tirthasheshpatel
Contributor Author

@tqchen Any thoughts on this?
