
this is great! any bias in the sampling considerations? #3

Open
aldopareja opened this issue Apr 13, 2024 · 1 comment

Comments

@aldopareja

Just wondering whether packing in this way could introduce a bias in the aggregated batches (across all GPUs) compared to simple random sampling. Ideally, the probability of any given sample appearing in a batch should be the same as under random sampling, but I can't quite work out the math behind it.

@imoneoi
Owner

imoneoi commented Aug 10, 2024

Multipack should not have any bias compared to random sampling. The algorithm is equivalent to randomly shuffling the data, then retrieving samples one by one from the shuffled data and packing them into the batch until it is full.

As long as the order inside a batch doesn't matter, Multipack is equivalent to random sampling, with the one notable exception that the batch size is dynamic.
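For intuition, here is a minimal sketch of that equivalent shuffle-then-pack process (illustrative only, not the repository's actual implementation; names such as `pack_batches` and `batch_max_tokens` are hypothetical):

```python
import random

def pack_batches(lengths, batch_max_tokens, seed=0):
    """Shuffle sample indices, then pack them in shuffled order into
    batches until each batch's token budget is exhausted."""
    indices = list(range(len(lengths)))
    random.Random(seed).shuffle(indices)

    batches, current, current_tokens = [], [], 0
    for idx in indices:
        # If the next sample would overflow the budget, close this batch.
        if current and current_tokens + lengths[idx] > batch_max_tokens:
            batches.append(current)
            current, current_tokens = [], 0
        current.append(idx)
        current_tokens += lengths[idx]
    if current:
        batches.append(current)
    return batches

# Every permutation of the data is equally likely after the shuffle, so each
# sample is equally likely to land at any position in the stream; only the
# number of samples per batch varies with the lengths that happen to be drawn.
lengths = [random.randint(16, 512) for _ in range(1000)]
batches = pack_batches(lengths, batch_max_tokens=2048, seed=42)
print(len(batches), [len(b) for b in batches[:3]])
```

In this sketch the inclusion probabilities match random sampling because batch boundaries depend only on the shuffled order, not on any property of individual samples; the trade-off is that the per-batch sample count is dynamic.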
