Describe the bug
If different batch sizes are used, the predictions will be subtly different. This might be caused by PyTorch. In the example below I turned off `tta_transforms` to make sure that was not the cause.
To Reproduce
Code snippet allowing reproducing the behaviour:
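The original snippet is not preserved in this report; below is a minimal sketch of the reproduction described, assuming a hypothetical trained checkpoint, a hypothetical test image path, and that `CAREamist.predict` accepts `batch_size`, `tile_size`, and `tta_transforms` keyword arguments (the exact parameter names may differ between careamics versions).

```python
import numpy as np
import matplotlib.pyplot as plt
from careamics import CAREamist

# Placeholder paths; substitute a real trained checkpoint and test image.
careamist = CAREamist(source="checkpoints/last.ckpt")

common = dict(
    source="data/test_image.tif",  # same input for both predictions
    tile_size=(128, 128),          # tiled prediction, as in the report
    tta_transforms=False,          # TTA disabled to rule it out as the cause
)

# Identical settings except for the batch size.
pred_bs1 = np.squeeze(np.asarray(careamist.predict(batch_size=1, **common)))
pred_bs4 = np.squeeze(np.asarray(careamist.predict(batch_size=4, **common)))

# Plot the absolute per-pixel difference between the two predictions.
diff = np.abs(pred_bs1 - pred_bs4)
plt.imshow(diff)
plt.colorbar(label="|prediction(bs=1) - prediction(bs=4)|")
plt.savefig("batch_size_diff.png")
print("max abs difference:", diff.max())
```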
This produces the image:
As you can see, the last tile produces the same output in both runs; this is because, for this tile, the batch size ends up equal to 1 for both predictions.
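To illustrate why this could come from PyTorch rather than careamics itself, here is a small, self-contained sketch (not the model from this report) showing that forwarding the same tiles with batch size 1 versus a larger batch can yield slightly different floating-point results, since the backend may select different kernels depending on the batch dimension.

```python
import torch
import torch.nn as nn

# Tiny stand-in network; any convolutional model will do for this check.
torch.manual_seed(0)
net = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)

device = "cuda" if torch.cuda.is_available() else "cpu"
net = net.to(device).eval()
tiles = torch.randn(4, 1, 64, 64, device=device)

with torch.no_grad():
    batched = net(tiles)                                             # one pass, batch size 4
    one_by_one = torch.cat([net(tiles[i:i + 1]) for i in range(4)])  # four passes, batch size 1

# Backends may pick different kernels depending on the batch size, so the two
# results can differ by small floating-point amounts, especially on GPU.
print("max abs difference:", (batched - one_by_one).abs().max().item())
```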
Additional context
I added this test at one point but it is skipped for now.
careamics/tests/test_careamist.py, lines 588 to 623 at 0a29ea2