chore: add test_reduce_sum configuration as flaky #589

Closed
2 changes: 1 addition & 1 deletion src/concrete/ml/pytest/torch_models.py
@@ -1097,7 +1097,7 @@ def forward(self, x):
         # This is only tested in weekly CIs because compiling the circuits make the tests too long
         # to execute (which is why it is not included in the regular coverage)
         if self.with_pbs:
-            torch_sum = torch_sum + torch_sum % 2 - torch_sum % 2  # pragma: no cover
+            torch_sum = torch_sum + torch_sum % 2 - torch_sum % 2

         return torch_sum
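The `torch_sum + torch_sum % 2 - torch_sum % 2` expression in this hunk is mathematically a no-op: the modulo term is added and immediately subtracted. Its purpose is to force table-lookup operations (PBS) into the compiled FHE circuit, which is what the weekly CI exercises. A minimal plain-Python sketch of the identity (the helper name is hypothetical):

```python
def forced_pbs_identity(x: int) -> int:
    # Mathematically this returns x unchanged: x % 2 is added and then
    # subtracted. When compiled by Concrete ML, however, the modulo is a
    # non-linear operation on encrypted values, so the resulting circuit
    # must contain table lookups (PBS).
    return x + x % 2 - x % 2

# The identity holds for any integer, including negatives
# (Python's % always returns a non-negative result for modulus 2).
assert all(forced_pbs_identity(v) == v for v in range(-8, 9))
```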
7 changes: 3 additions & 4 deletions tests/torch/test_reduce_sum.py
@@ -9,6 +9,9 @@
 from concrete.ml.torch.compile import compile_torch_model
 
 
+# This test is a known flaky test
+# FIXME: https://github.com/zama-ai/concrete-ml-internal/issues/4357
+@pytest.mark.flaky
 @pytest.mark.parametrize(
     "data_generator",
     [
@@ -65,13 +68,9 @@ def test_sum(
     check_circuit_has_no_tlu,
     check_circuit_precision,
     check_r2_score,
-    is_weekly_option,
 ):
     """Tests ReduceSum ONNX operator on a torch model."""
 
-    if with_pbs and not is_weekly_option:
-        pytest.skip("Tests on model with some PBS take too long for regular CIs")
-
     # Generate the input-set with several samples. This adds a necessary batch size
     inputset = data_generator(size=(100,) + size)