feat: add multi-output support #396
Conversation
Looks good! I only have a few open questions remaining, but it would be fine for me if we merge it like this. Again, thanks a lot for this integration!
@@ -1116,9 +1115,11 @@ def quantize_input(self, X: numpy.ndarray) -> numpy.ndarray:
         assert isinstance(q_X, numpy.ndarray)
         return q_X

-    def dequantize_output(self, q_y_preds: numpy.ndarray) -> numpy.ndarray:
+    def dequantize_output(self, *q_y_preds: numpy.ndarray) -> numpy.ndarray:
Just realized we have a signature change here: dequantize_output is a method common to all built-in models, which means it 'inherits' from the BaseEstimator method, which has the following signature / docstring:
@abstractmethod
def dequantize_output(self, q_y_preds: numpy.ndarray) -> numpy.ndarray:
    """De-quantize the output.

    This step ensures that the fit method has been called.

    Args:
        q_y_preds (numpy.ndarray): The quantized output values to de-quantize.

    Returns:
        numpy.ndarray: The de-quantized output values.
    """
What should we do here, then? I feel like this signature change is necessary (I guess QNNs can be multi-output, right?), but dequantize_output for linear / tree models should stay as it is right now.
I'm actually surprised mypy did not complain here; I thought it would point out that the method's signature has changed.
So yes, not sure what could be done here. It probably looks fine to just keep it like this, but maybe add a new docstring to better indicate that the signature is a bit different?
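For illustration, here is a minimal sketch of what such an updated docstring could look like on the variadic override. This is not the PR's actual wording, and the class name is invented; it only shows how the multi-output case could be documented:

```python
import numpy


class BuiltInModel:  # hypothetical class name, just for the docstring sketch
    def dequantize_output(self, *q_y_preds: numpy.ndarray) -> numpy.ndarray:
        """De-quantize one or several quantized output arrays.

        This step ensures that the fit method has been called.

        Args:
            *q_y_preds (numpy.ndarray): The quantized output values to de-quantize.
                Single-output models pass a single array, as before; multi-output
                models (e.g. multi-output QNNs) pass one array per output.

        Returns:
            numpy.ndarray: The de-quantized output values.
        """
        raise NotImplementedError  # body omitted, only the docstring is sketched
```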
So actually, if I understand correctly, def f(arg) and def f(*arg) aren't incompatible, since one is just a superset of the other. I am not sure if it makes sense to change it for built-in models too, but it's a relevant question indeed.
If def f(*arg) is properly considered a superset of def f(arg) in signatures, then I think it's fine!
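To make the "superset" point concrete, here is a small standalone sketch (names are invented, this is not concrete-ml code). It mirrors the pattern in the PR: the base class declares a single-array method and the subclass overrides it with a variadic parameter, which, as noted in the thread, mypy did not flag:

```python
from abc import ABC, abstractmethod

import numpy


class Base(ABC):
    @abstractmethod
    def dequantize_output(self, q_y_preds: numpy.ndarray) -> numpy.ndarray:
        """De-quantize a single output array."""


class MultiOutputModel(Base):
    # The variadic parameter accepts every positional call the base method
    # accepts (one array) plus the multi-output case (several arrays), which is
    # the "superset" argument above. One caveat: a keyword call such as
    # dequantize_output(q_y_preds=...) no longer works, since a var-positional
    # parameter cannot be passed by name.
    def dequantize_output(self, *q_y_preds: numpy.ndarray) -> numpy.ndarray:
        outputs = [q * 1.0 for q in q_y_preds]  # placeholder de-quantization
        return outputs[0] if len(outputs) == 1 else numpy.stack(outputs)


model = MultiOutputModel()
print(model.dequantize_output(numpy.zeros(3)))                 # single output
print(model.dequantize_output(numpy.zeros(3), numpy.ones(3)))  # multi-output
```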
@@ -34,7 +34,7 @@ readme = "README.md"
# Investigate if it is better to fix specific versions or use lower and upper bounds
# FIXME: https://github.com/zama-ai/concrete-ml-internal/issues/2665
python = ">=3.8.1,<3.11"
concrete-python = "2.5.0-rc1"
#concrete-python = "2023.11.5"
I think that's the main issue with nightly builds: I believe we don't want them to be public, only rc versions should be.
Thanks, all is good for me!
for (_, layer) in self.quant_layers_dict.values():
    layer.debug_value_tracker = None
return result, debug_value_tracker

# De-quantize the output predicted values
y_pred = self.dequantize_output(*to_tuple(q_y_pred))
You can keep the debug values here as well; the computation is still done on integers.
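For context on the `*to_tuple(q_y_pred)` unpacking in the hunk above, here is a hedged, standalone sketch of the pattern. The `to_tuple` below is a simplified stand-in and may differ from the project's actual helper:

```python
from typing import Any, Tuple

import numpy


def to_tuple(x: Any) -> Tuple[Any, ...]:
    """Wrap a single value in a tuple; convert tuples and lists to tuples."""
    return tuple(x) if isinstance(x, (tuple, list)) else (x,)


# Single-output models return one quantized array, multi-output models a tuple
# of arrays; unpacking to_tuple(...) lets dequantize_output(*...) handle both.
q_single = numpy.array([1, 2, 3])
q_multi = (numpy.array([1, 2]), numpy.array([3, 4]))

assert len(to_tuple(q_single)) == 1
assert len(to_tuple(q_multi)) == 2
```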
Sorry, but if we add new flaky marks to pytest, it's better to create an issue and add a FIXME along with these marks!
Indeed, my bad. I'll create the issue and add the FIXME.
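As a sketch of the convention being asked for (assuming the repository registers a `flaky` pytest marker; the issue URL below is a placeholder, not a real one):

```python
import pytest


# FIXME: https://github.com/zama-ai/concrete-ml-internal/issues/XXXX
# Remove the flaky mark once the underlying instability is resolved.
@pytest.mark.flaky
def test_multi_output_predictions_are_stable():
    assert True  # placeholder for the actual flaky test body
```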
Thanks a lot for this!
Coverage passed ✅
Thanks!
closes https://github.com/zama-ai/concrete-ml-internal/issues/4139