chore: knn tuto #333
Conversation
Looks pretty good! But I'd still like to see some graphs for rounding bits vs accuracy first.
- "b. Simulation: inference on non-encrypted quantized data, while simulating all FHE operations, failure probabilities and crypto-parameters. This mode of inference is recommended in the deployment phase. For further information, please consult this link"
I think you mean "development" not "deployment"
- You mention "accuracy loss" in the last line of the notebook, but there is actually a 5% gain. You could say "accuracy difference".
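To make the rounding-bits vs. accuracy trade-off discussed above more concrete, here is a minimal pure-Python sketch of keeping only the top `rounding_bits` bits of an integer accumulator. The function name and the round-to-nearest behavior are illustrative assumptions, not the Concrete ML implementation:

```python
def round_to_msbs(value: int, bit_width: int, rounding_bits: int) -> int:
    """Keep the `rounding_bits` most significant bits of a `bit_width`-bit value,
    rounding to nearest on the dropped least significant bits."""
    lsbs = bit_width - rounding_bits
    if lsbs <= 0:
        return value  # nothing to drop
    half = 1 << (lsbs - 1)
    return ((value + half) >> lsbs) << lsbs

# Example: 9-bit accumulators rounded down to 5 bits of precision.
values = [137, 300, 511]
rounded = [round_to_msbs(v, bit_width=9, rounding_bits=5) for v in values]
print(rounded)  # -> [144, 304, 512]
```

Plotting the error `|v - round_to_msbs(v, 9, r)|` for several `r` values would give the rounding-bits vs. accuracy graphs requested here.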
looks good to me, thanks!
Good job on this!
In this PR, I added a new tutorial for KNN.
As you know, this model is really restrictive: we are not able to exceed 9 bits in the circuit without waiting > 1h for keygen and facing unstable execution with the compiler.
To apply the rounded features, we need to:
I want your advice to know where rounding_threshold_bits should be set for the user:
- natural place for arguments
- easy to add
- but knn won't follow other CML patterns
- which seems more appropriate.
- The way I see it, the auto-rounder object will be instantiated in this function. But it won't work if we use fhe="disable", unless I add a condition to skip the auto-rounder. The bigger problem is that we won't get the same results with and without the auto-rounder.
(for Q4, we will try to add this feature to our built-in models, so it's like an advanced conversion)
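The conditional skip described above could look roughly like this. `AutoRounder` and `predict_distances` here are hypothetical stand-ins for illustration, not the concrete-python API:

```python
class AutoRounder:
    """Stand-in for the auto-rounder: truncates accumulators to a target bit width."""

    def __init__(self, target_bits: int):
        self.target_bits = target_bits

    def apply(self, acc: int, bit_width: int) -> int:
        # Drop the least significant bits beyond the target precision.
        lsbs = max(bit_width - self.target_bits, 0)
        return (acc >> lsbs) << lsbs


def predict_distances(accs, bit_width: int, fhe: str, rounder: AutoRounder):
    # When fhe="disable" we run in the clear, so the rounder is skipped entirely;
    # note that the results then differ from the rounded FHE/simulation path.
    if fhe == "disable":
        return list(accs)
    return [rounder.apply(a, bit_width) for a in accs]


rounder = AutoRounder(target_bits=5)
print(predict_distances([100, 200], 9, "disable", rounder))   # -> [100, 200]
print(predict_distances([100, 200], 9, "simulate", rounder))  # -> [96, 192]
```

This makes visible why the two paths diverge: the clear path never loses the low bits that the rounded path discards.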
CC @zama-ai/machine-learning
We agreed to hide the parameter rounding_threshold_bits from the user and add a default value in base.py. I suggest the value 5; you can see the plots in the branch 'testing/impact_of_rouning_on_knn' in the use_case_example folder.
Update:
<!> The tests where we do not compile before the prediction, typically for fhe=disable, are crashing, because we use the autorounder and we haven't compiled the model!

New update:
I used the adjust method instead of the compilation to use the autorounder feature.
The concrete-python autorounder must become a serializable object!
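One way to serialize the rounder state, assuming it can be reduced to a small dict of integers (the field names below are made up for illustration, not the concrete-python layout):

```python
import json


def save_rounder(state: dict, path: str) -> None:
    """Persist the rounder's state as JSON so deployment can reload it."""
    with open(path, "w") as f:
        json.dump(state, f)


def load_rounder(path: str) -> dict:
    """Reload a previously saved rounder state."""
    with open(path) as f:
        return json.load(f)


# Round-trip a hypothetical state: target precision and number of dropped LSBs.
state = {"target_bits": 5, "lsbs_to_remove": 4}
save_rounder(state, "rounder.json")
assert load_rounder("rounder.json") == state
```

A JSON round-trip like this would let the adjusted rounder travel with the compiled model artifacts instead of being rebuilt at load time.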