Hyperoptim #26
base: master
Conversation
Afaics, this MR provides an extensive implementation to run hyperparameter optimization (HPO) for TVAE on the bars test, using free energies as the HPO objective. Since significant parts of the code are tailored to this application, i.e., to a concrete model and a concrete HPO objective (e.g., in neural_models.py, test_neural_models.py, workers.py; also see comments), my suggestion would be to make this repository part of the examples. Future refactoring may aim at bundling all methods that are generically applicable to different models and experiments and making them part of tvo.utils.
from .tvae import GaussianTVAE, BernoulliTVAE
from .gmm import GMM
from .pmm import PMM
from .sssc import SSSC
Why is this required?
Not required.
# infer hyperparameter status
self.is_hyperparameter_S = not (S)
self.is_hyperparameter_H = not (H)
self.is_hyperparameter_EEM = False
EEM → EVO, also in several places below.
Thanks
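As a side note on the excerpt above: the truthiness check `not (S)` would also flag a legitimate value of `S = 0` as "not provided", whereas an explicit `is None` test only marks genuinely omitted arguments as hyperparameters. A minimal sketch of that variant (argument names `S` and `H` are taken from the diff; the class itself is hypothetical):

```python
class HyperparameterSpec:
    """Marks constructor arguments left as None as hyperparameters to optimize."""

    def __init__(self, S=None, H=None):
        # `x is None` distinguishes "not provided" from falsy values such as 0,
        # which `not (x)` would misclassify as a hyperparameter
        self.is_hyperparameter_S = S is None
        self.is_hyperparameter_H = H is None
        self.S, self.H = S, H
```

With this check, `HyperparameterSpec(S=0, H=8)` treats both arguments as fixed values, while `HyperparameterSpec()` marks both as hyperparameters.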
import os

class ValidFreeEnergy:
For a general hyperparameter optimization framework, it may be desirable to provide an interface that allows implementing different optimization metrics (including free energies) as generically as possible.
Agreed.
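One way such a metric interface could look: a small abstract base class that each objective (free energy, accuracy, ...) implements, so that workers depend only on the interface and not on any concrete model. This is a hedged sketch, not part of the MR; all class and function names here are hypothetical:

```python
from abc import ABC, abstractmethod


class OptimizationMetric(ABC):
    """Generic HPO objective; workers minimize the returned loss."""

    @abstractmethod
    def loss(self, model, data) -> float:
        """Return a scalar to minimize for `model` on `data`."""


class NegativeFreeEnergy(OptimizationMetric):
    """Wraps any free-energy function as a minimization objective."""

    def __init__(self, free_energy_fn):
        # free_energy_fn(model, data) -> float; injected so the metric
        # stays independent of any concrete model class
        self._free_energy_fn = free_energy_fn

    def loss(self, model, data):
        # free energy is maximized, so negate it for minimization
        return -self._free_energy_fn(model, data)
```

A worker written against `OptimizationMetric` would then work unchanged with free energies, held-out likelihoods, or classification accuracy.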
return {
    "loss": -train_F if -train_F else np.nan,
    "info": {
        "test accuracy": test_F,
        "train accuracy": train_F,
        "validation accuracy": valid_F,
        "number of parameters": model._external_model.number_of_parameters(),
    },
}
Do I understand correctly that this implies a dependency on a particular model and optimization metric? (similar comment above)
This is for a particular worker. Individual models require individual workers.
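Even if individual models need individual workers, the model-independent parts (assembling the loss/info result dict, guarding against failed runs) could live in one shared base class, with each subclass supplying only the model evaluation. A sketch under the assumption that workers return a loss/info dictionary as in the excerpt above; all names are hypothetical:

```python
import math


class BaseHPOWorker:
    """Shared result assembly for HPO workers; subclasses evaluate the model."""

    def evaluate(self, config):
        """Return (train_F, valid_F, test_F) for one configuration.

        Model-specific; must be overridden by each concrete worker.
        """
        raise NotImplementedError

    def compute(self, config, **kwargs):
        train_F, valid_F, test_F = self.evaluate(config)
        loss = -train_F
        if math.isnan(loss):
            # penalize failed runs instead of propagating NaN to the optimizer
            loss = float("inf")
        return {
            "loss": loss,
            "info": {
                "train free energy": train_F,
                "validation free energy": valid_F,
                "test free energy": test_F,
            },
        }


class ConstantWorker(BaseHPOWorker):
    """Toy subclass: pretends every configuration reaches the same free energies."""

    def evaluate(self, config):
        return 1.0, 0.9, 0.8
```

Under this split, "individual workers per model" reduces to one short `evaluate` override per model, while the result format stays uniform across experiments.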
Add hyperoptimization capabilities