
Hyperoptim #26 (Open)

mknull wants to merge 30 commits into base: master.
Conversation

@mknull (Contributor) commented Jun 27, 2022

Add hyperoptimization capabilities

@mknull requested review from @jdrefs and @eguiraud on Jun 30, 2022
@mknull self-assigned this on Jun 30, 2022
@jdrefs (Member) left a comment:

Afaics, this MR provides an extensive implementation to run hyperparameter optimization (HPO) for TVAE on the bars test, using free energies as the HPO objective. As significant parts of the code are tailored to this application, i.e., to a concrete model and a concrete HPO objective (e.g., in neural_models.py, test_neural_models.py, workers.py; also see comments), my suggestion would be to make this repository part of the examples. Future refactoring may aim at bundling all methods that are generically applicable to different models and experiments and making them part of tvo.utils.

from .tvae import GaussianTVAE, BernoulliTVAE
from .gmm import GMM
from .pmm import PMM
from .sssc import SSSC
@jdrefs (Member):

Why is this required?

@mknull (Contributor, Author):

Not required.

# infer hyperparameter status
self.is_hyperparameter_S = not (S)
self.is_hyperparameter_H = not (H)
self.is_hyperparameter_EEM = False
@jdrefs (Member):

EEM->EVO, also in several places below

@mknull (Contributor, Author):

Thanks

hyperoptimization/explore.py — resolved comment
import os


class ValidFreeEnergy:
@jdrefs (Member):

For a general hyperparameter optimization framework, it may be desirable to provide an interface that allows different optimization metrics (including free energies) to be implemented as generically as possible.

@mknull (Contributor, Author):

Agreed.
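For illustration, such an interface could be as small as the following sketch. This is hypothetical and not part of this PR; the names (Metric, NegativeFreeEnergy, model.free_energy) are assumptions about how a concrete model might expose its free energy.

# Hypothetical sketch: a generic HPO-objective interface, so that free
# energies become one Metric among others rather than being hard-coded.
from abc import ABC, abstractmethod


class Metric(ABC):
    """Maps a trained model and a dataset to a scalar; smaller is better."""

    @abstractmethod
    def __call__(self, model, data) -> float:
        ...


class NegativeFreeEnergy(Metric):
    """Negates the free energy so that maximizing F minimizes the loss."""

    def __call__(self, model, data) -> float:
        # model.free_energy(data) is a placeholder for however the
        # concrete model exposes its free energy.
        return -model.free_energy(data)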

hyperoptimization/run_best_config.py — 5 resolved comments
Comment on lines +194 to +202
return {
"loss": -train_F if -train_F else np.nan,
"info": {
"test accuracy": test_F,
"train accuracy": train_F,
"validation accuracy": valid_F,
"number of parameters": model._external_model.number_of_parameters(),
},
}
@jdrefs (Member):

Do I understand correctly that this implies a dependency on a particular model and optimization metric? (similar comment above)

@mknull (Contributor, Author):

This is for a particular worker. Individual models require individual workers.
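As an illustration of how the worker-per-model coupling could be loosened, the sketch below parameterizes a single worker over a model factory and a Metric (reusing the hypothetical interface sketched above). All names here are assumptions, not part of this PR.

# Hypothetical sketch: one generic worker instead of one worker per model.
import numpy as np


class GenericWorker:
    def __init__(self, build_model, metric):
        self.build_model = build_model  # maps an HPO config to a trained model
        self.metric = metric            # a Metric instance, as sketched above

    def compute(self, config, train_data, valid_data, test_data):
        model = self.build_model(config)
        train_loss = self.metric(model, train_data)
        # Guard against a missing value explicitly; a truthiness check such
        # as `-train_F if -train_F else np.nan` would also turn a legitimate
        # objective value of 0.0 into NaN.
        loss = np.nan if train_loss is None else train_loss
        return {
            "loss": loss,
            "info": {
                "train": train_loss,
                "validation": self.metric(model, valid_data),
                "test": self.metric(model, test_data),
            },
        }

Note that this keeps the structure of the return dictionary from the snippet above, but reports the objective under generic names rather than "accuracy", which in the quoted code actually holds free energies.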

@eguiraud removed their request for review on Jun 6, 2023