
Discrete search space support #8

Open
CuriousSingularity opened this issue Aug 19, 2020 · 13 comments
Labels
enhancement New feature or request

Comments

@CuriousSingularity

CuriousSingularity commented Aug 19, 2020

Hi, I really admire your work; it is very useful for me.
I would like to know whether the library supports multi-objective optimization when the search space is discrete, i.e., the values x passed to the objective function should be discrete (integers).
Any comment on this would be helpful.

Thank you.

@ppgaluzio ppgaluzio added the enhancement New feature or request label Aug 20, 2020
@ppgaluzio
Owner

Hello, thank you.

The package currently does not support discrete variables; however, I believe it could be implemented with minor changes. There are approaches that involve only modifying the objective function, which could be enough in some cases. If better convergence is necessary, a filter function could be passed to the optimizer and applied to the input of the predict method of the GP in the _ObjectiveGP method in the optimizer. This reference describes the method.
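A minimal sketch of the objective-function approach (the wrapper and the example function below are hypothetical, not part of MOBOpt): the optimizer keeps exploring a continuous box, while every candidate point is rounded to integers before the objective is evaluated.

```python
import numpy as np

def discrete_objective(continuous_objective):
    """Wrap a multi-objective function so every input is rounded
    to the nearest integer before evaluation."""
    def wrapped(x):
        x_disc = np.round(np.asarray(x)).astype(int)
        return continuous_objective(x_disc)
    return wrapped

# Example: two toy objectives over integer inputs.
def f(x):
    return np.array([np.sum(x ** 2), np.sum((x - 2) ** 2)])

obj = discrete_objective(f)
print(obj([0.6, 1.4]))  # evaluates f at the rounded point [1, 1]
```

Since the GP still models a continuous surface, many distinct continuous points collapse onto the same integer evaluation, which is exactly why convergence can degrade without a kernel-side fix.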

That is something I have been planning to do; I have just been very busy with other projects.

@CuriousSingularity
Author

CuriousSingularity commented Aug 20, 2020

Hello @ppgaluzio, you're right!
Actually, we can handle it in the objective function itself. Normalizing the bounds and then making the values discrete gave me good results. But sometimes, as you mentioned, better convergence is needed; once in a while it doesn't converge very well.
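The normalize-then-discretize idea can be sketched like this (a generic helper, not MOBOpt code): the optimizer works on a normalized [0, 1] variable, which is mapped to an integer grid so that every integer receives an equal-width slice of the interval.

```python
import numpy as np

def to_grid(u, low, high):
    """Map a normalized value u in [0, 1] to an integer in [low, high],
    giving each integer an equal-width slice of the unit interval."""
    u = np.clip(u, 0.0, 1.0)
    idx = (u * (high - low + 1)).astype(int)
    return low + np.minimum(idx, high - low)  # u == 1.0 maps to high

print(to_grid(np.array([0.0, 0.49, 0.99]), 0, 9))  # → [0 4 9]
```

Plain rounding instead gives the two boundary integers only half the selection probability of the interior ones, which this equal-slice mapping avoids.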

I would be looking forward to your changes with this enhancement. 👍

@Stephocke

Stephocke commented Nov 7, 2020

Hello,

do you think one can proceed in the same way with categorical variables? That is, use one continuous variable on [n, n+1], where n is the number of categories, and cut off the fractional part within the objective function?

Or would it be better to use one-hot encoded data, i.e., n-1 continuous variables on [0, 1], and utilize constraints?

Kind regards

@ppgaluzio
Owner

ppgaluzio commented Nov 9, 2020

I haven't tested categorical variables myself; however, I think this reference presents a pretty good discussion of the topic and how to implement it in a Bayesian context. My intention is to eventually implement that in the code, but I haven't had the time to do it yet, as I am working on a different project right now.

The approach is relatively simple and is based on the idea of a one-hot encoding, but applying the transformation not only inside the objective function but also when passing the input values as arguments to the kernel function. The problem with the first approach you proposed (one continuous variable with the fractional part cut off) is that it introduces a notion of order and distance on the categorical data that doesn't really make sense and ends up being rather arbitrary.
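The kernel-side transformation can be sketched as follows. This is a minimal NumPy illustration of the idea (the function names and the RBF choice are my own, not MOBOpt's API): categorical columns are one-hot encoded before the covariance is computed, so all distinct categories end up equally far apart instead of artificially ordered.

```python
import numpy as np

def one_hot(col, n_cat):
    """One-hot encode an integer-valued column with categories 0..n_cat-1."""
    out = np.zeros((len(col), n_cat))
    out[np.arange(len(col)), col.astype(int)] = 1.0
    return out

def transform(X, cat_dims, n_cat):
    """Replace each categorical column of X by its one-hot block;
    continuous columns pass through unchanged."""
    cols = []
    for j in range(X.shape[1]):
        if j in cat_dims:
            cols.append(one_hot(np.round(X[:, j]), n_cat[j]))
        else:
            cols.append(X[:, j:j + 1])
    return np.hstack(cols)

def rbf_categorical(X1, X2, cat_dims, n_cat, length_scale=1.0):
    """RBF kernel on the transformed inputs: every pair of distinct
    categories has the same distance, so no order is imposed."""
    T1 = transform(X1, cat_dims, n_cat)
    T2 = transform(X2, cat_dims, n_cat)
    d2 = ((T1[:, None, :] - T2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)
```

With this kernel, the covariance between category 0 and category 2 equals the covariance between category 0 and category 1, which is precisely what the fractional-cutoff encoding fails to guarantee.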

Regards

@Stephocke

Thank you for your feedback. Unfortunately, I have no access to the paper; however, I found an older version on ResearchGate.
I hope no significant changes were made during the review process :)


@Stephocke

Stephocke commented Nov 25, 2020

Hello again,

I think I got it now, so I tried to write my own kernel class and pass it as an argument. However, I get the error `__init__() got an unexpected keyword argument 'kernel'`.
So I looked it up and checked the installed files, and the kernel argument is actually missing. I then tried to reinstall the package using the GitHub master link; unfortunately, this had no effect. Eventually, I successfully replaced the code of `__bayes.py` with the source code from GitHub. What am I doing wrong when installing MOBOpt?

@Stephocke

Stephocke commented Nov 28, 2020

I successfully applied MOBOpt to a discrete search space. However, in this case different points in the search space can correspond to the same objective value, so the standard deviation of the Pareto front can be zero. I therefore had to change line 439 to `return ArgMax, (MinDist-Mean)/(Std+0.0000001)` in the function `__LargestOfLeast`.
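The guard against a zero standard deviation is a generic numerical pattern; in isolation it looks like this (hypothetical helper name, not MOBOpt's actual code):

```python
import numpy as np

def safe_standardize(values, mean, std, eps=1e-7):
    """Standardize values while guarding against a degenerate (zero)
    standard deviation, which happens when all Pareto-front points
    share the same objective value on a discrete search space."""
    return (values - mean) / (std + eps)

# Degenerate case: all values identical, std is exactly zero.
x = np.array([3.0, 3.0, 3.0])
print(safe_standardize(x, x.mean(), x.std()))  # finite, no division by zero
```

Adding a small epsilon keeps the result finite; since eps is tiny relative to any non-degenerate std, it barely perturbs the normal case.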

@ppgaluzio
Owner

Hello, sorry I couldn't answer earlier. I haven't had time to work on this project lately, but I'll try to look into your problem.

So, were you able to solve it? If so, did you make any changes other than the one you mentioned? Perhaps this could be made into a new feature, if you are interested.

Also, you mentioned a lack of documentation. What kind of information do you feel is missing?

Thanks

@Stephocke

Stephocke commented Feb 27, 2021

No problem, it took me a long time to answer too. I wrote my own kernel class that performs the adjustments to the inputs of the Cov function according to the reference you suggested. Finally, I made the slight modification to your script mentioned above. Works like a charm.
I finished my experiments, wrote my manuscript citing your paper, and eventually submitted it :)
Thanks again

@mailmjfmjf

Hi Stephocke,
Good to hear that you have solved handling categorical and integer-valued variables in multi-objective Bayesian optimization on top of ppgaluzio's great work. I look forward to your manuscript being published soon. Actually, I have the same problem; would it be possible to open-source your work after the manuscript is published? It would help a lot.

@ppgaluzio
Owner

Hello @Stephocke ,

Good to hear that you managed to make the necessary adjustments. I would love to see how you did it. Looking forward to seeing your paper.

I haven't had much time to work on this project lately, but it would be an interesting feature to add to the code.

@JinfengM

@Stephocke Hi, have you finished your paper?
