
GP Kernel #4

Open
RodrigoAVargasHdz opened this issue Jun 24, 2020 · 7 comments

Comments

@RodrigoAVargasHdz

Hi,

I like your code :)

I was wondering if you could implement kernels with an individual parameter for each feature. This tends to make GPs more accurate.

Thanks :)

@ppgaluzio
Owner

Hey, thanks!

I guess I could do it by adding an extra parameter to the class's init method; you could then pass a kernel object with the parameters of your choice to the optimizer. That way the interface wouldn't change, and the user would have more flexibility in the implementation.

@RodrigoAVargasHdz
Author

Hi!

Yeah you could do something like this,

    self.GP[i] = GPR(length_scale=np.ones(self.x_dim),
                     kernel=C(1.0, (1e-3, 1e3)) * Matern(nu=2.5),
                     n_restarts_optimizer=self.n_rest_opt)

where x_dim is the number of features and C is the constant kernel. You may need to import them:

     from sklearn.gaussian_process.kernels import Matern, ConstantKernel as C 

Thanks!

@RodrigoAVargasHdz
Author

I made a mistake above; it should be:

    self.GP[i] = GPR(kernel=C(1.0, (1e-3, 1e3)) * Matern(length_scale=np.ones(self.x_dim), nu=2.5), 
                     n_restarts_optimizer=self.n_rest_opt)

where `length_scale=np.ones(self.x_dim)` gives the kernel an independent length-scale parameter for each individual feature.
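For reference, here is a minimal self-contained sketch of that anisotropic Matern kernel in scikit-learn (the data and `x_dim` value here are just illustrative):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, ConstantKernel as C

x_dim = 3  # illustrative number of features

# One length scale per feature: each is tuned independently
# when the GP hyperparameters are optimized during fit().
kernel = C(1.0, (1e-3, 1e3)) * Matern(length_scale=np.ones(x_dim), nu=2.5)
gp = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=2)

rng = np.random.default_rng(0)
X = rng.random((20, x_dim))
y = np.sin(X).sum(axis=1)
gp.fit(X, y)

# After fitting, the optimized Matern factor keeps one length scale per feature.
print(gp.kernel_.k2.length_scale.shape)
```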

@ppgaluzio
Owner

ppgaluzio commented Jun 25, 2020

I was thinking more of something like:

def __init__(self, ..., kernel=None):

    if kernel is None:
        self._kernel = Matern(nu=nu)  # current default
    else:
        self._kernel = kernel

    self.GP[i] = GPR(kernel=self._kernel)

so that the user can pass the kernel they want or fall back to the default. In that case, the kernel parameter could even be a list of kernel objects, so that each objective could have a different kernel.
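A runnable sketch of that pattern (the class name and attributes here are hypothetical, just to illustrate the single-kernel/list-of-kernels/default dispatch):

```python
from sklearn.gaussian_process import GaussianProcessRegressor as GPR
from sklearn.gaussian_process.kernels import Matern


class MOBayesianOpt:  # hypothetical class name for illustration
    def __init__(self, n_obj, kernel=None):
        # Accept a single kernel, a list of kernels (one per objective),
        # or None (fall back to the current default).
        if kernel is None:
            kernels = [Matern(nu=2.5) for _ in range(n_obj)]
        elif isinstance(kernel, list):
            kernels = kernel
        else:
            kernels = [kernel] * n_obj
        self.GP = [GPR(kernel=k) for k in kernels]


# Usage: two objectives, each with its own default Matern kernel.
opt = MOBayesianOpt(n_obj=2)
print(len(opt.GP))
```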

@RodrigoAVargasHdz
Author

Yeah, that could work too!
You would just have to import the different kernel types from the sklearn library.

@RodrigoAVargasHdz
Author

Hi,
I saw that you updated the code and included the possibility of using more robust kernels, thanks :)
I should define the kernel function using the standard notation from sklearn, correct?

Again, thanks :)

@ppgaluzio
Owner

Hi, yeah, I just haven't had time to test it yet, but it should work without a problem.
You just define any kernel from sklearn; the argument is passed directly to the GP, so any kernel implemented in sklearn will work.
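For example (a minimal sketch, with illustrative data): since the kernel object is forwarded straight to `GaussianProcessRegressor`, any kernel composed from sklearn's building blocks should go through unchanged.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Any sklearn kernel (or sum/product of kernels) works, since the
# object is passed directly to GaussianProcessRegressor.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
gp = GaussianProcessRegressor(kernel=kernel)

X = np.linspace(0, 1, 15).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
gp.fit(X, y)
mean, std = gp.predict(X, return_std=True)
print(mean.shape, std.shape)
```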
