
[tune] Don't re-evaluate HyperOpt's points_to_evaluate #46325

Open
daviddavo opened this issue Jun 28, 2024 · 1 comment
Labels
bug: Something that is supposed to be working; but isn't
triage: Needs triage (eg: priority, bug/not-bug, and owning component)
tune: Tune-related issues

Comments

@daviddavo

What happened + What you expected to happen

I'm using a param_space with tune.sample_from(...) to compute a hyperparameter that may depend on the batch size.

For this, I'm using the HyperOpt search algorithm. I also want to start by evaluating certain points that I know are close to the optimum. However, the sample_from function is executed before the configuration is suggested and returns a random value, so the value given in points_to_evaluate is replaced and the desired point might never be explored.

Versions / Dependencies

ray==2.30.0
hyperopt==0.2.7

Reproduction script

import numpy as np
from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch

def trainable(config):
    print("config:", config)
    return {'loss': 0.01}

def _another_param(config):
    # Derived hyperparameter: depends on batch_size plus a random factor.
    return config['batch_size'] ** 2 * np.random.uniform()

search_alg = HyperOptSearch(
    # This point is close to the optimum and should be evaluated as-is.
    points_to_evaluate=[{
        'batch_size': 3,
        'another_param': 6.57,
    }],
    random_state_seed=42,
)

tuner = tune.Tuner(
    trainable,
    param_space={
        'batch_size': tune.randint(2, 10),
        # Resolved on every suggestion, replacing the value above.
        'another_param': tune.sample_from(_another_param),
    },
    tune_config=tune.TuneConfig(
        search_alg=search_alg,
        num_samples=5,
        metric='loss',
        mode='min',
    ),
)
tuner.fit()
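As described above, the printed config for the first trial does not show another_param == 6.57 from points_to_evaluate: sample_from has already replaced it with a fresh draw of batch_size**2 * np.random.uniform(), which differs on every run.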

Issue Severity

Medium: It is a significant difficulty but I can work around it.

@daviddavo added the bug and triage labels Jun 28, 2024
@daviddavo (Author) commented Jun 28, 2024

In the meantime, setting the seed as a hyperparameter works:

import numpy as np
from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch

def trainable(config):
    print("config:", config)
    return {'loss': 0.01}

def _another_param(config):
    # Seed the generator from the config so the draw is reproducible per trial.
    g = np.random.default_rng(seed=config['seed'])
    return config['batch_size'] ** 2 * g.uniform()

search_alg = HyperOptSearch(
    points_to_evaluate=[{
        'batch_size': 3,
        'another_param': 6.57,
    }],
    random_state_seed=42,
)

tuner = tune.Tuner(
    trainable,
    param_space={
        'batch_size': tune.randint(2, 10),
        'seed': tune.randint(0, 256),  # Or whatever range
        'another_param': tune.sample_from(_another_param),
    },
    tune_config=tune.TuneConfig(
        search_alg=search_alg,
        num_samples=5,
        metric='loss',
        mode='min',
    ),
)
tuner.fit()
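As a quick sanity check (plain NumPy, outside Tune; cfg is just a hypothetical trial config), the seeded helper is deterministic for a fixed config, which is what makes the workaround reproducible:

import numpy as np

def _another_param(config):
    g = np.random.default_rng(seed=config['seed'])
    return config['batch_size'] ** 2 * g.uniform()

cfg = {'batch_size': 3, 'seed': 42}
# Same seed and batch_size always produce the same value.
assert _another_param(cfg) == _another_param(cfg)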

@anyscalesam added the tune label Jul 8, 2024