Tune Search Algorithms

Tune provides various hyperparameter search algorithms to efficiently optimize your model. These search algorithms can be used in combination with different trial schedulers. By default, Tune implicitly uses the Variant Generation algorithm to create trials.

You can utilize these search algorithms as follows:

tune.run(my_function, search_alg=SearchAlgorithm(...))
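By contrast, if no search_alg is given, the default variant generator expands the config dictionary into trials on its own. The following minimal sketch illustrates that default path; my_function and its alpha/beta parameters are hypothetical:

import random

from ray import tune

# Hypothetical trainable: Tune calls it once per generated variant.
def my_function(config, reporter):
    score = config["alpha"] * config["beta"]
    reporter(mean_loss=score, done=True)

# With no search_alg, the default variant generator expands this config.
tune.run(
    my_function,
    config={
        "alpha": tune.grid_search([0.1, 0.2, 0.3]),
        "beta": tune.sample_from(lambda spec: random.uniform(0, 1)),
    })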

Currently, Tune offers the following search algorithms (and library integrations):

HyperOpt Search (Tree-structured Parzen Estimators)

The HyperOptSearch is a SearchAlgorithm that is backed by HyperOpt to perform sequential model-based hyperparameter optimization. Note that this class does not extend ray.tune.suggest.BasicVariantGenerator, so you will not be able to use Tune’s default variant generation/search space declaration when using HyperOptSearch.

In order to use this search algorithm, you will need to install HyperOpt via the following command:

$ pip install --upgrade git+git://github.com/hyperopt/hyperopt.git

This algorithm requires using the HyperOpt search space specification. You can use HyperOptSearch as follows:

tune.run(... , search_alg=HyperOptSearch(hyperopt_space, ... ))

An example of this can be found in hyperopt_example.py.
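For reference, here is a minimal end-to-end sketch in the spirit of that example. The easy_objective function and its mean_loss metric are illustrative placeholders, not part of the HyperOpt or Tune APIs:

from hyperopt import hp
from ray import tune
from ray.tune.suggest.hyperopt import HyperOptSearch

# Hypothetical objective: the searcher minimizes the reported "mean_loss".
def easy_objective(config, reporter):
    loss = (config["height"] - 14) ** 2 - abs(config["width"] - 3)
    reporter(mean_loss=loss, done=True)

# The search space is declared with HyperOpt's own primitives, not Tune's.
hyperopt_space = {
    "width": hp.uniform("width", 0, 20),
    "height": hp.uniform("height", -100, 100),
}

tune.run(
    easy_objective,
    num_samples=10,
    search_alg=HyperOptSearch(
        hyperopt_space, max_concurrent=4, metric="mean_loss", mode="min"))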

class ray.tune.suggest.hyperopt.HyperOptSearch(space, max_concurrent=10, reward_attr=None, metric='episode_reward_mean', mode='max', points_to_evaluate=None, n_initial_points=20, random_state_seed=None, gamma=0.25, **kwargs)[source]

Bases: ray.tune.suggest.suggestion.SuggestionAlgorithm

A wrapper around HyperOpt to provide trial suggestions.

Requires HyperOpt to be installed from source. Uses the Tree-structured Parzen Estimators algorithm, although it can be trivially extended to support any algorithm HyperOpt supports. Externally added trials will not be tracked by HyperOpt. Trials of the current run can be saved using the save method, and trials of a previous run can be loaded using the restore method, enabling a warm-start feature.

Parameters:
  • space (dict) – HyperOpt configuration. Parameters will be sampled from this configuration and will be used to override parameters generated in the variant generation process.
  • max_concurrent (int) – Number of maximum concurrent trials. Defaults to 10.
  • metric (str) – The training result objective value attribute.
  • mode (str) – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.
  • points_to_evaluate (list) – Initial parameter suggestions to be run first. This is for when you already have some good parameters you want HyperOpt to run first to help the TPE algorithm make better suggestions for future parameters. Needs to be a list of dicts of HyperOpt-named variables. Choice variables should be indicated by their index in the list (see example).
  • n_initial_points (int) – Number of random evaluations of the objective function before starting to approximate it with Tree-structured Parzen Estimators. Defaults to 20.
  • random_state_seed (int, array_like, None) – Seed for reproducible results. Defaults to None.
  • gamma (float in range (0,1)) – Parameter governing the Tree-structured Parzen Estimators suggestion algorithm. Defaults to 0.25.

Example

>>> from hyperopt import hp
>>> space = {
>>>     'width': hp.uniform('width', 0, 20),
>>>     'height': hp.uniform('height', -100, 100),
>>>     'activation': hp.choice("activation", ["relu", "tanh"])
>>> }
>>> current_best_params = [{
>>>     'width': 10,
>>>     'height': 0,
>>>     'activation': 0, # The index of "relu"
>>> }]
>>> algo = HyperOptSearch(
>>>     space, max_concurrent=4, metric="mean_loss", mode="min",
>>>     points_to_evaluate=current_best_params)
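The warm-start feature mentioned above persists and reloads the searcher state through the save and restore methods. A minimal sketch, reusing space from the example above and the hypothetical easy_objective from the earlier sketch, and assuming both methods accept a file path (the path below is illustrative):

algo = HyperOptSearch(space, metric="mean_loss", mode="min")
tune.run(easy_objective, search_alg=algo, num_samples=10)
algo.save("./hyperopt_checkpoint.pkl")  # persist the searcher's trial history

# Later: warm-start a fresh searcher from the saved trials.
algo2 = HyperOptSearch(space, metric="mean_loss", mode="min")
algo2.restore("./hyperopt_checkpoint.pkl")
tune.run(easy_objective, search_alg=algo2, num_samples=10)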

BOHB

Tip

This implementation is still experimental. Please report issues on https://github.com/ray-project/ray/issues/. Thanks!

BOHB (Bayesian Optimization HyperBand) is a SearchAlgorithm that is backed by HpBandSter to perform sequential model-based hyperparameter optimization in conjunction with HyperBand. Note that this class does not extend ray.tune.suggest.BasicVariantGenerator, so you will not be able to use Tune’s default variant generation/search space declaration when using BOHB.

Importantly, BOHB is intended to be paired with a specific scheduler class: HyperBandForBOHB.

This algorithm requires using the ConfigSpace search space specification. In order to use this search algorithm, you will need to install HpBandSter and ConfigSpace:

$ pip install hpbandster ConfigSpace

You can use TuneBOHB in conjunction with HyperBandForBOHB as follows:

# BOHB uses ConfigSpace for its hyperparameter search space
import ConfigSpace as CS

from ray import tune
from ray.tune.schedulers import HyperBandForBOHB
from ray.tune.suggest.bohb import TuneBOHB

config_space = CS.ConfigurationSpace()
config_space.add_hyperparameter(
    CS.UniformFloatHyperparameter("height", lower=10, upper=100))
config_space.add_hyperparameter(
    CS.UniformFloatHyperparameter("width", lower=0, upper=100))

experiment_metrics = dict(metric="episode_reward_mean", mode="max")
bohb_hyperband = HyperBandForBOHB(
    time_attr="training_iteration", max_t=100, **experiment_metrics)
bohb_search = TuneBOHB(
    config_space, max_concurrent=4, **experiment_metrics)

tune.run(MyTrainableClass,
    name="bohb_test",
    scheduler=bohb_hyperband,
    search_alg=bohb_search,
    num_samples=5)

Take a look at an example here. See the BOHB paper for more details.

class ray.tune.suggest.bohb.TuneBOHB(space, bohb_config=None, max_concurrent=10, metric='neg_mean_loss', mode='max')[source]

Bases: ray.tune.suggest.suggestion.SuggestionAlgorithm

BOHB suggestion component.

Requires HpBandSter and ConfigSpace to be installed. You can install HpBandSter and ConfigSpace with: pip install hpbandster ConfigSpace.

This should be used in conjunction with HyperBandForBOHB.

Parameters:
  • space (ConfigurationSpace) – Continuous ConfigSpace search space. Parameters will be sampled from this space and used to run trials.
  • bohb_config (dict) – Configuration for the HpBandSter BOHB algorithm.
  • max_concurrent (int) – Number of maximum concurrent trials. Defaults to 10.
  • metric (str) – The training result objective value attribute.
  • mode (str) – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.

Example

>>> import ConfigSpace as CS
>>> from ray.tune import run
>>> from ray.tune.schedulers import HyperBandForBOHB
>>> from ray.tune.suggest.bohb import TuneBOHB
>>> config_space = CS.ConfigurationSpace()
>>> config_space.add_hyperparameter(
        CS.UniformFloatHyperparameter('width', lower=0, upper=20))
>>> config_space.add_hyperparameter(
        CS.UniformFloatHyperparameter('height', lower=-100, upper=100))
>>> config_space.add_hyperparameter(
        CS.CategoricalHyperparameter(
            name='activation', choices=['relu', 'tanh']))
>>> algo = TuneBOHB(
        config_space, max_concurrent=4, metric='mean_loss', mode='min')
>>> bohb = HyperBandForBOHB(
        time_attr='training_iteration',
        metric='mean_loss',
        mode='min',
        max_t=100)
>>> run(MyTrainableClass, scheduler=bohb, search_alg=algo)

Contributing a New Algorithm

If you are interested in implementing or contributing a new Search Algorithm, the API is straightforward:

class ray.tune.suggest.SearchAlgorithm[source]

Interface of an event handler API for hyperparameter search.

Unlike TrialSchedulers, SearchAlgorithms will not have the ability to modify the execution (i.e., stop and pause trials).

Trials added manually (i.e., via the Client API) will also notify this class upon new events, so custom search algorithms should maintain a list of the trial IDs generated by this class.

See also: ray.tune.suggest.BasicVariantGenerator.

add_configurations(experiments)[source]

Tracks given experiment specifications.

Parameters: experiments (Experiment | list | dict) – Experiments to run.
next_trials()[source]

Provides Trial objects to be queued into the TrialRunner.

Returns: A list of trials.
Return type: trials (list)
on_trial_result(trial_id, result)[source]

Called on each intermediate result returned by a trial.

This will only be called when the trial is in the RUNNING state.

Parameters:
  • trial_id – Identifier for the trial.
  • result (dict) – Intermediate training result reported by the trial.
on_trial_complete(trial_id, result=None, error=False, early_terminated=False)[source]

Notification for the completion of a trial.

Parameters:
  • trial_id – Identifier for the trial.
  • result (dict) – Defaults to None. A dict will be provided with this notification when the trial is in the RUNNING state and either completes naturally or is terminated manually.
  • error (bool) – Defaults to False. True if the trial is in the RUNNING state and errors.
  • early_terminated (bool) – Defaults to False. True if the trial is stopped while in PAUSED or PENDING state.
is_finished()[source]

Returns True if no trials left to be queued into TrialRunner.

Can return True before all trials have finished executing.

Model-Based Suggestion Algorithms

Oftentimes, hyperparameter search algorithms are model-based and may be quite simple to implement. For this, one can extend the following abstract class and implement on_trial_result, on_trial_complete, and _suggest. The abstract class takes care of Tune-specific boilerplate such as creating Trials and queuing trials; a minimal end-to-end sketch is given after the class reference below:

class ray.tune.suggest.SuggestionAlgorithm[source]

Bases: ray.tune.suggest.search.SearchAlgorithm

Abstract class for suggestion-based algorithms.

Custom search algorithms can extend this class easily by overriding the _suggest method to provide generated parameters for the trials.

To track suggestions and their corresponding evaluations, the method _suggest will be passed a trial_id, which will be used in subsequent notifications.

Example

>>> suggester = SuggestionAlgorithm()
>>> suggester.add_configurations({ ... })
>>> new_parameters = suggester._suggest(trial_id)
>>> suggester.on_trial_complete(trial_id, result)
>>> better_parameters = suggester._suggest(another_trial_id)
_suggest(trial_id)[source]

Queries the algorithm to retrieve the next set of parameters.

Parameters: trial_id – Trial ID used for subsequent notifications.
Returns: A configuration for a trial, if possible. Otherwise returns None, which will temporarily stop the TrialRunner from querying.
Return type: dict | None

Example

>>> suggester = SuggestionAlgorithm(max_concurrent=1)
>>> suggester.add_configurations({ ... })
>>> parameters_1 = suggester._suggest(trial_id_1)
>>> parameters_2 = suggester._suggest(trial_id_2)
>>> parameters_2 is None
>>> suggester.on_trial_complete(trial_id_1, result)
>>> parameters_2 = suggester._suggest(trial_id_2)
>>> parameters_2 is not None
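Putting the pieces together, a custom model-based algorithm might look like the following. This is a minimal illustrative sketch, not part of Tune: the RandomSearch class, its width/height parameters, and the mean_loss metric are all hypothetical, and it assumes that SuggestionAlgorithm can be constructed without arguments.

import random

from ray.tune.suggest import SuggestionAlgorithm


class RandomSearch(SuggestionAlgorithm):
    """Hypothetical searcher that suggests uniformly random configurations."""

    def __init__(self, max_concurrent=2, metric="mean_loss", mode="min"):
        super(RandomSearch, self).__init__()
        self._max_concurrent = max_concurrent
        self._metric = metric
        self._mode = mode
        self._live_trials = set()  # trial IDs handed out but not yet finished
        self._results = {}         # trial_id -> final metric value

    def _suggest(self, trial_id):
        # Returning None temporarily stops the TrialRunner from querying.
        if len(self._live_trials) >= self._max_concurrent:
            return None
        self._live_trials.add(trial_id)
        # A model-based algorithm would consult self._results here
        # instead of sampling at random.
        return {
            "width": random.uniform(0, 20),
            "height": random.uniform(-100, 100),
        }

    def on_trial_result(self, trial_id, result):
        # Intermediate results could be used to update an internal model.
        pass

    def on_trial_complete(self, trial_id, result=None, error=False,
                          early_terminated=False):
        if result:
            self._results[trial_id] = result[self._metric]
        self._live_trials.discard(trial_id)

The searcher can then be passed to tune.run like any built-in search algorithm, e.g. tune.run(my_function, search_alg=RandomSearch(), num_samples=10).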