Tune Search Algorithms

Tune provides various hyperparameter search algorithms to efficiently optimize your model. You can use these search algorithms in combination with different trial schedulers. By default, Tune implicitly uses the Variant Generation algorithm to create trials.

You can utilize these search algorithms as follows:

run_experiments(experiments, search_alg=SearchAlgorithm(...))
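
For example, the default behavior is equivalent to passing the basic variant generator explicitly. A minimal sketch (assuming experiments is a standard experiment specification dict):

from ray.tune import run_experiments
from ray.tune.suggest import BasicVariantGenerator

# Equivalent to the default: trials are created by variant generation
# over the experiment specification.
run_experiments(experiments, search_alg=BasicVariantGenerator())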

Currently, Tune offers the following search algorithms:

HyperOpt Search (Tree-structured Parzen Estimators)

The HyperOptSearch is a SearchAlgorithm that is backed by HyperOpt to perform sequential model-based hyperparameter optimization. In order to use this search algorithm, you will need to install HyperOpt via the following command:

$ pip install --upgrade git+https://github.com/hyperopt/hyperopt.git

This algorithm requires using the HyperOpt search space specification. You can use HyperOptSearch as follows:

run_experiments(experiment_config, search_alg=HyperOptSearch(hyperopt_space, ... ))

An example of this can be found in hyperopt_example.py.

class ray.tune.suggest.HyperOptSearch(space, max_concurrent=10, reward_attr='episode_reward_mean', **kwargs)

Bases: ray.tune.suggest.suggestion.SuggestionAlgorithm

A wrapper around HyperOpt to provide trial suggestions.

Requires HyperOpt to be installed from source. Uses the Tree-structured Parzen Estimators algorithm, although it can be trivially extended to support any algorithm that HyperOpt implements. Trials added externally will not be tracked by HyperOpt.

Parameters:
  • space (dict) – HyperOpt configuration. Parameters will be sampled from this configuration and will be used to override parameters generated in the variant generation process.
  • max_concurrent (int) – Maximum number of concurrent trials. Defaults to 10.
  • reward_attr (str) – The training result objective value attribute. This value is assumed to be increasing, i.e., larger is better (to minimize a loss, report a negated value such as neg_mean_loss).

Example

>>> from hyperopt import hp
>>> from ray.tune.suggest import HyperOptSearch
>>> space = {
>>>     'width': hp.uniform('width', 0, 20),
>>>     'height': hp.uniform('height', -100, 100),
>>>     'activation': hp.choice("activation", ["relu", "tanh"])
>>> }
>>> config = {
>>>     "my_exp": {
>>>         "run": "exp",
>>>         "num_samples": 10 if args.smoke_test else 1000,
>>>         "stop": {
>>>             "training_iteration": 100
>>>         },
>>>     }
>>> }
>>> algo = HyperOptSearch(
>>>     space, max_concurrent=4, reward_attr="neg_mean_loss")
>>> algo.add_configurations(config)
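
In a script, the algorithm is typically handed to run_experiments instead of calling add_configurations manually; a minimal sketch of the equivalent launch, continuing the example above:

>>> from ray.tune import run_experiments
>>> run_experiments(config, search_alg=algo)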

Contributing a New Algorithm

If you are interested in implementing or contributing a new Search Algorithm, the API is straightforward:

class ray.tune.suggest.SearchAlgorithm

Interface of an event handler API for hyperparameter search.

Unlike TrialSchedulers, SearchAlgorithms cannot modify the execution of trials (i.e., stop and pause trials).

Trials added manually (i.e., via the Client API) will also notify this class upon new events, so custom search algorithms should maintain a list of the trial IDs generated by this class.

See also: ray.tune.suggest.BasicVariantGenerator.

add_configurations(experiments)

Tracks given experiment specifications.

Parameters: experiments (Experiment | list | dict) – Experiments to run.
next_trials()

Provides Trial objects to be queued into the TrialRunner.

Returns: A list of trials.
Return type: list
on_trial_result(trial_id, result)

Called on each intermediate result returned by a trial.

This will only be called when the trial is in the RUNNING state.

Parameters:
  • trial_id – Identifier for the trial.
  • result (dict) – Intermediate training result reported by the trial.
on_trial_complete(trial_id, result=None, error=False, early_terminated=False)

Notification for the completion of a trial.

Parameters:
  • trial_id – Identifier for the trial.
  • result (dict) – Defaults to None. A dict will be provided with this notification when the trial is in the RUNNING state and either completes naturally or is terminated manually.
  • error (bool) – Defaults to False. True if the trial is in the RUNNING state and errors.
  • early_terminated (bool) – Defaults to False. True if the trial is stopped while in PAUSED or PENDING state.
is_finished()

Returns True if no trials are left to be queued into the TrialRunner.

Can return True before all trials have finished executing.
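
As a sketch, a hypothetical skeleton of this interface is shown below (the class name and internal bookkeeping are illustrative and not part of Tune; Trial creation is elided):

from ray.tune.suggest import SearchAlgorithm

class MySearchAlgorithm(SearchAlgorithm):
    """Hypothetical skeleton of the event handler interface."""

    def __init__(self):
        self._pending = []  # Trial objects waiting to be queued
        self._live_trial_ids = set()  # IDs of trials generated here

    def add_configurations(self, experiments):
        # Convert each experiment specification into Trial objects
        # and append them to self._pending.
        pass

    def next_trials(self):
        # Hand all pending Trial objects to the TrialRunner.
        trials, self._pending = self._pending, []
        return trials

    def on_trial_result(self, trial_id, result):
        # Called on each intermediate result of a RUNNING trial.
        pass

    def on_trial_complete(self, trial_id, result=None, error=False,
                          early_terminated=False):
        # Called once when a trial completes, errors, or is stopped early.
        self._live_trial_ids.discard(trial_id)

    def is_finished(self):
        # True once nothing is pending and no generated trial is live.
        return not self._pending and not self._live_trial_ids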

Model-Based Suggestion Algorithms

Often, hyperparameter search algorithms are model-based and can be quite simple to implement. In this case, you can extend the following abstract class and implement on_trial_result, on_trial_complete, and _suggest. The abstract class takes care of Tune-specific boilerplate such as creating Trials and queuing them:

class ray.tune.suggest.SuggestionAlgorithm

Bases: ray.tune.suggest.search.SearchAlgorithm

Abstract class for suggestion-based algorithms.

Custom search algorithms can extend this class easily by overriding the _suggest method to provide generated parameters for the trials.

To track suggestions and their corresponding evaluations, the method _suggest will be passed a trial_id, which will be used in subsequent notifications.

Example

>>> suggester = SuggestionAlgorithm()
>>> suggester.add_configurations({ ... })
>>> new_parameters = suggester._suggest("trial_1")
>>> suggester.on_trial_complete("trial_1", result)
>>> better_parameters = suggester._suggest("trial_2")
_suggest(trial_id)

Queries the algorithm to retrieve the next set of parameters.

Parameters: trial_id – Trial ID used for subsequent notifications.
Returns: A configuration for a trial if possible; otherwise None, which temporarily stops the TrialRunner from querying.
Return type: dict | None

Example

>>> suggester = SuggestionAlgorithm(max_concurrent=1)
>>> suggester.add_configurations({ ... })
>>> parameters_1 = suggester._suggest("trial_1")
>>> parameters_2 = suggester._suggest("trial_2")
>>> parameters_2 is None
>>> suggester.on_trial_complete("trial_1", result)
>>> parameters_2 = suggester._suggest("trial_2")
>>> parameters_2 is not None
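
Putting this together, a minimal model-based subclass might look like the following (a hypothetical sketch: the RandomSearch class and its single width parameter are illustrative, not part of Tune):

import random

from ray.tune.suggest import SuggestionAlgorithm

class RandomSearch(SuggestionAlgorithm):
    """Hypothetical sketch: suggests a random width for each trial."""

    def __init__(self):
        self._results = {}  # maps trial_id to its final result dict
        super(RandomSearch, self).__init__()

    def _suggest(self, trial_id):
        # trial_id ties later notifications back to this suggestion;
        # a model-based algorithm would sample from a fitted model here.
        return {"width": random.uniform(0, 20)}

    def on_trial_complete(self, trial_id, result=None, error=False,
                          early_terminated=False):
        # Record the evaluation so future suggestions can use it.
        if result:
            self._results[trial_id] = result

As with the built-in algorithms, such a subclass is given experiments via add_configurations (or via run_experiments) and passed through the search_alg argument.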