Searchers

MCTSSearcher

Monte-Carlo Tree Search (MCTS) extends the celebrated Multi-armed Bandit algorithm to tree-structured search spaces. The MCTS algorithm iterates over four phases: selection, expansion, playout and backpropagation.

  • Selection: At each node of the tree, a child node is selected according to a multi-armed bandit strategy, e.g. the UCT (Upper Confidence bound applied to Trees) algorithm.

  • Expansion: The algorithm adds one or more nodes to the tree, corresponding to the first encountered position that is not yet in the tree.

  • Playout: When reaching the limits of the visited tree, a roll-out strategy is used to select options until a terminal node is reached and the associated reward is computed.

  • Backpropagation: The reward value is propagated back, i.e. it is used to update the value associated with every node along the visited path up to the root node.
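The four phases above can be sketched in a few lines of self-contained Python. This is a generic illustration on a toy problem (finding a bit string with many ones), not the Hypernets implementation: the `Node` class, the depth `D`, and the reward function are all assumptions made for the sketch.

```python
import math
import random

class Node:
    def __init__(self, bits, parent=None):
        self.bits = bits        # partial decision sequence (path from root)
        self.parent = parent
        self.children = []
        self.visits = 0
        self.value = 0.0        # sum of backpropagated rewards

D = 6  # tree depth: a terminal node is a length-D bit string

def uct(node, c=1.4):
    # UCT score: mean reward plus an exploration bonus
    return (node.value / node.visits
            + c * math.sqrt(math.log(node.parent.visits) / node.visits))

def run_iteration(root):
    # 1. Selection: follow the UCT-maximizing child while fully expanded
    node = root
    while len(node.children) == 2:
        node = max(node.children, key=uct)
    # 2. Expansion: add the first position not yet in the tree
    if len(node.bits) < D:
        child = Node(node.bits + [len(node.children)], parent=node)
        node.children.append(child)
        node = child
    # 3. Playout: random rollout to a terminal node, compute the reward
    bits = node.bits + [random.randint(0, 1) for _ in range(D - len(node.bits))]
    reward = sum(bits) / D      # toy objective: fraction of ones
    # 4. Backpropagation: update visit counts and values up to the root
    while node is not None:
        node.visits += 1
        node.value += reward
        node = node.parent

root = Node([])
for _ in range(200):
    run_iteration(root)
```

Each call to run_iteration performs one full selection–expansion–playout–backpropagation cycle; over many iterations the visit counts concentrate on the more rewarding branches.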

Code example

from hypernets.searchers import MCTSSearcher

searcher = MCTSSearcher(search_space_fn, use_meta_learner=False, max_node_space=10, candidates_size=10, optimize_direction='max')

Required Parameters

  • space_fn: callable, A search space function which when called returns a HyperSpace instance.

Optional Parameters

  • policy: hypernets.searchers.mcts_core.BasePolicy, (default=None), The policy for Selection and Backpropagation phases, UCT by default.

  • max_node_space: int, (default=10), Maximum number of child nodes to expand at each node of the tree.

  • use_meta_learner: bool, (default=True), Whether to use a meta-learner. The meta-learner estimates the performance of unseen samples from previously evaluated ones, which makes it practical to assess a search branch over many simulations without running the actual training.

  • candidates_size: int, (default=10), The number of candidate samples for the meta-learner to evaluate during rollout.

  • optimize_direction: 'min' or 'max', (default='min'), Whether the search process maximizes or minimizes the reward value.

  • space_sample_validation_fn: callable or None, (default=None), Used to verify the validity of samples drawn from the search space; it can encode constraint rules that reduce the effective size of the space.
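Each searcher accepts a space_sample_validation_fn hook that simply maps a sampled space to a boolean (True to accept, False to reject). The sketch below only illustrates that contract; it uses a plain dict as a stand-in for the real sample object, and the hyperparameter names are hypothetical.

```python
def reject_large_deep_models(sample):
    # Stand-in sample: a dict of hyperparameter values. The real argument
    # is a sampled HyperSpace; the keys used here are hypothetical.
    # Reject configurations that are both deep and wide.
    return not (sample.get('n_layers', 0) > 4
                and sample.get('hidden_units', 0) > 512)

reject_large_deep_models({'n_layers': 2, 'hidden_units': 1024})  # True (accepted)
```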

EvolutionSearcher

An evolutionary algorithm (EA) is a generic population-based metaheuristic optimization algorithm and a subset of evolutionary computation. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and a fitness function determines the quality of the solutions (see also loss function). Evolution of the population then takes place through the repeated application of the above operators.
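The loop described above can be sketched as a minimal, self-contained Python example. It is a generic illustration (a one-bit-flip mutation on a toy bit-string problem), not the Hypernets implementation; the `evolve` function and its defaults are assumptions made for the sketch, with `population_size` and `sample_size` playing the same roles as the constructor arguments of EvolutionSearcher.

```python
import random
from collections import deque

def evolve(fitness_fn, n_genes=8, population_size=20, sample_size=5, cycles=200):
    # Start from a random population of candidate solutions (bit strings here)
    population = deque(
        [[random.randint(0, 1) for _ in range(n_genes)]
         for _ in range(population_size)]
    )
    best = max(population, key=fitness_fn)
    for _ in range(cycles):
        # Selection: sample `sample_size` parent candidates, keep the fittest
        parent = max(random.sample(list(population), sample_size), key=fitness_fn)
        # Mutation: flip one random gene to produce a child
        child = parent[:]
        i = random.randrange(n_genes)
        child[i] = 1 - child[i]
        # Aging ("regularized") evolution: the oldest individual is removed
        # and the child joins the population
        population.popleft()
        population.append(child)
        if fitness_fn(child) > fitness_fn(best):
            best = child
    return best

# Toy fitness: maximize the number of ones in the bit string
best = evolve(lambda genes: sum(genes))
```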

Code example

from hypernets.searchers import EvolutionSearcher

searcher = EvolutionSearcher(search_space_fn, population_size=20, sample_size=5, optimize_direction='min')

Required Parameters

  • space_fn: callable, A search space function which when called returns a HyperSpace instance

  • population_size: int, Size of the population

  • sample_size: int, The number of parent candidates selected in each cycle of evolution

Optional Parameters

  • regularized: bool, (default=False), Whether to enable regularized (aging) evolution

  • use_meta_learner: bool, (default=True), Whether to use a meta-learner. The meta-learner estimates the performance of unseen samples from previously evaluated ones, which makes it practical to assess a search branch over many simulations without running the actual training.

  • candidates_size: int, (default=10), The number of candidate samples for the meta-learner to evaluate during rollout.

  • optimize_direction: 'min' or 'max', (default='min'), Whether the search process maximizes or minimizes the reward value.

  • space_sample_validation_fn: callable or None, (default=None), Used to verify the validity of samples drawn from the search space; it can encode constraint rules that reduce the effective size of the space.

RandomSearcher

As its name suggests, Random Search samples random combinations of hyperparameters from the search space.

Code example

from hypernets.searchers import RandomSearcher

searcher = RandomSearcher(search_space_fn, optimize_direction='min')

Required Parameters

  • space_fn: callable, A search space function which when called returns a HyperSpace instance

Optional Parameters

  • optimize_direction: 'min' or 'max', (default='min'), Whether the search process maximizes or minimizes the reward value.

  • space_sample_validation_fn: callable or None, (default=None), Used to verify the validity of samples drawn from the search space; it can encode constraint rules that reduce the effective size of the space.