Tuner¶
class pytorch_wrapper.tuner.AbstractTuner(hyper_parameter_generators, algorithm, fit_iterations)¶
Bases: abc.ABC
Objects of derived classes are used to tune a model using the Hyperopt library.
Parameters: - hyper_parameter_generators – Dict containing a Hyperopt hyper-parameter generator for each hyper-parameter (e.g. {'batch_size': hp.choice('batch_size', [32, 64])}).
- algorithm – Hyperopt’s tuning algorithm (e.g. hyperopt.rand.suggest, hyperopt.tpe.suggest).
- fit_iterations – Number of trials.
run(trials_load_path=None, trials_save_path=None)¶
Initiates the tuning algorithm.
Parameters: - trials_load_path – Path of a Trials object to load at the beginning of the tuning algorithm. If None, the tuning algorithm starts from scratch.
- trials_save_path – Path where the Trials object is saved after each iteration. If None, the Trials object is not saved.
Returns: A list of tuples [(loss, {parameters}), …] sorted by loss.
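The trials_load_path / trials_save_path pair amounts to a checkpoint-and-resume loop: persist the trial history after each iteration, and reload it later to continue tuning instead of starting from scratch. A minimal sketch of that pattern with pickle, using a plain list of (loss, parameters) records as a hypothetical stand-in for Hyperopt's Trials object:

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for Hyperopt's Trials: a list of (loss, params) records.
trials = [(0.5, {'batch_size': 32})]

path = os.path.join(tempfile.mkdtemp(), 'trials.pkl')

# trials_save_path role: persist the trial history after an iteration.
with open(path, 'wb') as f:
    pickle.dump(trials, f)

# trials_load_path role: reload the history and resume tuning from it.
with open(path, 'rb') as f:
    resumed = pickle.load(f)

resumed.append((0.2, {'batch_size': 64}))  # tuning continues from the loaded state
```

The real Trials object carries more bookkeeping than a list, but the save/load round trip works the same way.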
class pytorch_wrapper.tuner.Tuner(hyper_parameter_generators, step_function, algorithm, fit_iterations)¶
Bases: pytorch_wrapper.tuner.AbstractTuner
Objects of this class are used to tune a model using the Hyperopt library.
Parameters: - hyper_parameter_generators – Dict containing a Hyperopt hyper-parameter generator for each hyper-parameter (e.g. {'batch_size': hp.choice('batch_size', [32, 64])}).
- step_function – Callable that creates and evaluates a model using the provided hyper-parameters. It receives a dict containing the hyper-parameters chosen for the current iteration, keyed by the same names as their corresponding generators, and must return a numeric value representing the loss of the current iteration.
- algorithm – Hyperopt’s tuning algorithm (e.g. hyperopt.rand.suggest, hyperopt.tpe.suggest).
- fit_iterations – Number of trials.
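To illustrate the step_function contract described above: it receives a dict of sampled hyper-parameters (keyed like the generators) and returns a numeric loss, and run() then returns the trials sorted by that loss. A pure-Python sketch with a toy quadratic loss standing in for real model training (no Hyperopt sampling here; the candidate dicts are hand-picked for illustration):

```python
def step_function(hyper_parameters):
    """Would build/train/evaluate a model with the sampled hyper-parameters;
    here a toy quadratic stands in for real training."""
    batch_size = hyper_parameters['batch_size']  # same key as its generator
    lr = hyper_parameters['lr']
    return (lr - 0.01) ** 2 + abs(batch_size - 64) / 1000.  # numeric loss

# Configurations such as Hyperopt would sample from the generators.
candidates = [
    {'batch_size': 32, 'lr': 0.1},
    {'batch_size': 64, 'lr': 0.01},
    {'batch_size': 64, 'lr': 0.05},
]

# run() returns the trials sorted by loss, best first: [(loss, {parameters}), ...]
results = sorted(((step_function(p), p) for p in candidates), key=lambda t: t[0])
```

In actual use, the Tuner itself calls step_function once per trial with parameters drawn by the chosen algorithm (e.g. hyperopt.tpe.suggest) and assembles this sorted result list for you.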