oumi.core.tuners#
Core tuners module for the Oumi (Open Universal Machine Intelligence) library.
This module provides various tuner implementations for use in the Oumi framework. These tuners are designed to facilitate the hyperparameter tuning process.
Example
>>> from oumi.core.tuners import OptunaTuner
>>> tuner = OptunaTuner(tuning_params=params)
>>> tuner.optimize(objective_fn, n_trials=10)
Note
- For detailed information on each tuner, please refer to their respective
class documentation.
- class oumi.core.tuners.BaseTuner(tuning_params: TuningParams)[source]#
Bases: ABC
Abstract base class for hyperparameter tuners.
This class defines the interface that all tuner implementations must follow, allowing for different optimization backends (Optuna, Ray Tune, etc.) while maintaining a consistent API.
- abstractmethod create_study() None[source]#
Create a new optimization study.
This method should initialize the tuner’s internal study object with the appropriate configuration (study name, direction, etc.).
- abstractmethod get_best_trial() dict[str, Any][source]#
Get the best trial from the study, if only one objective is being optimized.
- Returns:
Dictionary containing best parameters and their metric values.
- abstractmethod get_best_trials() list[dict[str, Any]][source]#
Get the best trials from the study, for multiple objectives.
- Returns:
List of dictionaries containing the best parameters and their metric values, one per best trial.
- abstractmethod optimize(objective_fn: Callable[[...], Any], n_trials: int) None[source]#
Run the optimization process.
- Parameters:
objective_fn – Function that takes suggested parameters and returns a dictionary of metric values.
n_trials – Number of trials to run.
- Returns:
None
- abstractmethod save_study(config: TuningConfig) None[source]#
Saves the study object to the specified output directory.
- Parameters:
config (TuningConfig) – The Oumi tuning config.
- Returns:
None
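For illustration, the interface above can be sketched with a stand-in subclass. The `RandomSearchTuner` below is hypothetical and not part of oumi; it re-declares a minimal `BaseTuner` so the snippet is self-contained, and a real implementation would instead wrap an optimization backend such as Optuna or Ray Tune.

```python
import random
from abc import ABC, abstractmethod
from typing import Any, Callable


class BaseTuner(ABC):
    # Minimal stand-in mirroring the documented interface (illustrative only,
    # not the actual oumi.core.tuners.BaseTuner).
    @abstractmethod
    def create_study(self) -> None: ...

    @abstractmethod
    def optimize(self, objective_fn: Callable[..., Any], n_trials: int) -> None: ...

    @abstractmethod
    def get_best_trial(self) -> dict[str, Any]: ...


class RandomSearchTuner(BaseTuner):
    """Toy tuner: samples a learning rate uniformly at random each trial."""

    def create_study(self) -> None:
        # Initialize the internal "study": here, just a list of finished trials.
        self._trials: list[dict[str, Any]] = []

    def optimize(self, objective_fn: Callable[..., Any], n_trials: int) -> None:
        for _ in range(n_trials):
            params = {"learning_rate": random.uniform(1e-5, 1e-2)}
            metrics = objective_fn(params)  # dict of metric name -> value
            self._trials.append({"params": params, "metrics": metrics})

    def get_best_trial(self) -> dict[str, Any]:
        # Single objective assumed here: minimize the metric named "loss".
        return min(self._trials, key=lambda t: t["metrics"]["loss"])


tuner = RandomSearchTuner()
tuner.create_study()
tuner.optimize(lambda p: {"loss": (p["learning_rate"] - 1e-3) ** 2}, n_trials=20)
best = tuner.get_best_trial()
```

The same create-study / optimize / get-best flow is what callers of any `BaseTuner` subclass would follow.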
- class oumi.core.tuners.OptunaTuner(tuning_params: TuningParams)[source]#
Bases: BaseTuner
Optuna-based hyperparameter tuner implementation.
- get_best_trial() dict[str, Any][source]#
Get the best trial from the Optuna study when only one objective is being optimized.
- optimize(objective_fn: Callable[[dict[str, Any], dict[str, Any], int], dict[str, float]], n_trials: int) None[source]#
Run Optuna optimization.
- save_study(config: TuningConfig) None[source]#
Saves the study results to a CSV file.