# AutoMAX Search
AutoMAX uses SMAC3 — a Bayesian optimization framework — to search over hyperparameter configurations for your chosen loss/optimizer pair.
## How It Works
1. AutoMAX reads your config and builds a configuration space from the predefined search space of the selected loss function.
2. It starts with the default configuration (via SMAC3's `DefaultInitialDesign`) to establish a baseline.
3. SMAC3 proposes new configurations, trains a full model for each trial, and uses the returned target metric to guide the next suggestion.
4. After `n_trials` evaluations, the best configuration is saved to `output_directory`.
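The loop below is a minimal sketch of this flow using SMAC3's public API. The `train` function, the single `lr` hyperparameter, and the ranges are placeholders standing in for AutoMAX's real training step and predefined search spaces:

```python
from pathlib import Path

from ConfigSpace import Configuration, ConfigurationSpace, Float
from smac import HyperparameterOptimizationFacade, Scenario
from smac.initial_design import DefaultInitialDesign


def train(config: Configuration, seed: int = 0) -> float:
    """Placeholder target function: train a full model with this
    configuration and return a cost. SMAC3 minimizes, so a metric
    like AUC would be returned as 1 - AUC."""
    return (config["lr"] - 1e-3) ** 2  # dummy cost for illustration


# Toy stand-in for a predefined per-loss search space.
cs = ConfigurationSpace()
cs.add_hyperparameters([Float("lr", (1e-5, 1e-1), log=True)])

scenario = Scenario(
    cs,
    name="my_search",
    output_directory=Path("./automax_output"),
    n_trials=50,
)

smac = HyperparameterOptimizationFacade(
    scenario,
    train,
    initial_design=DefaultInitialDesign(scenario),  # start from the default config
)
incumbent = smac.optimize()  # best configuration after n_trials evaluations
```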
## Resuming a Search
SMAC3 serializes the run history to disk. If a search is interrupted, it can be resumed by re-running the same command, as long as `output_directory` and `name` remain unchanged and `overwrite: false` is set:
```yaml
automax:
  name: my_search
  output_directory: ./automax_output
  overwrite: false  # don't wipe previous trials
```
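On the SMAC3 side, resuming hinges on the facade's `overwrite` flag: when the scenario's `name` and `output_directory` match an existing run and `overwrite=False`, SMAC3 reloads the saved state instead of starting fresh. A minimal sketch, with the same placeholder `train` and search space as in the example above:

```python
from pathlib import Path

from ConfigSpace import Configuration, ConfigurationSpace, Float
from smac import HyperparameterOptimizationFacade, Scenario


def train(config: Configuration, seed: int = 0) -> float:
    return (config["lr"] - 1e-3) ** 2  # placeholder cost, as above


cs = ConfigurationSpace()
cs.add_hyperparameters([Float("lr", (1e-5, 1e-1), log=True)])

# Identical name/output_directory; overwrite=False (the default) tells
# SMAC3 to reload the stored run history and continue where it left off.
scenario = Scenario(cs, name="my_search",
                    output_directory=Path("./automax_output"), n_trials=50)
smac = HyperparameterOptimizationFacade(scenario, train, overwrite=False)
smac.optimize()  # picks up after the already-completed trials
```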
## What Gets Tuned
AutoMAX tunes the loss and optimizer hyperparameters defined in each loss function’s search space. Common parameters across all pairs include:
- `lr` — learning rate (log-scale range)
- `epoch_decay` — learning rate decay factor
- `weight_decay` — L2 regularization
- `margin` — surrogate loss margin

Loss-specific parameters (e.g. `beta`, `eta`, `gamma`, `Lambda`, `tau`) are included automatically when that loss is selected.
See Loss / Optimizer Pairs for the full list per loss function.
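As a concrete picture of what such a space looks like, here is a hypothetical ConfigSpace definition; the ranges and the `gamma` entry are invented for illustration and do not reproduce AutoMAX's actual predefined values:

```python
from ConfigSpace import ConfigurationSpace, Float

# Hypothetical per-loss search space; all ranges are illustrative only.
cs = ConfigurationSpace()
cs.add_hyperparameters([
    Float("lr", (1e-5, 1e-1), log=True),            # learning rate, log scale
    Float("epoch_decay", (1e-3, 1.0), log=True),    # LR decay factor
    Float("weight_decay", (1e-6, 1e-2), log=True),  # L2 regularization
    Float("margin", (0.1, 2.0)),                    # surrogate loss margin
    Float("gamma", (0.1, 0.999)),                   # loss-specific, only some losses
])
```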
## Output
After the search completes, results are written to `output_directory/name/`:
```
automax_output/
└── my_search/
    ├── runhistory.json    # every evaluated configuration and its cost
    ├── configspace.json   # the serialized search space
    ├── intensifier.json   # intensifier state (trial scheduling)
    ├── optimization.json  # optimizer state
    └── scenario.json      # the scenario settings used for the run
```
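To inspect a finished run, the history can be loaded back through SMAC3's `RunHistory`. A minimal sketch, assuming SMAC3 2.x's `RunHistory.load` and a configspace matching the one that was searched (here rebuilt as a toy single-parameter space):

```python
from ConfigSpace import ConfigurationSpace, Float
from smac.runhistory import RunHistory

# Must match the space used during the search; toy version for illustration.
cs = ConfigurationSpace()
cs.add_hyperparameters([Float("lr", (1e-5, 1e-1), log=True)])

rh = RunHistory()
rh.load("automax_output/my_search/runhistory.json", cs)

# Pick the configuration with the lowest recorded cost.
best = min(rh.get_configs(), key=rh.get_cost)
print(best)
```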