AutoMAX Search

AutoMAX uses SMAC3 — a Bayesian optimization framework — to search over hyperparameter configurations for your chosen loss/optimizer pair.

How It Works

  1. AutoMAX reads your config and builds a configuration space from the predefined search space of the selected loss function.

  2. It starts with the default configuration (via DefaultInitialDesign) to establish a baseline.

  3. SMAC3 proposes new configurations, trains a full model for each trial, and uses the returned target metric to guide the next suggestion.

  4. After n_trials evaluations, the best configuration is saved to output_directory.
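The four steps above can be sketched as a plain trial loop. This is an illustrative stand-in only: a real AutoMAX run delegates the suggestion step to SMAC3's Bayesian optimizer, and `train_and_evaluate` here is a hypothetical toy objective, not the actual training routine.

```python
import random

def train_and_evaluate(config):
    """Placeholder for a full training run; returns the cost to minimize."""
    return (config["lr"] - 0.01) ** 2

def search(n_trials, default_config, seed=0):
    rng = random.Random(seed)
    # Trial 1: evaluate the default configuration to establish a baseline.
    history = [(default_config, train_and_evaluate(default_config))]
    for _ in range(n_trials - 1):
        # Placeholder suggestion; SMAC3 would propose this from its model.
        candidate = {"lr": rng.uniform(1e-5, 1e-1)}
        history.append((candidate, train_and_evaluate(candidate)))
    # After n_trials evaluations, keep the lowest-cost configuration.
    return min(history, key=lambda pair: pair[1])

best_config, best_cost = search(n_trials=50, default_config={"lr": 1e-3})
```

The baseline trial matters: because the default configuration is always in `history`, the returned configuration can never be worse than the default.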

Resuming a Search

SMAC3 serializes the run history to disk. If a search is interrupted, it can be resumed by re-running the same command, provided that output_directory and name are unchanged and overwrite is set to false.

automax:
  name: my_search
  output_directory: ./automax_output
  overwrite: false   # don't wipe previous trials

What Gets Tuned

AutoMAX tunes the loss and optimizer hyperparameters defined in each loss function’s search space. Common parameters across all pairs include:

  • lr — learning rate (log-scale range)

  • epoch_decay — learning rate decay factor

  • weight_decay — L2 regularization

  • margin — surrogate loss margin

Loss-specific parameters (e.g. beta, eta, gamma, Lambda, tau) are included automatically when that loss is selected.

See Loss / Optimizer Pairs for the full list per loss function.
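As a rough mental model, a per-loss search space can be thought of as the common parameters above merged with any loss-specific ones. The sketch below is hypothetical: the dict layout, the ranges, and the `COMMON_SPACE`/`LOSS_SPECIFIC` names are illustrative, not AutoMAX's actual internals.

```python
# Common parameters shared by all loss/optimizer pairs (ranges illustrative).
COMMON_SPACE = {
    "lr":           {"low": 1e-5, "high": 1e-1, "log": True},  # log-scale
    "epoch_decay":  {"low": 1e-4, "high": 1e-1, "log": True},
    "weight_decay": {"low": 1e-6, "high": 1e-2, "log": True},
    "margin":       {"low": 0.1,  "high": 1.0,  "log": False},
}

# Extra parameters pulled in automatically when the matching loss is selected.
LOSS_SPECIFIC = {
    "AUCMLoss": {},
    "APLoss":   {"gamma": {"low": 0.01, "high": 0.99, "log": False}},
}

def build_search_space(loss_name):
    space = dict(COMMON_SPACE)
    space.update(LOSS_SPECIFIC.get(loss_name, {}))
    return space

ap_space = build_search_space("APLoss")  # common parameters plus gamma
```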

Output

After the search completes, results are written to output_directory/name/:

automax_output/
└── my_search/
    ├── runhistory.json    # every evaluated configuration and its cost
    ├── configspace.json   # serialized configuration space for the search
    ├── intensifier.json   # intensifier state (used when resuming)
    ├── optimization.json  # optimizer state (used when resuming)
    └── scenario.json      # scenario settings for this run
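A small helper can pull the best trial out of a finished run history. The record layout below is an assumption modeled on SMAC3's runhistory.json (a "configs" map keyed by id, plus a "data" list of trial records); verify it against your own file before relying on it.

```python
def best_trial(runhistory):
    """Return (config, cost) for the lowest-cost trial in the run history."""
    best = min(runhistory["data"], key=lambda trial: trial["cost"])
    return runhistory["configs"][str(best["config_id"])], best["cost"]

# Tiny inline example in the assumed layout (not real output):
example = {
    "configs": {"1": {"lr": 1e-3}, "2": {"lr": 3e-2}},
    "data": [{"config_id": 1, "cost": 0.21},
             {"config_id": 2, "cost": 0.17}],
}
config, cost = best_trial(example)
```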

© Copyright 2024, Optimization-AI Lab.
