# AUROC — AUCMLoss + PESG
The canonical setup for binary imbalanced classification. AUCMLoss
with PESG directly maximizes AUROC using a minimax formulation.
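As a reference point, the AUCM minimax objective is commonly written as follows, with scoring model \(h_w\), auxiliary scalars \(a, b\), dual variable \(\alpha\), and margin \(m\). This is a sketch of the standard formulation from the AUC-maximization literature, not code or notation taken from this repository:

```latex
\min_{w,\,a,\,b}\ \max_{\alpha \ge 0}\quad
\mathbb{E}\big[(h_w(x)-a)^2 \mid y=1\big]
+ \mathbb{E}\big[(h_w(x)-b)^2 \mid y=-1\big]
+ 2\alpha\big(m + \mathbb{E}[h_w(x)\mid y=-1] - \mathbb{E}[h_w(x)\mid y=1]\big)
- \alpha^2
```

The inner maximization over \(\alpha\) yields a squared-hinge surrogate of the AUROC, which is why the optimizer (PESG) is a stochastic primal-dual method rather than plain SGD.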
## Config

`recipes/config_auc.yaml`:

```yaml
dataset:
  name: cifar10
  eval_splits: [val, test]
  kwargs:
    imratio: 0.1
model:
  name: resnet18
  pretrained: false
  num_classes: 1
  in_channels: 3
metrics:
  - AUROC
training:
  project_name: libauc
  experiment_name: resnet18_AUCMLoss_cifar10
  SEED: 2026
  epochs: 100
  batch_size: 128
  eval_batch_size: 256
  sampling_rate: 0.2
  num_workers: 0
  decay_epochs: [0.5, 0.75]
  loss: AUCMLoss
  optimizer: PESG
  output_path: ./output
  resume_from_checkpoint: false
  save_checkpoint_every: 5
automax:
  deterministic: true
  n_trials: 5
  SEED: 42
  name: resnet18_AUCMLoss_cifar10
  output_directory: ./automax_output
  overwrite: true
```
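The fractional values in `decay_epochs` (`[0.5, 0.75]`) presumably denote fractions of the total epoch budget rather than absolute epoch indices. Under that assumption, a minimal sketch of how they would map to concrete milestones (`decay_milestones` is a hypothetical helper, not part of this codebase):

```python
# Hypothetical helper: convert fractional decay_epochs entries into
# absolute epoch milestones. Assumes values below 1 are fractions of
# the total number of epochs; integer-valued entries pass through.
def decay_milestones(decay_epochs, epochs):
    return [int(f * epochs) if f < 1 else int(f) for f in decay_epochs]

# With the config above (epochs: 100, decay_epochs: [0.5, 0.75]),
# the LR would decay at epochs 50 and 75.
print(decay_milestones([0.5, 0.75], 100))  # [50, 75]
```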
## Run

```shell
python -m src.auto_trainer \
    --config_file recipes/config_auc.yaml
```
## Hyperparameter search space
AutoMAX tunes the following parameters for AUCMLoss + PESG:
| Parameter | Source | Type | Notes |
|---|---|---|---|
|  | optimizer | log-uniform range | Learning rate |
|  | optimizer | range | LR decay factor applied at `decay_epochs` |
|  | optimizer | log-uniform range | L2 regularization |
|  | optimizer | categorical | Momentum coefficient |
|  | loss | categorical | Surrogate loss margin |
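The three search-space types in the table behave differently at sampling time. A minimal self-contained illustration of each, where every parameter name and range below is hypothetical and chosen for demonstration only, not AutoMAX's actual search space:

```python
import math
import random

def sample_log_uniform(lo, hi, rng):
    # Sample uniformly in log space: each order of magnitude is
    # equally likely, which suits scale parameters like learning rates.
    return math.exp(rng.uniform(math.log(lo), math.log(hi)))

def sample_trial(rng):
    # Hypothetical names and ranges, for illustration only.
    return {
        "lr": sample_log_uniform(1e-4, 1e-1, rng),         # log-uniform range
        "weight_decay": sample_log_uniform(1e-5, 1e-2, rng),  # log-uniform range
        "decay_factor": rng.uniform(0.1, 0.5),             # plain range
        "momentum": rng.choice([0.0, 0.9, 0.99]),          # categorical
        "margin": rng.choice([0.5, 0.7, 1.0]),             # categorical
    }

rng = random.Random(42)
trial = sample_trial(rng)
print(trial)
```

The log-uniform choice matters: sampling a learning rate uniformly on `[1e-4, 1e-1]` would place almost all mass near `1e-1`, whereas log-uniform sampling explores each decade evenly.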