src.auto_gnn

Entry point for running an AutoMAX hyperparameter search with a GNNTrainer on graph neural network tasks.

CLI usage

python -m src.auto_gnn --config_file config.yaml

This module mirrors src.auto_trainer but targets graph datasets and uses LibAUC’s GNNTrainer. Key differences:

  • The model: section uses GNN-specific keys (emb_dim, num_layers, etc.).

  • decay_factor is an additional training: key that scales the learning-rate schedule at the epochs listed in decay_epochs.

  • Tasks are always treated as binary / single-label for sampler logic (multilabel=False).

Functions

main()

Parse CLI arguments, load the GNN megaconf, build an autopartial GNNTrainer, and run optimize().

apply_cli_overrides(cfg, args)

Identical to src.auto_trainer.apply_cli_overrides(): mutates cfg in place with any overrides supplied on the command line.

Parameters:
  • cfg (OmegaConf DictConfig) – Merged megaconf.

  • args (argparse.Namespace) – Parsed CLI arguments.

Returns:

The mutated config (same object).

Return type:

OmegaConf DictConfig
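The actual implementation lives in src.auto_trainer; a minimal sketch of the override pattern, using a plain dict in place of an OmegaConf DictConfig (the key names and the "skip unset flags" rule here are illustrative assumptions, not the library's exact behavior):

```python
from types import SimpleNamespace

def apply_cli_overrides(cfg, args):
    """Copy any CLI flags the user actually set (non-None) onto the
    matching keys of the training section, mutating cfg in place."""
    for key, value in vars(args).items():
        if value is not None and key in cfg["training"]:
            cfg["training"][key] = value
    return cfg  # same object, mutated

cfg = {"training": {"epochs": 50, "batch_size": 128}}
args = SimpleNamespace(epochs=10, batch_size=None)
out = apply_cli_overrides(cfg, args)
# out is cfg; epochs overridden to 10, batch_size left at 128
```

Returning the same mutated object (rather than a copy) matches the "mutated config (same object)" contract documented above.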

set_seed(seed)

Identical to src.auto_trainer.set_seed().

Parameters:

seed (int) – Seed value to apply.
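A minimal sketch of what a set_seed helper conventionally does (the numpy/torch calls are hedged behind imports since this doc does not spell out which RNGs are seeded):

```python
import os
import random

def set_seed(seed: int) -> None:
    """Seed every RNG a training run typically touches, for reproducibility."""
    random.seed(seed)
    os.environ["PYTHONHASHSEED"] = str(seed)
    try:
        import numpy as np
        np.random.seed(seed)
    except ImportError:
        pass
    try:
        import torch
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
    except ImportError:
        pass

set_seed(42)
a = random.random()
set_seed(42)
b = random.random()  # same value as a: the stream restarts from the seed
```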

GNN megaconf defaults

training:
  optimizer: PESG
  optimizer_kwargs: {}
  loss: AUCMLoss
  loss_kwargs: {}
  SEED: 42
  batch_size: 128
  eval_batch_size: 128
  sampling_rate: 0.5
  epochs: 50
  decay_epochs: []
  decay_factor: 10.0
  num_workers: 2
  output_path: ./output
  resume_from_checkpoint: true
  save_checkpoint_every: 5
  project_name: libauc
  experiment_name: run_auto_gnn
  verbose: 1
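Assuming a conventional step schedule (the learning rate is divided by decay_factor at each epoch listed in decay_epochs — an assumption about the scheduler, not something this doc confirms), the interplay of the two keys can be sketched as:

```python
def lr_at_epoch(base_lr, epoch, decay_epochs, decay_factor):
    """Step schedule: divide the learning rate by decay_factor
    once for every decay epoch already reached."""
    n_decays = sum(1 for e in decay_epochs if epoch >= e)
    return base_lr / (decay_factor ** n_decays)

# With hypothetical decay_epochs: [30, 40] and the default decay_factor: 10.0
lrs = [lr_at_epoch(1.0, e, [30, 40], 10.0) for e in (0, 30, 40)]
# → [1.0, 0.1, 0.01]
```

With the default decay_epochs: [] the schedule never decays and the learning rate stays constant.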

automax:
  deterministic: true
  n_trials: 5
  n_configs: 1
  SEED: 42
  name: automax_gnn_search
  output_directory: ./automax_output
  overwrite: true

dataset:
  name: ""
  kwargs: {}
  eval_splits: [val]

model:
  name: gcn
  num_tasks: 1
  emb_dim: 256
  num_layers: 5

metrics: [AUROC]
metric_kwargs: []
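A config_file passed on the CLI only needs to override the keys that differ from these defaults; everything else is filled in from the megaconf. A hypothetical minimal example (the dataset name and the overridden values are placeholders, not recommendations):

```yaml
# config.yaml — override only what differs from the megaconf defaults
dataset:
  name: ogbg-molhiv        # placeholder; any supported graph dataset
  eval_splits: [val, test]

model:
  name: gcn
  emb_dim: 128

training:
  epochs: 100
  experiment_name: molhiv_gcn_search
```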