
AdaSCALE: Adaptive Scaling for OOD Detection

This codebase provides a PyTorch implementation of:

AdaSCALE: Adaptive Scaling for OOD Detection
Sudarshan Regmi

Illustration of AdaSCALE

Comparison with the fixed scaling procedure

Abstract

The ability of a deep learning model to recognize when a sample falls outside its learned distribution is critical for safe and reliable deployment. Recent state-of-the-art out-of-distribution (OOD) detection methods leverage activation shaping to improve the separation between in-distribution (ID) and OOD inputs. These approaches resort to sample-specific scaling but apply a static percentile threshold across all samples regardless of their nature, resulting in suboptimal ID-OOD separability. In this work, we propose AdaSCALE, an adaptive scaling procedure that dynamically adjusts the percentile threshold based on a sample's estimated OOD likelihood. This estimation leverages our key observation: OOD samples exhibit significantly more pronounced activation shifts at high-magnitude activations under minor perturbation compared to ID samples. AdaSCALE enables stronger scaling for likely ID samples and weaker scaling for likely OOD samples, yielding highly separable energy scores. Our approach achieves state-of-the-art OOD detection performance, outperforming the latest rival OptFS by 14.94% on near-OOD and 21.67% on far-OOD datasets in the average FPR@95 metric on the ImageNet-1k benchmark across eight diverse architectures.
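
The idea can be conveyed in a few lines of PyTorch. The snippet below is an illustrative sketch only, not the paper's exact formulation: the perturbation (random input noise), the shift statistic, the batch-level mapping from shift to percentile, and the hyperparameters `eps`, `p_min`, `p_max`, and `top_frac` are simplified stand-ins, and `backbone`/`fc` denote any feature extractor returning pooled penultimate activations and its linear classification head. Refer to the paper and the released postprocessors for the actual procedure.

```python
import torch

@torch.no_grad()
def adaptive_scale_energy(backbone, fc, x,
                          eps=0.005, p_min=85.0, p_max=98.0, top_frac=0.1):
    """Sketch of adaptive, per-sample percentile scaling (illustrative only)."""
    # Penultimate features for the clean input and a slightly perturbed copy.
    # (Random noise is a stand-in; the paper's perturbation may differ.)
    feat = backbone(x).clamp(min=0)                               # (B, D)
    feat_pert = backbone(x + eps * torch.randn_like(x)).clamp(min=0)

    # Shift at high-magnitude activations: OOD samples tend to shift more.
    k = max(1, int(top_frac * feat.shape[1]))
    top_val, top_idx = feat.topk(k, dim=1)
    shift = (feat_pert.gather(1, top_idx) - top_val).abs().sum(1) \
            / (top_val.sum(1) + 1e-8)

    # Map the shift to a per-sample percentile: small shift (likely ID) ->
    # high percentile -> stronger scaling; large shift (likely OOD) -> weaker.
    # (Batch min-max normalization here is a simplification.)
    s = (shift - shift.min()) / (shift.max() - shift.min() + 1e-8)
    pct = p_max - s * (p_max - p_min)                             # (B,)

    # Percentile-based scaling factor in the spirit of SCALE-style methods:
    # total activation mass over the mass above the per-sample percentile.
    sorted_feat, _ = feat.sort(dim=1, descending=True)
    csum = sorted_feat.cumsum(dim=1)
    n_keep = ((1.0 - pct / 100.0) * feat.shape[1]).long().clamp(min=1)
    top_sum = csum.gather(1, (n_keep - 1).unsqueeze(1)).squeeze(1)
    factor = feat.sum(1) / (top_sum + 1e-8)

    logits = fc(feat * factor.unsqueeze(1))
    return torch.logsumexp(logits, dim=1)   # energy score (higher = more ID-like)
```

The only point the sketch is meant to make is that the percentile is sample-dependent: likely-ID samples receive stronger scaling and likely-OOD samples weaker scaling, which is what separates the resulting energy scores.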

Check out our other works:

t2fnorm
reweightood
ascood

Follow the official OpenOOD instructions to complete the setup.

pip install git+https://github.com/Jingkang50/OpenOOD

Evaluation setup

Example Scripts

Use the following scripts for inference with the AdaSCALE-A postprocessor on different datasets:

  • CIFAR-10:
    bash scripts/ood/adascale_a/cifar10_train_adascale.sh
    bash scripts/ood/adascale_a/cifar10_test_adascale.sh
  • CIFAR-100:
    bash scripts/ood/adascale_a/cifar100_train_adascale.sh
    bash scripts/ood/adascale_a/cifar100_test_adascale.sh
  • ImageNet-200:
    bash scripts/ood/adascale_a/imagenet200_train_adascale.sh
    bash scripts/ood/adascale_a/imagenet200_test_adascale.sh
  • ImageNet-1k:
    bash scripts/ood/adascale_a/imagenet_train_adascale.sh
    bash scripts/ood/adascale_a/imagenet_test_adascale.sh

Please see the ./scripts/ood/adascale_l/ folder for the AdaSCALE-L variant.
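
If you prefer running evaluation programmatically rather than via the shell scripts, OpenOOD v1.5 provides an Evaluator API. The snippet below is a rough sketch: the checkpoint path and the postprocessor name "adascale_a" are assumptions for illustration; the test scripts above encode the actual configuration.

```python
import torch
from openood.evaluation_api import Evaluator
from openood.networks import ResNet18_32x32

# Load a CIFAR-10 classifier; the checkpoint path is a placeholder for one of
# the Google Drive checkpoints mentioned below.
net = ResNet18_32x32(num_classes=10)
net.load_state_dict(torch.load("./results/cifar10_resnet18_adascale/best.ckpt"))
net.cuda().eval()

# "adascale_a" is an assumed postprocessor identifier; check this repo's
# postprocessor registry / test scripts for the exact name.
evaluator = Evaluator(
    net,
    id_name="cifar10",
    data_root="./data",
    preprocessor=None,               # default CIFAR-10 preprocessing
    postprocessor_name="adascale_a",
    batch_size=200,
    num_workers=2,
)

metrics = evaluator.eval_ood(fsood=False)  # AUROC / FPR@95 per near- and far-OOD split
print(metrics)
```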

Please see the results folder for OpenOOD v1.5 benchmark results.

Please refer to Google Drive for access to the models we trained on the CIFAR-10/100 datasets.

Results

  • AdaSCALE's generalization on the ImageNet-1k benchmark

  • AdaSCALE's compatibility with ISH

  • AdaSCALE's competitiveness on CIFAR benchmarks

  • Adaptive percentile vs. static percentile

  • AdaSCALE's efficacy with limited ID data

Please consider citing this work if you find it useful:

@misc{regmi2025adascaleadaptivescalingood,
      title={AdaSCALE: Adaptive Scaling for OOD Detection},
      author={Sudarshan Regmi},
      year={2025},
      eprint={2503.08023},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2503.08023},
}

Acknowledgment

This codebase builds upon OpenOOD.
