Revealing Inherent and Counterintuitive Sensitivities of Out-Of-Distribution Detection Methods
Title language:
English
Original book title:
Out Of Distribution Generalization in Computer Vision Workshop
Original abstract:
Out-of-distribution (OOD) detection identifies samples outside the data distribution used to train a machine learning model and is crucial in safety-critical domains like autonomous driving.
While neural network robustness has advanced, its effect on OOD detectors is less studied.
To address dataset limitations caused by unknown preprocessing artifacts, we introduce Shapetastic, a framework for generating annotated images, and ShapetasticOOD, a novel synthetic dataset created with it.
We propose incorporating robustness into OOD detection benchmarks through image interventions such as rotation, resizing, and compression.
Our experiments reveal inherent and counterintuitive sensitivities in state-of-the-art OOD detectors, highlighting gaps in current research.
Code and dataset are available at https://github.com/chuber1986/ood-robustness
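The abstract does not specify how the image interventions are implemented; as an illustration only, a minimal pure-Python sketch of rotation, resizing, and a crude quantization stand-in for lossy compression (all function names hypothetical, images represented as nested lists of intensities) might look like:

```python
def rotate90(img):
    # img: list of rows (each row a list of pixel values); rotate 90 degrees clockwise.
    return [list(row) for row in zip(*img[::-1])]

def resize_nearest(img, new_h, new_w):
    # Nearest-neighbor resampling to a new height and width.
    h, w = len(img), len(img[0])
    return [[img[r * h // new_h][c * w // new_w] for c in range(new_w)]
            for r in range(new_h)]

def quantize(img, levels=4):
    # Crude stand-in for compression artifacts: reduce intensity resolution.
    step = 256 // levels
    return [[(p // step) * step for p in row] for row in img]
```

Applied to OOD inputs before scoring, such interventions probe whether a detector's decisions are stable under transformations that should not change the in/out-of-distribution label.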