On the Robustness of Out-of-Distribution Detection Methods for Camera-based Systems
Language of the talk title:
English
Original conference title:
Asilomar Conference on Signals, Systems, and Computers
Language of the conference title:
English
Original abstract:
Out-of-distribution (OOD) detection refers to recognizing inputs that lie outside the distribution a machine learning model was exposed to during training. In safety-critical domains such as autonomous driving, OOD detection is paramount for enhancing the reliability and safety of machine learning systems. To investigate the robustness of OOD detection methods, we conduct experiments tailored to camera-based autonomous driving scenarios, focusing on realistic challenges these systems may encounter. Our experimental setup benchmarks various types of corruption, including image sensor degradation, lens contamination, adverse weather conditions, and motion blur.
The findings reveal intrinsic weaknesses across all tested state-of-the-art OOD detection methods. Unexpectedly, even single-pixel alterations corresponding to image sensor degradation over time can result in notable changes in detection performance.
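To illustrate how a single-pixel alteration can flip an OOD decision, the following is a minimal, self-contained sketch (not the paper's actual setup): it uses a toy linear classifier with hypothetical weights and the common maximum-softmax-probability (MSP) score as the OOD criterion, simulating a "hot pixel" by overwriting one input value.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def msp_score(logits):
    """Maximum softmax probability: higher means 'more in-distribution'."""
    return max(softmax(logits))

# Toy linear "classifier" over a flattened 4-pixel image.
# Weights are hypothetical and chosen purely for illustration.
W = [[1.5, -0.2, -0.3, 0.3],
     [-0.4, 0.8, 0.25, -0.1],
     [0.1, 0.1, 0.0, 0.5]]

def logits(x):
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in W]

x = [1.0, 0.2, 0.4, 0.6]          # clean input
score_clean = msp_score(logits(x))

# Single-pixel alteration, e.g. a stuck/hot pixel from sensor degradation.
x_hot = list(x)
x_hot[2] = 5.0
score_hot = msp_score(logits(x_hot))

tau = 0.6  # hypothetical detection threshold
print(f"clean MSP: {score_clean:.3f} -> in-distribution: {score_clean >= tau}")
print(f"hot-pixel MSP: {score_hot:.3f} -> in-distribution: {score_hot >= tau}")
```

Here the clean input scores above the threshold while the hot-pixel variant falls below it, so the detector's verdict changes even though only one pixel differs; the real experiments in the abstract probe this kind of fragility on actual camera models and OOD methods.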