A Geometric Method for Improved Uncertainty Estimation in Real-time

Gabriella Chouraqui, Liron Cohen, Gil Einziger, Liel Leman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Machine learning classifiers are probabilistic in nature and thus inevitably involve uncertainty. Predicting the probability that a specific input is classified correctly is called uncertainty (or confidence) estimation and is crucial for risk management. Post-hoc calibration can improve a model's uncertainty estimates without retraining and without changing the model. Our work puts forward a geometry-based approach to uncertainty estimation. Roughly speaking, we use the geometric distance of the current input from the existing training inputs as a signal for estimating uncertainty, and then calibrate that signal (instead of the model's estimate) using standard post-hoc calibration techniques. We show that our method yields better uncertainty estimates than recently proposed approaches through an extensive evaluation on multiple datasets and models. In addition, we demonstrate that our approach can be applied in near real-time. Our code is available on our GitHub [Leman and Chouraqui, 2022].
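The abstract only sketches the idea, so the following is a minimal, hypothetical illustration of the general recipe it describes: use distance to the training set as an uncertainty signal and calibrate that signal post-hoc. The choice of nearest-neighbor distance, the parameter k, and isotonic regression as the calibrator are assumptions for illustration; the paper's actual distance measure and calibration method may differ.

```python
# Hypothetical sketch: distance to the training set as an uncertainty signal,
# calibrated post-hoc (here with isotonic regression) on a held-out set.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.isotonic import IsotonicRegression

def fit_geometric_calibrator(X_train, X_val, val_is_correct, k=1):
    """Fit a mapping from geometric distance to probability of correctness.

    X_train        : training inputs, shape (n_train, d)
    X_val          : held-out inputs, shape (n_val, d)
    val_is_correct : 1 if the classifier predicted the held-out input correctly, else 0
    """
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    # Signal: mean distance to the k nearest training inputs (larger -> less certain).
    dist = nn.kneighbors(X_val)[0].mean(axis=1)
    # Calibrate the distance signal itself, not the model's own confidence.
    calibrator = IsotonicRegression(increasing=False, out_of_bounds="clip")
    calibrator.fit(dist, val_is_correct)
    return nn, calibrator

def estimate_confidence(nn, calibrator, X_new, k=1):
    """Return a calibrated probability that the classifier is correct on X_new."""
    dist = nn.kneighbors(X_new, n_neighbors=k)[0].mean(axis=1)
    return calibrator.predict(dist)
```

Under these assumptions, near real-time use would amount to a nearest-neighbor query plus a cheap calibrator lookup per input; an approximate nearest-neighbor index could replace the exact search if latency matters.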

Original language: English
Title of host publication: Proceedings of the 38th Conference on Uncertainty in Artificial Intelligence, UAI 2022
Publisher: Association For Uncertainty in Artificial Intelligence (AUAI)
Pages: 422-432
Number of pages: 11
ISBN (Electronic): 9781713863298
State: Published - 1 Jan 2022
Event: 38th Conference on Uncertainty in Artificial Intelligence, UAI 2022 - Eindhoven, Netherlands
Duration: 1 Aug 2022 - 5 Aug 2022

Conference

Conference: 38th Conference on Uncertainty in Artificial Intelligence, UAI 2022
Country/Territory: Netherlands
City: Eindhoven
Period: 1/08/22 - 5/08/22

ASJC Scopus subject areas

  • Artificial Intelligence
