Authors:
Mikhaël Presley Kibinda-Moukengue, Alexandre Baussard and Pierre Beauseroy
Affiliation:
Computer Science and Digital Society Laboratory (LIST3N), Université de Technologie de Troyes, Troyes, France
Keyword(s):
Deep Neural Networks, Out-of-Distribution, Detection, Statistical Hypothesis Tests.
Abstract:
Artificial intelligence systems greatly aid decision-making in a number of industries, including environmental management, transportation, and public health. Nonetheless, to perform well, these systems must operate under certain usage conditions. For instance, the data fed into a classification neural network must come from the same distribution as the training data in order to maintain the performance measured during testing. In practice, however, this condition is not always met and is not easy to guarantee. In particular, for image recognition, it is possible to submit images that do not contain any learned class and still receive a confident response from the network. This paper presents an approach to out-of-distribution observation detection applied to deep neural networks (DNNs) for image classification, called DNN Layers Features Reduction for Out-Of-Distribution Detection (DROOD). The principle of DROOD is to construct a decision statistic by successively synthesizing information from the features of all the intermediate layers of the classification network. The method is adaptable to any DNN architecture, and experiments show results that outperform reference methods.
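The idea of reducing per-layer features to scores and combining them into one decision statistic can be sketched as follows. This is a minimal illustration, not the paper's actual method: the per-layer reduction (a standardized distance to the training-feature mean) and the synthesis step (a simple mean over layers) are assumptions standing in for DROOD's successive synthesis, and all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical intermediate-layer features of in-distribution training data:
# three "layers" with different feature dimensions.
train_feats = [rng.normal(0.0, 1.0, size=(500, d)) for d in (64, 32, 16)]

def layer_score(feats, train):
    # One possible per-layer reduction (an assumption, not the paper's
    # statistic): distance of a sample's features to the training mean,
    # scaled by the per-dimension standard deviation.
    mu = train.mean(axis=0)
    sd = train.std(axis=0) + 1e-8
    return np.linalg.norm((feats - mu) / sd, axis=-1)

def ood_statistic(sample_feats, train_feats):
    # Combine the per-layer scores into a single decision statistic;
    # a plain mean over layers stands in for the synthesis step here.
    scores = [layer_score(f, t) for f, t in zip(sample_feats, train_feats)]
    return np.mean(scores, axis=0)

# An in-distribution sample versus a shifted (out-of-distribution) sample.
ind = [rng.normal(0.0, 1.0, size=(1, d)) for d in (64, 32, 16)]
ood = [rng.normal(3.0, 1.0, size=(1, d)) for d in (64, 32, 16)]

s_ind = ood_statistic(ind, train_feats)
s_ood = ood_statistic(ood, train_feats)
```

A threshold on the resulting statistic (for example, one calibrated on held-out in-distribution data) then yields an accept/reject decision: the shifted sample should receive a markedly higher score than the in-distribution one.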