Authors:
Shusaku Uemura, Kazuhide Fukushima and Shinsaku Kiyomoto
Affiliation:
KDDI Research, Inc., Saitama, Japan
Keyword(s):
Polynomial Approximation, Sigmoid Function, Fully Homomorphic Encryption, Privacy-Preserving Neural Network.
Abstract:
Artificial intelligence and data analysis have recently attracted attention, but privacy is a serious concern when sensitive data are analyzed. Privacy-preserving neural networks (PPNNs) solve this problem, since they can perform inference without learning any information about the input. PPNNs promote the analysis of sensitive or confidential data and enable collaboration among companies by combining their data without explicitly sharing them. Fully homomorphic encryption is a promising method for realizing PPNNs. However, it cannot easily evaluate non-polynomial functions. Thus, polynomial approximations of activation functions are required, and much research has been conducted on this topic. Existing research has focused on a fixed domain to improve approximation accuracy. In this paper, we compare seven methods in total, over several polynomial degrees, for approximating the sigmoid function commonly used in neural networks. We focus on the approximation errors beyond the domain used for approximation, which have been dismissed but may affect the accuracy of PPNNs. Our results reveal the differences between the methods and degrees, which helps determine a suitable method for PPNNs. We also found that the behavior of the approximations beyond the domain differs depending on the parity of the degree, and we clarify the cause of this difference.
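The core issue the abstract describes can be illustrated with a minimal sketch: fit a polynomial to the sigmoid on a fixed interval and then evaluate it outside that interval. The fitting method (least squares via NumPy's `polyfit`), the domain [-8, 8], and the degree 7 below are illustrative assumptions, not the paper's actual experimental settings or any of its seven compared methods.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative choices (not from the paper): degree-7 least-squares fit
# of the sigmoid on the fixed approximation domain [-8, 8].
degree = 7
xs = np.linspace(-8.0, 8.0, 1000)
coeffs = np.polyfit(xs, sigmoid(xs), degree)
p = np.poly1d(coeffs)

# Inside the fitted domain the error is small.
inside = np.max(np.abs(p(xs) - sigmoid(xs)))

# Beyond the domain the polynomial diverges, since any non-constant
# polynomial is unbounded as |x| grows, while the sigmoid saturates.
outside = abs(p(12.0) - sigmoid(12.0))

print(f"max error on [-8, 8]: {inside:.4f}")
print(f"error at x = 12:      {outside:.4f}")
```

The leading term also hints at the parity effect the abstract mentions: an odd-degree polynomial runs off toward opposite infinities on the two sides of the domain, while an even-degree one runs off in the same direction on both sides.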