Manuscript received December 23, 2023; revised March 1, 2024; accepted March 13, 2024; published August 6, 2024
Abstract—Deep neural network-based machine learning algorithms are widely used across different sectors and produce excellent results. However, their use requires access to private, often confidential, and sensitive information (financial, medical, etc.). This calls for precise measures and particular attention to data security and confidentiality. In this paper, we propose a new solution to this problem: running a proposed Convolutional Neural Network (CNN) model on encrypted data within the constraints of homomorphic encryption techniques. Specifically, we focus on approximating the activation functions ReLU, Sigmoid, and Tanh, which are key components of CNNs. We begin by developing new low-degree polynomials, which are essential for practical Homomorphic Encryption (HE). The activation functions are replaced by these polynomials, which are based on the Beta function and its primitive. To ensure that the data stays within a given range, the next step is to build a new CNN model using batch normalization. Finally, our methodology and the effectiveness of the proposed strategy are evaluated on MNIST and CIFAR-10. The experimental results confirm the efficiency of the proposed approach.

Keywords—Convolutional Neural Network (CNN), homomorphic encryption, activation function, Beta function, batch normalization

Cite: Hanen Issaoui, Asma ElAdel, and Mourad Zaied, "Intelligent Beta-Based Polynomial Approximation of Activation Functions for a Robust Data Encryption System," Journal of Image and Graphics, Vol. 12, No. 3, pp. 259-268, 2024.

Copyright © 2024 by the authors. This is an open access article distributed under the Creative Commons Attribution License (CC BY-NC-ND 4.0), which permits use, distribution, and reproduction in any medium, provided that the article is properly cited, the use is non-commercial, and no modifications or adaptations are made.
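As a rough illustration of the approximation step summarized in the abstract, the sketch below fits a low-degree polynomial to ReLU over a bounded interval using a plain least-squares fit (NumPy's polyfit). This is only a stand-in: the paper's polynomials are derived from the Beta function and its primitive, and the degree and interval used here are assumed values for illustration, not the authors' settings.

```python
import numpy as np

# Illustrative only: least-squares fit of a low-degree polynomial to ReLU
# on a bounded interval. The paper instead derives its polynomials from
# the Beta function and its primitive; this polyfit is a generic stand-in.
x = np.linspace(-5.0, 5.0, 1001)   # assumed range; batch normalization keeps inputs bounded
relu = np.maximum(x, 0.0)

deg = 3                            # assumed low degree, as needed for efficient HE evaluation
coeffs = np.polyfit(x, relu, deg)  # least-squares coefficients, highest power first
poly = np.poly1d(coeffs)

print("coefficients:", coeffs)
print("max abs error on [-5, 5]:", np.max(np.abs(poly(x) - relu)))
```

Once such a polynomial replaces the activation, inference reduces to additions and multiplications, which are exactly the operations homomorphic encryption schemes support on ciphertexts; keeping the degree low limits multiplicative depth and noise growth.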