Automatically Avoiding Overfitting in Deep Neural Networks by Using Hyper-Parameters Optimization Methods

Authors

  • Zahraa Saddi Kadhim, University of Technology, https://orcid.org/0000-0003-0869-6148
  • Hasanen S. Abdullah, University of Technology
  • Khalil I. Ghathwan, University of Technology

DOI:

https://doi.org/10.3991/ijoe.v19i05.38153

Keywords:

Deep Learning, Hyper-Parameters Optimization, Regularization, Overfitting

Abstract


Overfitting is a persistent problem in deep learning. It yields classification results that appear highly accurate but are misleading, so systems that rely on prediction or recognition and are sensitive to accuracy will produce untrustworthy results if the problem is not fully resolved. Previous approaches have lessened the issue but have not eliminated it entirely while preserving the essential information in the data. This paper proposes a novel approach that preserves critical information while eliminating overfitting completely. Numeric and image datasets are employed with two types of networks: convolutional and deep neural networks. Three regularization techniques (L1, L2, and dropout) are combined with two optimization algorithms (Bayesian optimization and random search) that select the hyperparameters automatically, with the regularization technique itself treated as one of the automatically selected hyperparameters. In addition to completely eliminating the overfitting issue, the results show that accuracy on the image dataset was 97.82% and 90.72% using Bayesian optimization and random search, respectively, and 95.3% and 96.5% when the same algorithms were applied to the numeric dataset.
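
The hyperparameter search described in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' published code: it assumes Keras Tuner as the search library, exposes the choice of regularizer (L1, L2, or none), its strength, and the dropout rate as searchable hyperparameters, and lets either Bayesian optimization or random search select them automatically. Layer sizes, value ranges, and the x_train/y_train placeholders are illustrative.

import keras_tuner as kt
from tensorflow import keras
from tensorflow.keras import layers, regularizers

def build_model(hp):
    # The regularization technique itself is a searchable hyperparameter.
    reg_kind = hp.Choice("regularizer", ["l1", "l2", "none"])
    strength = hp.Float("reg_strength", 1e-5, 1e-2, sampling="log")
    reg = {"l1": regularizers.l1(strength),
           "l2": regularizers.l2(strength),
           "none": None}[reg_kind]

    model = keras.Sequential([
        layers.Flatten(input_shape=(28, 28)),  # image input; shape is illustrative
        layers.Dense(hp.Int("units", 32, 256, step=32),
                     activation="relu",
                     kernel_regularizer=reg),
        layers.Dropout(hp.Float("dropout", 0.0, 0.5, step=0.1)),  # dropout rate is also searched
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Bayesian optimization; swap in kt.RandomSearch with the same arguments
# to compare the two search strategies over an identical search space.
tuner = kt.BayesianOptimization(build_model, objective="val_accuracy",
                                max_trials=20, overwrite=True,
                                directory="hpo", project_name="overfit_sketch")

# x_train / y_train stand in for the image or numeric training data.
# tuner.search(x_train, y_train, validation_split=0.2, epochs=10)
# best_model = tuner.get_best_models(num_models=1)[0]

Because the same build_model function can be passed unchanged to kt.RandomSearch, the two optimizers explore an identical search space, which is what makes a like-for-like comparison of the kind reported in the abstract straightforward.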

Author Biographies

Zahraa Saddi Kadhim, University of Technology

Hasanen S. Abdullah, University of Technology

Khalil I. Ghathwan, University of Technology

Published

2023-04-27

How to Cite

Kadhim, Z. S., Abdullah, H. S., & Ghathwan, K. I. (2023). Automatically Avoiding Overfitting in Deep Neural Networks by Using Hyper-Parameters Optimization Methods. International Journal of Online and Biomedical Engineering (iJOE), 19(05), pp. 146–162. https://doi.org/10.3991/ijoe.v19i05.38153

Issue

Vol. 19 No. 05 (2023)

Section

Papers