A New Approach for Determining Hyperparameters in Artificial Neural Networks: Enhanced Black Hole Optimization Algorithm


Mehmet Bilen


Artificial Neural Networks (ANNs), simplified models of the neurons in our brain, have long been used to bring human problem-solving skills to computers. Although advances in technology have allowed these networks to run faster, the need to run them better has created a challenging research area. Determining the hyperparameters of an ANN is therefore of critical importance. In this study, an approach inspired by black holes was developed to determine ANN hyperparameters. The Black Hole Algorithm (BHA), used in the literature as an optimization algorithm, was applied to hyperparameter selection. The disadvantages of the BHA were identified and addressed, and a new approach, the Enhanced Black Hole Algorithm (EBHA), was proposed. When the performance values obtained in testing with the parameters selected by this algorithm were compared with those of other algorithms frequently used in the literature, the proposed method achieved the best performance.
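For readers unfamiliar with the baseline method, the standard BHA (Hatamlou, 2013) can be sketched as follows: candidate solutions ("stars") move toward the current best solution (the "black hole"), and any star that crosses the event horizon is swallowed and replaced by a new random star. The sketch below applies this idea to a toy two-dimensional hyperparameter search; the objective function, bounds, and parameter names are illustrative assumptions, not the paper's actual experimental setup or the proposed EBHA enhancements.

```python
import random

def objective(x):
    # Hypothetical stand-in for an ANN validation loss; pretend the best
    # hyperparameters (e.g. learning rate, hidden units) are (0.01, 64).
    return (x[0] - 0.01) ** 2 + ((x[1] - 64) / 64) ** 2

LOW, HIGH = [0.0001, 1.0], [1.0, 256.0]

def random_star():
    # Draw a candidate uniformly within the search bounds.
    return [random.uniform(lo, hi) for lo, hi in zip(LOW, HIGH)]

def bha(n_stars=20, iterations=100, seed=0):
    random.seed(seed)
    stars = [random_star() for _ in range(n_stars)]
    for _ in range(iterations):
        fitness = [objective(s) for s in stars]
        bh = stars[fitness.index(min(fitness))][:]   # black hole = best star
        bh_fit = min(fitness)
        # Event-horizon radius: the black hole's fitness relative to the
        # total fitness of the population (as defined in the original BHA).
        radius = bh_fit / (sum(fitness) + 1e-12)
        for i, s in enumerate(stars):
            if s == bh:
                continue
            # Move each star a random fraction of the way to the black hole,
            # clamping back into the search bounds.
            stars[i] = [
                min(max(v + random.random() * (b - v), lo), hi)
                for v, b, lo, hi in zip(s, bh, LOW, HIGH)
            ]
            # Stars inside the event horizon are swallowed and respawned.
            dist = sum((a - b) ** 2 for a, b in zip(stars[i], bh)) ** 0.5
            if dist < radius:
                stars[i] = random_star()
    fitness = [objective(s) for s in stars]
    return stars[fitness.index(min(fitness))]

best = bha()
```

Because the black hole is always the best solution found so far, the search is greedy around the incumbent; the respawning step is the algorithm's only diversity mechanism, which is one of the weaknesses an enhanced variant such as the EBHA would target.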

Article Details

How to Cite
BILEN, Mehmet. A New Approach for Determining Hyperparameters in Artificial Neural Networks: Enhanced Black Hole Optimization Algorithm. Journal of Multidisciplinary Developments, [S.l.], v. 6, n. 1, p. 18-28, July 2021. ISSN 2564-6095. Available at: <http://jomude.com/index.php/jomude/article/view/90>. Date accessed: 22 Sep. 2021.
Natural Sciences - Regular Research Paper

