Journal Article Details
Applied Sciences
Simple Deterministic Selection-Based Genetic Algorithm for Hyperparameter Tuning of Machine Learning Models
Adeiza James Onumanyi [1]; Mutiu Adesina Adegboye [2]; Ime Jarlath Umoh [3]; Ismail Damilola Raji [3]; Habeeb Bello-Salau [3]; Ahmed Tijani Salawudeen [4]
[1] Advanced Internet of Things, Next Generation Enterprises and Institutions, Council for Scientific and Industrial Research, Pretoria 0001, South Africa
[2] Communications and Autonomous Systems Group, Robert Gordon University, Aberdeen AB10 7GJ, UK
[3] Department of Computer Engineering, Ahmadu Bello University, Zaria 810107, Nigeria
[4] Department of Electrical and Electronics Engineering, University of Jos, Jos 930222, Nigeria
Keywords: algorithm; convolutional neural network; hyperparameter; random forest; machine learning; metaheuristic
DOI: 10.3390/app12031186
Source: DOAJ
【 Abstract 】

Hyperparameter tuning is critical for the effective deployment of most machine learning (ML) algorithms. It is used to find the optimal hyperparameter settings of an ML algorithm in order to improve its overall output performance. To this end, several optimization strategies have been studied for fine-tuning the hyperparameters of many ML algorithms, especially in the absence of model-specific information. However, because most ML training procedures require a significant amount of computational time and memory, it is frequently necessary to build an optimization technique that converges within a small number of fitness evaluations. Consequently, a simple deterministic selection genetic algorithm (SDSGA) is proposed in this article. The SDSGA is realized by ensuring that both the chromosomes and their accompanying fitness values in the original genetic algorithm are selected in an elitist-like manner. We assessed the SDSGA on a variety of mathematical test functions and then used it to optimize the hyperparameters of two well-known ML models, namely the convolutional neural network (CNN) and the random forest (RF) algorithm, applied to the MNIST and UCI classification datasets. The SDSGA's efficiency was compared with that of Bayesian optimization (BO) and three other popular metaheuristic optimization algorithms (MOAs): the genetic algorithm (GA), particle swarm optimization (PSO), and biogeography-based optimization (BBO). The results show that the SDSGA outperformed the other MOAs on 11 of the 17 benchmark functions considered in our study. When optimizing the hyperparameters of the two ML models, it achieved marginally better accuracy than the other methods while requiring less computation time.
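The abstract does not include the authors' implementation, but the core idea, a GA whose selection step deterministically retains the best (chromosome, fitness) pairs from the combined parent and offspring pool, can be illustrated with a short sketch. The following Python example is an assumption-laden illustration rather than the paper's code: the search space (n_estimators, max_depth), population size, generation count, and mutation rate are all hypothetical, and scikit-learn's RandomForestClassifier on the digits dataset stands in for the RF model tuned in the study.

# Minimal sketch of a GA with deterministic, elitist-like selection
# for RF hyperparameter tuning. Not the authors' code; all ranges
# and settings below are illustrative assumptions.
import random
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

random.seed(0)
X, y = load_digits(return_X_y=True)

# Hypothetical search space for two RF hyperparameters.
BOUNDS = {"n_estimators": (10, 100), "max_depth": (2, 20)}

def random_individual():
    return {k: random.randint(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def fitness(ind):
    # Mean cross-validated accuracy is the fitness to maximize.
    clf = RandomForestClassifier(n_estimators=ind["n_estimators"],
                                 max_depth=ind["max_depth"], random_state=0)
    return cross_val_score(clf, X, y, cv=3).mean()

def crossover(p1, p2):
    # Uniform crossover: each gene is copied from either parent.
    return {k: random.choice((p1[k], p2[k])) for k in BOUNDS}

def mutate(ind, rate=0.2):
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            ind[k] = random.randint(lo, hi)
    return ind

def sdsga_like(pop_size=8, generations=5):
    # Each individual is stored with its fitness, so every chromosome
    # is evaluated exactly once.
    pop = [(ind, fitness(ind))
           for ind in (random_individual() for _ in range(pop_size))]
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            p1, p2 = random.sample(pop, 2)
            child = mutate(crossover(p1[0], p2[0]))
            children.append((child, fitness(child)))
        # Deterministic, elitist-like selection: keep the best pop_size
        # (chromosome, fitness) pairs from the parent + offspring pool.
        pop = sorted(pop + children, key=lambda t: t[1], reverse=True)[:pop_size]
    return pop[0]

best, score = sdsga_like()
print(f"best hyperparameters: {best}, CV accuracy: {score:.3f}")

Caching each fitness value alongside its chromosome, as above, caps the cost at one fitness evaluation per individual, which is the property the abstract emphasizes for expensive ML training loops.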

【 License 】

Unknown   
