Journal Article Details
NEUROCOMPUTING, Volume 415
On hyperparameter optimization of machine learning algorithms: Theory and practice
Article
Yang, Li [1]; Shami, Abdallah [1]
[1] Univ Western Ontario, Dept Elect & Comp Engn, 1151 Richmond St, London, ON N6A 3K7, Canada
Keywords: Hyper-parameter optimization; Machine learning; Bayesian optimization; Particle swarm optimization; Genetic algorithm; Grid search
DOI: 10.1016/j.neucom.2020.07.061
Source: Elsevier
【 Abstract 】

Machine learning algorithms have been used widely in various applications and areas. To fit a machine learning model to different problems, its hyper-parameters must be tuned. Selecting the best hyper-parameter configuration for a machine learning model has a direct impact on the model's performance, and doing so often requires deep knowledge of machine learning algorithms and of the appropriate hyper-parameter optimization techniques. Although several automatic optimization techniques exist, they have different strengths and drawbacks when applied to different types of problems. In this paper, optimizing the hyper-parameters of common machine learning models is studied. We introduce several state-of-the-art optimization techniques and discuss how to apply them to machine learning algorithms. Many available libraries and frameworks developed for hyper-parameter optimization problems are surveyed, and some open challenges of hyper-parameter optimization research are also discussed. Moreover, experiments are conducted on benchmark datasets to compare the performance of different optimization methods and to provide practical examples of hyper-parameter optimization. This survey will help industrial users, data analysts, and researchers develop better machine learning models by identifying the proper hyper-parameter configurations effectively. (C) 2020 Elsevier B.V. All rights reserved.
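As a concrete illustration of the simplest technique named in the keywords, the sketch below tunes a support vector machine with exhaustive grid search via scikit-learn's GridSearchCV. The dataset, model, and hyper-parameter grid here are illustrative assumptions, not the paper's exact experimental setup.

```python
# A minimal grid-search sketch (assumed setup, not the paper's benchmark):
# every combination of candidate hyper-parameter values is evaluated with
# cross-validation, and the best-scoring configuration is reported.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# Candidate hyper-parameter values to search over (illustrative choices).
param_grid = {
    "C": [0.1, 1, 10, 100],
    "kernel": ["linear", "rbf"],
    "gamma": ["scale", 0.001, 0.0001],
}

# GridSearchCV exhaustively fits one model per grid point, using 3-fold
# cross-validated accuracy as the selection criterion.
search = GridSearchCV(SVC(), param_grid, cv=3, scoring="accuracy", n_jobs=-1)
search.fit(X, y)

print("Best hyper-parameters:", search.best_params_)
print("Best cross-validated accuracy:", search.best_score_)
```

The same interface extends to randomized search (RandomizedSearchCV); the more sample-efficient methods the paper also covers, such as Bayesian optimization, particle swarm optimization, and genetic algorithms, are typically provided by dedicated libraries rather than this API.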

【 License 】

Free   

【 Preview 】
Attachment list
File | Size | Format
10_1016_j_neucom_2020_07_061.pdf | 648 KB | PDF