Open Mathematics
Bias-variance decomposition in Genetic Programming
René Doursat [1], Taras Kowaliw [2]
[1] Informatics Research Centre, School of Computing, Mathematics & Digital Technology, Manchester Metropolitan University, John Dalton Building, Chester Street, Manchester M1 5GD, United Kingdom
[2] Institut des Systèmes Complexes Paris Île-de-France (ISC-PIF), Centre National de la Recherche Scientifique (CNRS UPS3611), 113 rue Nationale, 75013 Paris, France
Keywords: analysis of algorithms; bias-variance decomposition; classification; computational learning theory; evolutionary computation; genetic programming; learning and adaptive systems; non-parametric inference; regression; 62G08; 62J10; 68Q32; 68T05; 68W40
DOI: 10.1515/math-2016-0005
Source: DOAJ
【 Abstract 】
We study properties of Linear Genetic Programming (LGP) through several regression and classification benchmarks. In each problem, we decompose the results into bias and variance components, and explore the effect of varying certain key parameters on the overall error and its decomposed contributions. These parameters are the maximum program size, the initial population, and the function set used. We confirm and quantify several insights into the practical usage of GP, most notably that (a) the variance between runs is primarily due to initialization rather than the selection of training samples, (b) parameters can be reasonably optimized to obtain gains in efficacy, and (c) functions detrimental to evolvability are easily eliminated, while functions well-suited to the problem can greatly improve performance—therefore, larger and more diverse function sets are always preferable.
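The decomposition used in the abstract can be illustrated outside of GP. The sketch below is a minimal Monte Carlo estimate of squared bias and variance for a simple learner (a polynomial least-squares fit standing in for an evolved program); the target function, noise level, and sample sizes are illustrative assumptions, not values from the paper. As in the study, variance is measured across independent runs, each with its own training sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Noise-free ground-truth function (illustrative choice, not from the paper)
    return np.sin(x)

def bias_variance(degree, n_runs=200, n_train=30, noise=0.3):
    """Estimate squared bias and variance of a degree-`degree` polynomial
    regressor by averaging predictions over independently drawn training sets."""
    x_test = np.linspace(-3, 3, 50)
    preds = np.empty((n_runs, x_test.size))
    for i in range(n_runs):
        # Fresh training sample per run, with additive Gaussian noise
        x_tr = rng.uniform(-3, 3, n_train)
        y_tr = target(x_tr) + rng.normal(0, noise, n_train)
        coef = np.polynomial.polynomial.polyfit(x_tr, y_tr, degree)
        preds[i] = np.polynomial.polynomial.polyval(x_test, coef)
    mean_pred = preds.mean(axis=0)
    bias2 = np.mean((mean_pred - target(x_test)) ** 2)   # (average model - truth)^2
    variance = np.mean(preds.var(axis=0))                # spread across runs
    return bias2, variance
```

Varying `degree` here plays the role that maximum program size plays in the paper: a too-small model is dominated by bias, a too-large one by variance.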
【 License 】
Unknown