| Journal | International Journal of Information Technology |
|---|---|
| Title | Comparison between XGBoost, LightGBM and CatBoost Using a Home Credit Dataset |
| Author | Essam Al Daoud |
| Keywords | Gradient boosting; XGBoost; LightGBM; CatBoost; home credit |
| DOI | 10.1999/1307-6892/10009954 |
| Subject Classification | Computer Applications |
| Source | World Academy of Science, Engineering and Technology (WASET) |
【 Abstract 】
Gradient boosting methods have proven to be a very important strategy, and many successful machine learning solutions have been developed using XGBoost and its derivatives. The aim of this study is to investigate and compare the efficiency of three gradient boosting methods. The Home Credit dataset, which contains 219 features and 356,251 records, is used in this work. In addition, new features are generated, and several techniques are applied to rank and select the best features. The implementation indicates that LightGBM is faster and more accurate than CatBoost and XGBoost across varying numbers of features and records.
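The abstract describes benchmarking the three libraries on speed and accuracy. The sketch below is not the paper's pipeline; it is a minimal illustration, assuming a synthetic binary-classification task and illustrative hyperparameters, of how such a comparison of training time and AUC can be run with the three libraries' scikit-learn-style interfaces.

```python
# Minimal benchmarking sketch (illustrative only, not the paper's setup):
# train XGBoost, LightGBM and CatBoost on the same split and report
# wall-clock training time and test AUC for each library.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
import xgboost as xgb
import lightgbm as lgb
import catboost as cb

# Synthetic stand-in for a tabular credit dataset; sizes are arbitrary.
X, y = make_classification(n_samples=50_000, n_features=50, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "XGBoost": xgb.XGBClassifier(n_estimators=300),
    "LightGBM": lgb.LGBMClassifier(n_estimators=300),
    "CatBoost": cb.CatBoostClassifier(n_estimators=300, verbose=0),
}

for name, model in models.items():
    start = time.perf_counter()
    model.fit(X_tr, y_tr)
    elapsed = time.perf_counter() - start
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.4f}, training time = {elapsed:.1f}s")
```

On a real comparison such as the one reported in the abstract, the same loop would be run over the prepared Home Credit feature sets with tuned hyperparameters for each library.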
【 License 】
Unknown
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| RO201910289445605ZK.pdf | 413KB | PDF | |