IEEE Access
Adaptive FH-SVM for Imbalanced Classification
Dalian Liu1, Yingjie Tian2, Qi Wang3
[1] Department of Basic Course Teaching, Beijing Union University, Beijing, China; Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, China; School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing, China
Keywords: Focal loss; hinge loss; class imbalance; support vector machines (SVMs)
DOI: 10.1109/ACCESS.2019.2940983
Source: DOAJ
【 Abstract 】
Support vector machines (SVMs) are powerful learning methods that have long been popular among machine learning researchers due to their strong performance on both classification and regression problems. However, the traditional SVM, which uses the Hinge loss, cannot handle class imbalance because it applies the same loss weight to every class. Recently, the Focal loss has been widely used in deep learning to address imbalanced datasets, and its effectiveness has attracted attention in many fields, such as object detection and semantic segmentation. Inspired by the Focal loss, we reconstruct the Hinge loss with the scaling factor of the Focal loss; the resulting loss, called the FH loss, not only deals with class imbalance but also preserves the distinctive properties of the Hinge loss. Because trading off positive and negative accuracy is difficult in imbalanced classification, the FH loss pays more attention to the minority class and to misclassified instances, improving the accuracy of each class and thereby reducing the influence of imbalance. In addition, since the SVM with the FH loss is difficult to solve directly, we propose an improved model with a modified FH loss, called the Adaptive FH-SVM. The algorithm solves the optimization problem iteratively, adaptively updating the FH loss of each instance. Experimental results on 31 binary imbalanced datasets demonstrate the effectiveness of the proposed method.
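For illustration only, the sketch below shows one common way a focal-style modulating factor can be combined with the hinge loss, in the spirit of the FH loss described in the abstract. The sigmoid mapping from margin to confidence, the exponent gamma, and the weight alpha are assumptions made for this sketch, not the paper's exact formulation.

```python
# Minimal sketch: hinge loss scaled by a focal-style modulating factor.
# This is NOT the paper's exact FH loss; the sigmoid confidence proxy,
# gamma, and alpha below are assumptions for illustration.
import numpy as np

def hinge_loss(margin):
    """Standard hinge loss on the margin y * f(x)."""
    return np.maximum(0.0, 1.0 - margin)

def fh_style_loss(margin, gamma=2.0, alpha=1.0):
    """Hinge loss weighted by (1 - p)^gamma, where p is a
    pseudo-probability of correct classification obtained from
    the margin via a sigmoid (an assumption of this sketch)."""
    p = 1.0 / (1.0 + np.exp(-margin))      # confidence proxy
    focal_factor = (1.0 - p) ** gamma      # down-weights easy examples
    return alpha * focal_factor * hinge_loss(margin)

# Example: well-classified points (large positive margin) contribute
# almost nothing, while misclassified points (negative margin) keep
# a large loss, which is the focusing effect the abstract describes.
margins = np.array([2.0, 0.5, -0.5, -2.0])
print(hinge_loss(margins))       # [0.  0.5 1.5 3. ]
print(fh_style_loss(margins))    # easy-example losses are shrunk most
```

Under these assumptions, the modulating factor plays the same role as in the Focal loss: it leaves hard, misclassified instances (typically the minority class) with nearly the full hinge penalty while suppressing the contribution of easy examples.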
【 License 】
Unknown