Journal Article Details
Frontiers in Applied Mathematics and Statistics
Stochastic AUC Optimization Algorithms With Linear Convergence
Siwei Lyu [1], Yiming Ying [2], Michael Natole [2]
[1] Department of Computer Science, University at Albany, State University of New York, Albany, NY, United States; [2] Department of Mathematics and Statistics, University at Albany, State University of New York, Albany, NY, United States
Keywords: AUC maximization; imbalanced data; linear convergence; stochastic optimization; ROC curve
DOI: 10.3389/fams.2019.00030
Source: DOAJ
[Abstract]

Area under the ROC curve (AUC) is a standard metric for measuring classification performance on imbalanced class data. Developing stochastic learning algorithms that maximize AUC rather than accuracy is therefore of practical interest. However, AUC maximization presents a challenge because the learning objective is defined over pairs of instances from opposite classes. Existing methods circumvent this issue at the cost of high space and time complexity. Building on our previous work, which reformulated AUC optimization as a convex-concave saddle point problem, we propose a new stochastic batch learning algorithm for AUC maximization. The key difference from our previous work is that we assume the underlying data distribution is uniform, and we develop a stochastic primal-dual batch algorithm (SPDAM) that achieves a linear convergence rate. We establish the theoretical convergence of SPDAM with high probability and demonstrate its effectiveness on standard benchmark datasets.
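To make the saddle-point idea concrete, the sketch below implements a plain stochastic primal-dual gradient method on the convex-concave AUC objective of the authors' earlier work (square-loss surrogate with primal variables w, a, b and dual variable α). This is an illustrative assumption-laden sketch, not the paper's SPDAM: the mini-batching scheme, step size, and hyperparameters are hypothetical, and no acceleration or strong-convexity structure is exploited, so linear convergence is not claimed for this code.

```python
import numpy as np

def auc_score(scores, labels):
    """Rank-based AUC: fraction of (positive, negative) pairs ranked correctly."""
    pos = scores[labels == 1]
    neg = scores[labels == -1]
    return np.mean(pos[:, None] > neg[None, :])

def spdam_sketch(X, y, eta=0.01, batch=32, epochs=20, seed=0):
    """Hypothetical stochastic primal-dual sketch for the saddle-point
    AUC objective (square-loss surrogate). Gradient descent on (w, a, b),
    gradient ascent on alpha; all hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    p = np.mean(y == 1)                    # positive-class proportion
    w = np.zeros(d)
    a = b = alpha = 0.0
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), max(1, n // batch)):
            xb, yb = X[idx], y[idx]
            s = xb @ w                     # scores w^T x on the mini-batch
            ip = (yb == 1).astype(float)   # indicator y = +1
            im = 1.0 - ip                  # indicator y = -1
            # per-sample gradients of the saddle function, averaged over batch
            gw = (2 * (1 - p) * (s - a) * ip + 2 * p * (s - b) * im
                  + 2 * (1 + alpha) * (p * im - (1 - p) * ip)) @ xb / len(idx)
            ga = np.mean(-2 * (1 - p) * (s - a) * ip)
            gb = np.mean(-2 * p * (s - b) * im)
            gal = np.mean(2 * s * (p * im - (1 - p) * ip)) - 2 * p * (1 - p) * alpha
            # primal descent, dual ascent
            w -= eta * gw
            a -= eta * ga
            b -= eta * gb
            alpha += eta * gal
    return w
```

On a synthetic imbalanced problem (e.g., 40 positives vs. 360 negatives drawn from Gaussians with opposite means), the learned scoring direction `X @ w` yields a high rank-based AUC even though no pairwise loss is ever materialized, which is the point of the saddle-point reformulation.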

[License]

Unknown   
