Conference Paper Details
12th International Conference on Artificial Intelligence and Statistics
Speed and Sparsity of Regularized Boosting
Yongxin Taylor Xi; Zhen James Xiang; Peter J. Ramadge
PID: 120849
Source: CEUR
【 Abstract 】

Boosting algorithms with l1-regularization are of interest because l1 regularization leads to sparser composite classifiers. Moreover, Rosset et al. have shown that for separable data, standard lp-regularized loss minimization results in a margin-maximizing classifier in the limit as regularization is relaxed. For the case p = 1, we extend these results by obtaining explicit convergence bounds on the regularization required to yield a margin within prescribed accuracy of the maximum achievable margin. We derive similar rates of convergence for the ε-AdaBoost algorithm, in the process providing a new proof that ε-AdaBoost is margin maximizing as ε converges to 0. Because both of these known algorithms are computationally expensive, we introduce a new hybrid algorithm, AdaBoost+L1, that combines the virtues of AdaBoost with the sparsity of l1 regularization in a computationally efficient fashion. We prove that the algorithm is margin maximizing and empirically examine its performance.
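For context, a minimal sketch of the l1-regularized loss minimization the abstract refers to, following the Rosset et al. formulation; the notation (base classifiers h_j, coefficient vector β, labels y_i ∈ {−1, +1}, regularization weight λ) is assumed here rather than taken from this record:

\hat{\beta}(\lambda) \;=\; \arg\min_{\beta} \;\sum_{i=1}^{n} \exp\!\Big(-y_i \sum_{j} \beta_j h_j(x_i)\Big) \;+\; \lambda \,\|\beta\|_1,
\qquad
\operatorname{margin}(\beta) \;=\; \min_{i} \frac{y_i \sum_{j} \beta_j h_j(x_i)}{\|\beta\|_1}.

In this notation, the result extended for p = 1 says that on separable data the normalized solution \hat{\beta}(\lambda)/\|\hat{\beta}(\lambda)\|_1 approaches the maximum achievable margin as \lambda \to 0; the paper's contribution is an explicit bound on how small \lambda must be for the margin to lie within a prescribed accuracy of that maximum.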

【 Preview 】
Attachment List
File                                          Size    Format   View
Speed and Sparsity of Regularized Boosting    943KB   PDF      download
Document Metrics
Downloads: 8    Views: 8