Conference Paper Details
16th International Conference on Artificial Intelligence and Statistics
High-dimensional Inference via Lipschitz Sparsity-Yielding Regularizers
Zheng Pan, Changshui Zhang
PID: 121219
Source: CEUR
【 Abstract 】

Nonconvex regularizers are increasingly applied to high-dimensional inference with sparsity prior knowledge. In general, nonconvex regularizers are superior to convex ones in inference, but they suffer from difficulties caused by local optima and heavy computation. A "good" regularizer should perform well in both inference and optimization. In this paper, we prove that some nonconvex regularizers can be such "good" regularizers. They are a family of sparsity-yielding penalties with proper Lipschitz subgradients. These regularizers retain the superiority of nonconvex regularizers in inference: their estimation conditions, based on sparse eigenvalues, are weaker than those of convex regularizers. Meanwhile, if properly tuned, they behave like convex regularizers, since standard proximal methods are guaranteed to yield stationary solutions. These stationary solutions, if sparse enough, are identical to the global solutions. If the solution sequence provided by proximal methods follows a sparse path, the convergence rate to the global optimum is on the order of 1/k, where k is the iteration number.
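The proximal-method scheme the abstract describes can be sketched as follows. This is a minimal illustration, assuming the MCP penalty as one representative sparsity-yielding nonconvex regularizer with a Lipschitz subgradient away from zero; it is not necessarily the exact penalty family studied in the paper, and the function names here are our own.

```python
import numpy as np

def mcp_prox(z, lam, gamma, t):
    """Proximal operator of the MCP penalty (firm thresholding).

    Assumes step size t < gamma. Small inputs are zeroed, moderate
    inputs are shrunk, and large inputs (|z| > gamma*lam) pass through
    unpenalized, which removes the bias of the convex l1 penalty.
    """
    a = np.abs(z)
    return np.where(a <= t * lam, 0.0,
           np.where(a <= gamma * lam,
                    np.sign(z) * (a - t * lam) / (1.0 - t / gamma),
                    z))

def prox_grad(A, y, lam=0.05, gamma=3.0, iters=500):
    """Standard proximal gradient on 0.5*||Ax - y||^2 + MCP(x)."""
    _, p = A.shape
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    t = 1.0 / L                        # fixed step size
    x = np.zeros(p)
    for _ in range(iters):
        grad = A.T @ (A @ x - y)       # gradient of the least-squares term
        x = mcp_prox(x - t * grad, lam, gamma, t)
    return x

# Hypothetical noiseless recovery example: a well-conditioned design
# with a 3-sparse ground truth, where the iterates stay on a sparse path.
np.random.seed(0)
n, p = 100, 20
A = np.random.randn(n, p)
x_true = np.zeros(p)
x_true[[0, 5, 11]] = [1.0, -1.0, 2.0]
y = A @ x_true
x_hat = prox_grad(A, y)
```

Because the true nonzero magnitudes exceed gamma*lam, the MCP fixed point is unbiased and the iterates recover `x_true` essentially exactly, illustrating the abstract's point that a stationary solution of the proximal method can coincide with the global solution when it is sparse enough.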
