Journal Article Details
Dependence Modeling, Volume 1
Prediction of time series by statistical learning: general losses and fast rates
Pierre Alquier [1]; Olivier Wintenberger [2]; Xiaoyin Li [3]
[1] University College Dublin, School of Mathematical Sciences;
[2] Université Paris-Dauphine, CEREMADE;
[3] Université de Cergy, Laboratoire Analyse Géométrie Modélisation;
Keywords: statistical learning theory; time series forecasting; PAC-Bayesian bounds; weak dependence; mixing; oracle inequalities; fast rates; GDP forecasting; 62M20; 60G25; 62M10; 62P20; 65G15; 68Q32; 68T05
DOI: 10.2478/demo-2013-0004
Source: DOAJ
【 Abstract 】

We establish rates of convergence in statistical learning for time series forecasting. Using the PAC-Bayesian approach, slow rates of convergence √(d/n) for the Gibbs estimator under the absolute loss were given in a previous work [7], where n is the sample size and d the dimension of the set of predictors. Under the same weak dependence conditions, we extend this result to any convex Lipschitz loss function. We also identify a condition on the parameter space that ensures similar rates for the classical penalized ERM procedure. We apply this method to quantile forecasting of the French GDP. Under additional conditions on the loss functions (satisfied by the quadratic loss function) and for uniformly mixing processes, we prove that the Gibbs estimator actually achieves fast rates of convergence d/n. We discuss the optimality of these different rates, pointing out references to lower bounds when they are available. In particular, these results bring a generalization of the results of [29] on sparse regression estimation to some autoregression.
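The Gibbs estimator studied in the paper weights each candidate predictor in proportion to exp(−λ × empirical risk), for an inverse-temperature parameter λ. Below is a minimal illustrative sketch of that idea, assuming a finite grid of AR(1) predictors and the quantile (pinball) loss used for the GDP application; the function names, the predictor grid, and the choice of λ are assumptions for illustration, not the authors' implementation.

# Minimal sketch (illustrative, not the paper's code): Gibbs-type aggregation
# of AR(1) predictors under the quantile (pinball) loss.
import numpy as np

def pinball_loss(y, y_hat, tau=0.5):
    """Quantile (pinball) loss: a convex, Lipschitz loss function."""
    u = y - y_hat
    return np.mean(np.maximum(tau * u, (tau - 1) * u))

def gibbs_weights(series, thetas, lam=1.0, tau=0.5):
    """Weight each AR(1) predictor y_hat_t = theta * y_{t-1}
    proportionally to exp(-lam * n * empirical_risk(theta))."""
    y_prev, y = series[:-1], series[1:]
    risks = np.array([pinball_loss(y, th * y_prev, tau) for th in thetas])
    logw = -lam * len(y) * (risks - risks.min())  # shift for numerical stability
    w = np.exp(logw)
    return w / w.sum()

# Toy usage on a simulated AR(1) series (purely illustrative data).
rng = np.random.default_rng(0)
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.6 * x[t - 1] + rng.normal()
thetas = np.linspace(-1.0, 1.0, 51)
w = gibbs_weights(x, thetas, lam=2.0, tau=0.5)
forecast = np.dot(w, thetas) * x[-1]  # aggregated one-step-ahead prediction
print(f"one-step forecast: {forecast:.3f}")

In this sketch λ governs the usual PAC-Bayesian trade-off: larger values concentrate the weights on the empirical risk minimizer, smaller values keep the aggregate closer to the uniform prior over the grid.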

【 License 】

Unknown   
