Dissertation Details
New results in detection, estimation, and model selection
Author: Ni, Xuelei
University: Georgia Institute of Technology
Department: Industrial and Systems Engineering
Keywords: Detectability; Convex sets; Variable selection; Least angle regressions; Leaps and bounds; M-estimator
Others: https://smartech.gatech.edu/bitstream/1853/10419/1/ni_xuelei_200605_phd.pdf
USA | English
Source: SMARTech Repository
【 Abstract 】

This thesis consists of two parts: the detectability of convex sets, and studies on regression models.

In the first part of the dissertation, we investigate the detectability of an inhomogeneous convex region in a Gaussian random field. The first proposed detection method relies on checking a constructed statistic on every convex set within an n × n image, which is shown to be inapplicable in practice. We therefore use h(v)-parallelograms as a surrogate, which leads to a multiscale strategy. We prove that 2/9 is the minimum proportion of a convex set occupied by its maximally embedded h(v)-parallelogram; this constant underpins the effectiveness of the multiscale detection method.

In the second part, we study robustness, optimality, and computation for regression models. First, for robustness, we analyze M-estimators in a regression model whose residuals follow an unknown but stochastically bounded distribution, and derive an asymptotically minimax M-estimator (RSBN); simulations demonstrate its robustness and advantages. Second, for optimality, the analysis of least angle regression leads us to consider the conditions under which a vector solves two optimization problems simultaneously: one that can be solved by certain stepwise algorithms, and one whose objective function underlies many existing subset selection criteria (including Cp, AIC, BIC, MDL, and RIC) and is proven to be NP-hard. Several conditions are derived that tell us when a vector is the common optimizer. Finally, extending this idea of finding conditions to exhaustive subset selection in regression, we improve the widely used leaps-and-bounds algorithm of Furnival and Wilson. The proposed method further reduces the number of subsets that must be examined in an exhaustive subset search by exploiting not only the residuals but also the model matrix and the current coefficients.
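
As an illustration of the multiscale strategy, the following minimal Python sketch scans axis-aligned rectangles at dyadic scales over an n × n Gaussian-noise image and reports the largest standardized block sum. This is an assumption for illustration only, not the thesis's h(v)-parallelogram construction; the function multiscale_rectangle_scan and its parameters are hypothetical.

import numpy as np

def multiscale_rectangle_scan(image, min_side=4):
    """Return (best_score, (row, col, height, width)) over dyadic rectangles."""
    n = image.shape[0]
    # Summed-area table so each rectangle sum costs O(1).
    sat = np.zeros((n + 1, n + 1))
    sat[1:, 1:] = np.cumsum(np.cumsum(image, axis=0), axis=1)

    sides, side = [], min_side
    while side <= n:                       # dyadic scales keep the search multiscale
        sides.append(side)
        side *= 2

    best_score, best_rect = -np.inf, None
    for h in sides:
        for w in sides:
            for i in range(0, n - h + 1, max(1, h // 2)):      # half-overlapping grid
                for j in range(0, n - w + 1, max(1, w // 2)):
                    s = (sat[i + h, j + w] - sat[i, j + w]
                         - sat[i + h, j] + sat[i, j])
                    score = s / np.sqrt(h * w)   # standardized sum under N(0,1) noise
                    if score > best_score:
                        best_score, best_rect = score, (i, j, h, w)
    return best_score, best_rect

# Usage: pure-noise image versus an image containing an elevated-mean block.
rng = np.random.default_rng(0)
noise = rng.standard_normal((64, 64))
signal = noise.copy()
signal[20:36, 24:40] += 0.5               # inhomogeneous region with a shifted mean
print(multiscale_rectangle_scan(noise)[0], multiscale_rectangle_scan(signal)[0])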

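The pruning principle behind leaps and bounds can likewise be sketched: since adding variables never increases the residual sum of squares, a branch whose most generous completion already has a larger RSS than the incumbent subset cannot contain the optimum and can be discarded. The minimal Python sketch below implements only this classical Furnival-Wilson style bound, not the refined rule (based on the model matrix and current coefficients) proposed in the thesis; the helpers rss and best_subset_bb and the synthetic data are hypothetical.

import numpy as np

def rss(X, y, cols):
    """Residual sum of squares of the least-squares fit on the given columns."""
    if not cols:
        return float(y @ y)
    beta = np.linalg.lstsq(X[:, cols], y, rcond=None)[0]
    r = y - X[:, cols] @ beta
    return float(r @ r)

def best_subset_bb(X, y, k):
    """Exhaustive size-k subset search with branch-and-bound pruning."""
    p = X.shape[1]
    best = {"rss": np.inf, "cols": None}

    def recurse(chosen, next_var):
        if len(chosen) == k:
            r = rss(X, y, chosen)
            if r < best["rss"]:
                best["rss"], best["cols"] = r, tuple(chosen)
            return
        if len(chosen) + (p - next_var) < k:
            return                                   # too few variables remain
        # Bound: the RSS of the fullest completion of this branch is a lower
        # bound on the RSS of every subset the branch can still produce.
        if rss(X, y, chosen + list(range(next_var, p))) >= best["rss"]:
            return                                   # prune the whole branch
        recurse(chosen + [next_var], next_var + 1)   # include next_var
        recurse(chosen, next_var + 1)                # exclude next_var

    recurse([], 0)
    return best["cols"], best["rss"]

# Usage on synthetic data: 8 predictors, 3 of them truly active.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 8))
y = X[:, [0, 3, 5]] @ np.array([2.0, -1.5, 1.0]) + rng.standard_normal(100)
print(best_subset_bb(X, y, k=3))
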
【 Preview 】
Attachments
File: New results in detection, estimation, and model selection
Size: 1198 KB
Format: PDF
View: download
Document Metrics
Downloads: 9    Views: 16