Dissertation Details
Mismatched divergence and universal hypothesis testing
Huang, Dayu ; Meyn, Sean P.
Keywords: Kullback-Leibler divergence; Hoeffding test; Pinsker's inequality; universal hypothesis testing; robust test; mismatched universal test; mismatched divergence; detection; learning; variance; variational representation
Others: https://www.ideals.illinois.edu/bitstream/handle/2142/14758/Huang_Dayu.pdf?sequence=1&isAllowed=y
United States | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
【 Abstract 】

An important challenge in detection theory is that the size of the state space may be very large. In the context of universal hypothesis testing, two important problems pertaining to a large state space have not been addressed before: (1) What is the impact of a large state space on the performance of tests? (2) How does one design an effective test when the state space is large? This thesis addresses these two problems by developing a generalization of the Kullback-Leibler (KL) divergence, called the mismatched divergence.

1. We describe a drawback of the Hoeffding test: its asymptotic bias and variance are approximately proportional to the size of the state space, so it performs poorly when the number of test samples is comparable to the size of the state space.
2. We develop a generalization of the Hoeffding test based on the mismatched divergence, called the mismatched universal test. We show that this test has asymptotic bias and variance proportional to the dimension of the function class used to define the mismatched divergence. Since this dimension can be chosen to be much smaller than the size of the state space, the proposed test has better finite-sample performance in terms of bias and variance.
3. We demonstrate that the mismatched universal test also has an advantage when the distribution of the null hypothesis is learned from data.
4. We develop algebraic properties and geometric interpretations of the mismatched divergence, and we show its connection to a robust test.
5. We develop a generalization of Pinsker's inequality, which gives a lower bound on the mismatched divergence.
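
The mismatched divergence is built from the variational representation of KL divergence listed among the keywords. As a hedged sketch of the definitions (the thesis's notation may differ): for distributions \(\mu\), \(\pi\) and a function class \(\mathcal{F}\),

\[
D(\mu \,\|\, \pi) = \sup_{f} \big\{ \mu(f) - \log \pi(e^{f}) \big\},
\qquad
D^{\mathrm{MM}}(\mu \,\|\, \pi) = \sup_{f \in \mathcal{F}} \big\{ \mu(f) - \log \pi(e^{f}) \big\},
\]

where the first supremum ranges over all functions (the Donsker-Varadhan formula) and the second is restricted to \(\mathcal{F}\), so \(D^{\mathrm{MM}} \le D\), with equality when \(\mathcal{F}\) contains the log-likelihood ratio. The Python sketch below illustrates the two resulting test statistics on a finite state space: the Hoeffding statistic computed from the empirical distribution of n samples, and a mismatched statistic maximized over a d-dimensional linear function class with d much smaller than N. The uniform null, the polynomial basis, and all sizes here are invented for illustration; this is not code from the thesis.

```python
# Illustrative sketch (not code from the thesis): contrasts the Hoeffding
# test statistic D(Gamma^n || pi) with a mismatched-divergence statistic
# sup_r { Gamma^n(f_r) - log pi(exp f_r) } over f_r = sum_i r_i * psi_i.
# The null pi, basis psi, and sizes N, n, d are made-up examples.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

N = 50                                   # size of the state space
pi = np.full(N, 1.0 / N)                 # null hypothesis: uniform distribution
n = 100                                  # sample size, comparable to N
samples = rng.choice(N, size=n, p=pi)
gamma = np.bincount(samples, minlength=N) / n   # empirical distribution Gamma^n

def kl(mu, pi):
    """KL divergence D(mu || pi); this is the Hoeffding test statistic."""
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / pi[mask])))

# Function class for the mismatched divergence: span of d basis functions,
# d << N; here low-order polynomials of the (normalized) state index.
d = 3
x = np.arange(N) / N
psi = np.stack([x, x**2, x**3])          # shape (d, N)

def neg_dual(r):
    """Negative of mu(f_r) - log pi(exp f_r) for f_r = r @ psi."""
    f = r @ psi
    return -(gamma @ f - np.log(pi @ np.exp(f)))

res = minimize(neg_dual, np.zeros(d), method="BFGS")
d_mm = -res.fun                          # mismatched divergence D_MM(Gamma^n || pi)

print(f"Hoeffding statistic   D(Gamma^n || pi)    = {kl(gamma, pi):.4f}")
print(f"Mismatched statistic  D_MM(Gamma^n || pi) = {d_mm:.4f}")
# Under the null, the Hoeffding statistic's bias is roughly (N - 1) / (2n),
# while the mismatched statistic's bias scales like d / (2n) -- much smaller
# when d << N, consistent with points 1 and 2 of the abstract.
```

Running this under the null typically shows the Hoeffding statistic inflated to roughly (N - 1)/(2n) while the mismatched statistic stays near d/(2n), which is the finite-sample bias gap the abstract describes.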

【 Preview 】
Attachment List
Files                                                     Size    Format
Mismatched divergence and universal hypothesis testing    319KB   PDF
Metrics
Downloads: 11    Views: 19