Mismatched divergence and universal hypothesis testing

An important challenge in detection theory is that the size of the state space may be very large. In the context of universal hypothesis testing, two important problems pertaining to a large state space have not been addressed before: (1) What is the impact of a large state space on the performance of tests? (2) How does one design an effective test when the state space is large? This thesis addresses these two problems by developing a generalization of Kullback-Leibler (KL) divergence, called mismatched divergence. The main contributions, with supporting definitions and a computational sketch given after the list, are:

1. We describe a drawback of the Hoeffding test: its asymptotic bias and variance are approximately proportional to the size of the state space, so it performs poorly when the number of test samples is comparable to the size of the state space.

2. We develop a generalization of the Hoeffding test based on the mismatched divergence, called the mismatched universal test. We show that this test has asymptotic bias and variance proportional to the dimension of the function class used to define the mismatched divergence. Since this dimension can be chosen to be much smaller than the size of the state space, the proposed test has better finite-sample performance in terms of bias and variance.

3. We demonstrate that the mismatched universal test also has an advantage when the distribution of the null hypothesis is learned from data.

4. We develop algebraic properties and geometric interpretations of the mismatched divergence, and we show its connection to a robust test.

5. We develop a generalization of Pinsker's inequality, which gives a lower bound on the mismatched divergence.
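For context, the display below recalls the standard objects behind items 1, 2, and 5. The notation (Z for the state space, Gamma^n for the empirical distribution of n i.i.d. samples, F for the function class) is chosen here for illustration and may differ from the thesis.

```latex
% KL divergence between probability distributions mu and pi on a finite state space Z:
\[
  D(\mu \,\|\, \pi) \;=\; \sum_{z \in \mathsf{Z}} \mu(z) \,\log \frac{\mu(z)}{\pi(z)} .
\]
% Hoeffding test: declare the alternative when the empirical distribution Gamma^n of
% n i.i.d. samples is far from the null distribution pi in KL divergence:
\[
  D(\Gamma^{n} \,\|\, \pi) \;\ge\; \eta .
\]
% Under the null, 2n D(Gamma^n || pi) is asymptotically chi-squared with |Z| - 1
% degrees of freedom, which is the sense in which bias and variance scale with |Z|.
%
% Mismatched divergence: the variational representation of KL divergence with the
% supremum restricted to a function class F of dimension d; taking F to be all
% functions on Z recovers D itself. The mismatched universal test replaces D by
% D^MM in the Hoeffding statistic.
\[
  D^{\mathrm{MM}}(\mu \,\|\, \pi)
  \;=\; \sup_{f \in \mathcal{F}} \Bigl( \mu(f) - \log \pi(e^{f}) \Bigr),
  \qquad \mu(f) := \sum_{z \in \mathsf{Z}} \mu(z) f(z) .
\]
% Classical Pinsker inequality, the bound generalized in item 5:
\[
  D(\mu \,\|\, \pi) \;\ge\; 2\,\| \mu - \pi \|_{\mathrm{TV}}^{2} .
\]
```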
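The sketch below shows one way the mismatched universal test could be computed for a linear function class f_theta = psi @ theta. Everything here (the helper mismatched_divergence, the random basis psi, the uniform null, the threshold eta) is an illustrative assumption, not code or a parameter choice from the thesis; for a linear class the objective is concave in theta, so a generic solver suffices.

```python
import numpy as np
from scipy.optimize import minimize


def mismatched_divergence(mu, pi, psi):
    """sup_theta { mu(f_theta) - log pi(exp f_theta) } for f_theta = psi @ theta.

    mu, pi : length-|Z| probability vectors; psi : |Z| x d basis matrix.
    The objective is concave in theta (linear term minus log-sum-exp),
    so a local solver started at zero finds the supremum.
    """
    def neg_objective(theta):
        f = psi @ theta
        return -(mu @ f - np.log(pi @ np.exp(f)))

    d = psi.shape[1]
    res = minimize(neg_objective, np.zeros(d), method="BFGS")
    return -res.fun


# --- illustrative usage -------------------------------------------------
rng = np.random.default_rng(0)
Z = 50                             # size of the state space
n = 200                            # number of samples, comparable to |Z|
d = 3                              # dimension of the function class, d << |Z|

pi = np.full(Z, 1.0 / Z)           # null hypothesis: uniform (assumption)
psi = rng.standard_normal((Z, d))  # arbitrary basis functions (assumption)

samples = rng.choice(Z, size=n, p=pi)
gamma_n = np.bincount(samples, minlength=Z) / n  # empirical distribution

# Illustrative threshold on the order of d/(2n); in practice eta would be
# set from a target false-alarm rate, e.g. via 2n * stat ~ chi-squared(d).
eta = 4 * d / (2 * n)
stat = mismatched_divergence(gamma_n, pi, psi)
print(f"test statistic = {stat:.4f}, reject null = {stat >= eta}")
```

Running this with n comparable to |Z| illustrates the point of item 2: the null statistic fluctuates on the scale d/(2n) set by the function class, rather than the scale (|Z| - 1)/(2n) of the Hoeffding statistic.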