Thesis Details
Adapting Component Analysis
Dorri, Fatemeh
University of Waterloo
Keywords: Domain adaptation; Kernel embedding; Hilbert-Schmidt Independence Criteria; Dimension reduction; Computer Science
Others: https://uwspace.uwaterloo.ca/bitstream/10012/6738/1/Dorri_Fatemeh.pdf
Canada | English
Source: UWSPACE Waterloo Institutional Repository
【 Abstract 】

A central problem in machine learning is to predict the response variables of a test set given the training data and its corresponding response variables. A predictive model can perform satisfactorily only if the training data is an appropriate representative of the test data. This intuition is reflected in the assumption that the training data and the test data are drawn from the same underlying distribution. However, this assumption may not hold in many applications, for various reasons. For example, gathering training data from the test population might not be feasible because it is expensive or rare. Alternatively, factors such as time, place, and weather can cause the two distributions to differ.

I propose a method based on kernel distribution embedding and the Hilbert-Schmidt Independence Criterion (HSIC) to address this problem. The proposed method finds a new representation of the data in a new feature space with two properties: (i) the distributions of the training and the test data sets are as close as possible in the new feature space, and (ii) the important structural information of the data is preserved. The algorithm can reduce the dimensionality of the data while preserving these properties, and can therefore also be seen as a dimensionality reduction method. The method has a closed-form solution, and experimental results on various data sets show that it works well in practice.
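The abstract's two objectives — matching the training and test distributions while preserving the data's structure — can be illustrated numerically. Below is a minimal, hypothetical Python sketch, not the thesis's exact formulation: a biased empirical HSIC estimator, plus a linear projection obtained in closed form via an eigenproblem that keeps the pooled variance large while shrinking the gap between the projected source and target means. All function names and the specific trace objective are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and the rows of Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(K, L):
    """Biased empirical HSIC: trace(K H L H) / (n - 1)^2, H = centering matrix."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def adapt_projection(Xs, Xt, k=2, mu=1.0):
    """Closed-form linear projection W (d x k): directions that retain the
    pooled scatter while keeping the projected source/target means close.
    (Hypothetical instantiation, solved as an eigenproblem.)"""
    ns, nt = len(Xs), len(Xt)
    X = np.vstack([Xs, Xt])                  # pooled data, shape (ns + nt, d)
    n, d = X.shape
    # X.T @ e is the gap between the source and target sample means, so
    # trace(W.T @ X.T @ M @ X @ W) is the squared projected mean gap.
    e = np.concatenate([np.full(ns, 1.0 / ns), np.full(nt, -1.0 / nt)])
    M = np.outer(e, e)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    A = X.T @ M @ X + mu * np.eye(d)         # mean-gap term + regularizer
    B = X.T @ H @ X                          # pooled scatter (structure)
    # Maximize trace(W.T B W) relative to trace(W.T A W):
    # take the top-k eigenvectors of A^{-1} B.
    vals, vecs = np.linalg.eig(np.linalg.solve(A, B))
    order = np.argsort(-vals.real)
    return vecs[:, order[:k]].real
```

With a synthetic mean-shifted target set, the projected source and target means land much closer together than the raw ones, and since `k` can be smaller than the input dimension, the same eigendecomposition doubles as dimensionality reduction — consistent with the abstract's reading of the method.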

【 Preview 】
Attachment List
File | Size | Format | View
Adapting Component Analysis | 710KB | PDF | download
Document metrics
Downloads: 23    Views: 20