Journal article details
Information
Looking Back to Lower-Level Information in Few-Shot Learning
Sebastian Raschka1  Zhongjie Yu1 
[1] Department of Statistics, University of Wisconsin-Madison, Madison, WI 53706, USA;
Keywords: machine learning; deep learning; few-shot learning; meta-learning; graph neural networks; image classification
DOI: 10.3390/info11070345
Source: DOAJ
[Abstract]

Humans are capable of learning new concepts from a small number of examples. In contrast, supervised deep learning models usually lack the ability to extract reliable predictive rules from limited data when attempting to classify new examples. This challenging scenario is commonly known as few-shot learning. Few-shot learning has garnered increased attention in recent years due to its significance for many real-world problems. Recently, new methods relying on meta-learning paradigms combined with graph-based structures, which model the relationships between examples, have shown promising results on a variety of few-shot classification tasks. However, existing work on few-shot learning has focused only on the feature embeddings produced by the last layer of the neural network. The novel contribution of this paper is the utilization of lower-level information to improve meta-learner performance in few-shot learning. In particular, we propose the Looking-Back method, which uses lower-level information to construct additional graphs for label propagation in limited data settings. Our experiments on two popular few-shot learning datasets, miniImageNet and tieredImageNet, show that our method can utilize the lower-level information in the network to improve state-of-the-art classification performance.
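The abstract describes building similarity graphs from both the last-layer and lower-layer feature embeddings and propagating labels from support to query examples. The sketch below is a minimal, hypothetical illustration of that general idea in NumPy; the Gaussian-kernel graph construction, the closed-form propagation F = (I - alpha*S)^(-1) Y, the function names, and the fixed layer-weighting parameter are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def similarity_graph(embeddings, sigma=1.0, k=10):
    """Build a symmetric, normalized similarity graph from one layer's embeddings.

    embeddings: (n, d) array of features (support + query examples).
    Returns the normalized adjacency S = D^{-1/2} W D^{-1/2}.
    """
    # Pairwise squared Euclidean distances -> Gaussian-kernel affinities
    sq = np.sum(embeddings ** 2, axis=1)
    dist = sq[:, None] + sq[None, :] - 2.0 * embeddings @ embeddings.T
    W = np.exp(-dist / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Sparsify: keep only the k strongest neighbors per node
    if k < W.shape[0]:
        weakest = np.argsort(W, axis=1)[:, :-k]
        np.put_along_axis(W, weakest, 0.0, axis=1)
    W = np.maximum(W, W.T)  # symmetrize
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(W.sum(axis=1), 1e-12))
    return (W * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def label_propagation(S, y_support, n_classes, alpha=0.99):
    """Closed-form label propagation: F = (I - alpha * S)^(-1) Y."""
    n = S.shape[0]
    Y = np.zeros((n, n_classes))
    for i, label in enumerate(y_support):   # support rows carry one-hot labels,
        Y[i, label] = 1.0                   # query rows stay all-zero
    return np.linalg.solve(np.eye(n) - alpha * S, Y)

def looking_back_predict(last_feats, lower_feats, y_support, n_classes, weight=0.5):
    """Combine label-propagation scores from a last-layer and a lower-layer graph
    (the simple fixed 'weight' blend is a stand-in for whatever combination the
    paper actually uses)."""
    F_last = label_propagation(similarity_graph(last_feats), y_support, n_classes)
    F_lower = label_propagation(similarity_graph(lower_feats), y_support, n_classes)
    scores = (1.0 - weight) * F_last + weight * F_lower
    return scores.argmax(axis=1)  # predicted class per example (incl. queries)
```

For example, in a 5-way 5-shot episode with 15 queries per class, `last_feats` and `lower_feats` would each be (100, d) arrays stacked as 25 support rows followed by 75 query rows, and `y_support` would hold the 25 support labels; the returned predictions for the query rows are then compared against the query labels to score the episode.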

[License]

Unknown   
