Journal Article Details
Electronics
CondNAS: Neural Architecture Search for Conditional CNNs
Gunju Park [1]; Youngmin Yi [1]
[1] Department of Electrical and Computer Engineering, University of Seoul, Seoul 02504, Korea;
Keywords: neural architecture search; conditional CNN; genetic algorithm; performance prediction; deep learning
DOI: 10.3390/electronics11071101
Source: DOAJ
【 Abstract 】

As deep learning has become prevalent across application domains, the need for efficient convolutional neural network (CNN) inference on diverse target platforms has grown. To address this need, a neural architecture search (NAS) technique called once-for-all (OFA) has recently been proposed, which efficiently finds the optimal CNN architecture for a given target platform using a genetic algorithm (GA). Meanwhile, a conditional CNN architecture has been proposed that allows early exits through auxiliary classifiers in the middle of a network, achieving efficient inference with no or negligible accuracy loss. In this paper, we propose CondNAS, a NAS technique for conditional CNN architectures that efficiently finds a near-optimal conditional CNN for the target platform using a GA. By attaching auxiliary classifiers through adaptive pooling, OFA's SuperNet is extended so that it incorporates the various conditional CNN sub-networks. In addition, we devise machine-learning-based prediction models for the accuracy and latency of an arbitrary conditional CNN, which are used in the GA of CondNAS to efficiently explore the large search space. The experimental results show that the conditional CNNs from CondNAS are 2.52× and 1.75× faster than the CNNs from OFA on the Galaxy Note10+ GPU and CPU, respectively.
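To make the early-exit idea concrete, below is a minimal PyTorch sketch, not the paper's implementation, of a conditional CNN: an auxiliary classifier is attached to an intermediate stage through adaptive average pooling, and inference returns early when the auxiliary prediction is confident enough. The module names (ExitBranch, ConditionalCNN), layer sizes, and confidence threshold are all illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the authors' code): one backbone stage
# with an auxiliary classifier attached via adaptive average pooling, so the
# same exit head fits feature maps of any spatial size.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExitBranch(nn.Module):
    """Auxiliary classifier: adaptive average pooling -> linear head."""
    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)   # maps any HxW down to 1x1
        self.fc = nn.Linear(in_channels, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.pool(x).flatten(1))

class ConditionalCNN(nn.Module):
    """Two-stage CNN that may exit early when the auxiliary classifier
    is confident enough (hypothetical threshold; assumes batch size 1
    at inference for per-sample exits)."""
    def __init__(self, num_classes: int = 10, threshold: float = 0.9):
        super().__init__()
        self.stage1 = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2))
        self.exit1 = ExitBranch(32, num_classes)      # early exit
        self.stage2 = nn.Sequential(
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, num_classes))               # final classifier
        self.threshold = threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.stage1(x)
        logits1 = self.exit1(h)
        # Exit early at inference time if the auxiliary prediction is confident.
        if not self.training and F.softmax(logits1, dim=1).max() >= self.threshold:
            return logits1
        return self.stage2(h)
```

Adaptive pooling is what lets the same exit head attach to intermediate feature maps of arbitrary spatial size, which is presumably what allows the auxiliary classifiers to be grafted onto the many differently shaped sub-networks of OFA's SuperNet.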

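The GA-based search with learned performance predictors can be sketched similarly. The following is a generic genetic-algorithm skeleton under assumed surrogate models predict_accuracy and predict_latency, with a hypothetical integer gene encoding and fitness form; it is not CondNAS's actual search code.

```python
# A minimal GA sketch: candidates are scored with learned surrogate models
# instead of being trained or deployed. All names, the encoding, and the
# fitness definition are assumptions for illustration.
import random

def fitness(encoding, predict_accuracy, predict_latency, acc_constraint=0.75):
    # Reject candidates below the accuracy constraint; otherwise prefer
    # lower predicted latency (higher fitness = better).
    if predict_accuracy(encoding) < acc_constraint:
        return -float("inf")
    return -predict_latency(encoding)

def evolve(population, predict_accuracy, predict_latency,
           generations=100, mutate_prob=0.1):
    score = lambda e: fitness(e, predict_accuracy, predict_latency)
    for _ in range(generations):
        population.sort(key=score, reverse=True)
        parents = population[: len(population) // 2]   # elitist selection
        children = []
        while len(children) < len(population) - len(parents):
            p1, p2 = random.sample(parents, 2)
            # Uniform crossover over the architecture genes.
            child = [g1 if random.random() < 0.5 else g2
                     for g1, g2 in zip(p1, p2)]
            # Per-gene mutation; the gene choices (e.g., kernel sizes)
            # are hypothetical.
            child = [random.choice([3, 5, 7]) if random.random() < mutate_prob
                     else g for g in child]
            children.append(child)
        population = parents + children
    return max(population, key=score)
```

Because every fitness evaluation is just two predictor calls rather than an on-device measurement, the GA can explore a large conditional-CNN search space cheaply.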
【 License 】

Unknown
