Electronics | Vol. 10
Deep Collaborative Learning for Randomly Wired Neural Networks
Ehab Essa [1], Xianghua Xie [1]
[1] Department of Computer Science, Swansea University, Swansea SA1 8EN, UK
Keywords: randomly wired neural networks; model distillation; ensemble model; deep learning
DOI: 10.3390/electronics10141669
Source: DOAJ
[Abstract]
A deep collaborative learning approach is introduced in which a chain of randomly wired neural networks is trained simultaneously to improve overall generalization and to form a strong ensemble model. The proposed method combines function-preserving transfer learning and knowledge distillation to produce the ensemble. Knowledge distillation is an effective learning scheme for improving the performance of small neural networks by exploiting the knowledge learned by teacher networks. Most previous methods learn from one or more teachers, but not collaboratively. In this paper, we create a chain of randomly wired neural networks generated by a random graph algorithm and train the models collaboratively using function-preserving transfer learning, so that the smaller networks in the chain learn from the largest one simultaneously. The training procedure applies knowledge distillation between the randomly wired models, with each model acting as a teacher to the next model in the chain. The decisions of multiple chains can be combined to produce a robust ensemble model. The proposed method is evaluated on CIFAR-10, CIFAR-100, and TinyImageNet. The experimental results show that collaborative training significantly improves the generalization of each model, yielding a small model that mimics the performance of a large one and a more robust ensemble.
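As a rough illustration of the chained distillation described above, the sketch below computes a collaborative loss over a chain of models in which each model is distilled from its predecessor while all models see the same batch. This is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the name `chain_distillation_loss`, the temperature, the mixing weight `alpha`, and the choice to detach the teacher logits are illustrative choices; the paper's exact loss weighting and the function-preserving weight transfer between chain members are not shown here.

```python
import torch
import torch.nn.functional as F

def chain_distillation_loss(logits_chain, labels, temperature=4.0, alpha=0.5):
    """Collaborative loss for a chain of randomly wired models.

    logits_chain: list of logit tensors ordered from the largest model
    (head of the chain) to the smallest; each model after the first is
    distilled from its predecessor, which acts as its teacher.
    labels: ground-truth class indices for the shared mini-batch.
    """
    total = 0.0
    for i, logits in enumerate(logits_chain):
        ce = F.cross_entropy(logits, labels)  # supervised term for every model
        if i == 0:
            # The largest model has no teacher; it learns from labels only.
            total = total + ce
            continue
        # Detaching makes knowledge flow one way along the chain
        # (an assumption; other collaborative schemes keep the gradient).
        teacher = logits_chain[i - 1].detach()
        kd = F.kl_div(
            F.log_softmax(logits / temperature, dim=1),
            F.softmax(teacher / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2  # standard temperature scaling of the KD term
        total = total + (1 - alpha) * ce + alpha * kd
    return total
```

In a full training loop, every model in the chain would process the same mini-batch and the combined loss would be backpropagated through all models in one step, which is what makes the scheme collaborative rather than sequential teacher-then-student distillation.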
[License]
Unknown