IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Multisensor Land Cover Classification With Sparsely Annotated Data Based on Convolutional Neural Networks and Self-Distillation
Raffaele Gaetano [1], Stephane Dupuy [1], Dino Ienco [2], Yawogan Jean Eudes Gbodjo [2], Olivier Montet [2]
[1] TETIS Research Unit, French Agricultural Research Centre for International Development, Montpellier, France
[2] TETIS Research Unit, National Research Institute for Agriculture, Food and the Environment, University of Montpellier, Montpellier, France
Keywords: Convolutional neural networks (CNNs); land use and land cover (LULC) mapping; multisensor; multitemporal and multiscale remote sensing; self-distillation; sparsely annotated data
DOI: 10.1109/JSTARS.2021.3119191
Source: DOAJ

Abstract
Extensive research has been conducted in recent years to exploit the complementarity among multisensor (or multimodal) remote sensing data for prominent applications such as land cover mapping. To go a step further than previous studies, which investigate either multitemporal SAR and optical data or multitemporal/multiscale optical combinations, here we propose a deep learning framework that simultaneously integrates all of these input sources, specifically multitemporal SAR/optical data and fine-scale optical information, at their native temporal and spatial resolutions. Our proposal relies on a patch-based multibranch convolutional neural network (CNN) that exploits a dedicated per-source encoder to cope with the specificity of each input signal. In addition, we introduce a new self-distillation strategy to boost the per-source analyses and exploit the interplay among the different input sources. This strategy leverages the final prediction of the multisource framework to guide the learning of the per-source CNN encoders, enabling the network to learn from itself. Experiments are carried out on two real-world benchmarks, namely, the
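The self-distillation idea summarized above can be illustrated in code. Below is a minimal PyTorch sketch, not the authors' implementation: the encoder layout, the auxiliary per-branch classifiers, the names (SourceEncoder, MultiBranchNet, self_distillation_loss), and the alpha weighting are illustrative assumptions; only the core mechanism of using the detached multisource prediction as a teacher for the per-source branches follows the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SourceEncoder(nn.Module):
    """Per-source CNN encoder for one modality (e.g., SAR or optical patches)."""
    def __init__(self, in_channels, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool each patch to a single feature vector
        )

    def forward(self, x):
        return self.conv(x).flatten(1)  # shape: (batch, feat_dim)

class MultiBranchNet(nn.Module):
    """Multibranch network: one encoder per input source, features fused for the
    final prediction; each branch also has an auxiliary head used for distillation."""
    def __init__(self, in_channels_per_source, n_classes, feat_dim=128):
        super().__init__()
        self.encoders = nn.ModuleList(
            SourceEncoder(c, feat_dim) for c in in_channels_per_source)
        self.aux_heads = nn.ModuleList(
            nn.Linear(feat_dim, n_classes) for _ in in_channels_per_source)
        self.fusion_head = nn.Linear(feat_dim * len(in_channels_per_source), n_classes)

    def forward(self, sources):
        feats = [enc(x) for enc, x in zip(self.encoders, sources)]
        aux_logits = [head(f) for head, f in zip(self.aux_heads, feats)]
        fused_logits = self.fusion_head(torch.cat(feats, dim=1))
        return fused_logits, aux_logits

def self_distillation_loss(fused_logits, aux_logits, labels, alpha=0.5):
    """Supervised loss on the fused prediction, plus a distillation term pushing
    each per-source branch toward the (detached) multisource output."""
    loss = F.cross_entropy(fused_logits, labels)
    teacher = F.softmax(fused_logits.detach(), dim=1)  # fused prediction as teacher
    for logits in aux_logits:
        loss = loss + F.cross_entropy(logits, labels)
        loss = loss + alpha * F.kl_div(
            F.log_softmax(logits, dim=1), teacher, reduction="batchmean")
    return loss

# Hypothetical usage with two sources (e.g., 10-band optical and 2-band SAR patches):
# net = MultiBranchNet([10, 2], n_classes=8)
# fused, aux = net([optical_patches, sar_patches])
# loss = self_distillation_loss(fused, aux, labels)
```

Detaching the fused logits before the KL term is a standard choice in self-distillation: the teacher signal flows into the per-source branches without letting the distillation loss degrade the fused classifier itself.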
License: Unknown