陈关州



Training Small Networks for Scene Classification of Remote Sensing Images via Knowledge Distillation

Posted: 2024-10-27


DOI: 10.3390/rs10050719

Journal: Remote Sensing

Keywords: knowledge distillation, scene classification, convolutional neural networks (CNNs), remote sensing, deep learning

Abstract: Scene classification, which aims to identify the land-cover categories of remotely sensed image patches, is a fundamental task in remote sensing image analysis. Deep-learning-based algorithms are widely applied to scene classification and achieve remarkable performance, but these methods are computationally expensive and time-consuming. Consequently, in this paper we introduce knowledge distillation, currently a mainstream model compression method, into remote sensing scene classification to improve the performance of smaller and shallower network models. Our knowledge distillation training method makes the high-temperature softmax output of a small, shallow student model match that of a large, deep teacher model. In our experiments, we evaluate this training method for remote sensing scene classification on four public datasets: AID, UCMerced, NWPU-RESISC, and EuroSAT. Results show that the proposed training method is effective, increasing overall accuracy for small, shallow models by 3% on AID, 5% on UCMerced, and 1% on NWPU-RESISC and EuroSAT. We further explore the performance of the student model on small and unbalanced datasets. Our findings indicate that knowledge distillation can improve the performance of small network models on datasets with lower spatial resolution, numerous categories, and fewer training samples.
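The matching of high-temperature softmax outputs described above can be sketched as a standard distillation loss. The following NumPy sketch is illustrative, not the paper's implementation: the function names, the temperature `T=4.0`, and the soft/hard weighting `alpha=0.7` are assumptions, and the `T**2` rescaling follows common distillation practice.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: a higher T yields a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_label, T=4.0, alpha=0.7):
    # Soft-target term: KL divergence between the teacher's and the
    # student's high-temperature softmax outputs (illustrative values).
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft_loss = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))
    # Hard-target term: ordinary cross-entropy at T = 1 with the true label.
    hard_loss = -np.log(softmax(student_logits)[hard_label])
    # The T**2 factor keeps the soft-target gradients comparable in
    # magnitude as the temperature grows.
    return alpha * (T ** 2) * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits already match the teacher's, the soft-target term vanishes and only the cross-entropy with the hard label remains.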

Co-authors: Xiaoliang Tan, Yufeng Cheng, Fan Dai, Kun Zhu, Yuanfu Gong, Qing Wang

Paper type: Journal article

Corresponding author: Xiaodong Zhang

Document type: J

Volume: 10

Issue: 5

Translation: No

Publication date: 2018-05-07

Indexed in: SCI