Journal Article Details
PeerJ
Comparative study of convolutional neural network architectures for gastrointestinal lesions classification
article
Erik O. Cuevas-Rodriguez [1], Carlos E. Galvan-Tejada [1], Valeria Maeda-Gutiérrez [1], Gamaliel Moreno-Chávez [1], Jorge I. Galván-Tejada [1], Hamurabi Gamboa-Rosales [1], Huizilopoztli Luna-García [1], Arturo Moreno-Baez [1], José María Celaya-Padilla [1]
[1] Unidad Académica de Ingeniería Eléctrica, Universidad Autónoma de Zacatecas
Keywords: Convolutional neural network; Gastrointestinal lesions; Classification; Deep learning; Endoscopy; Gastrointestinal; Computer-aided diagnostic
DOI: 10.7717/peerj.14806
Subject Category: Social Sciences, Humanities and Arts (General)
Source: Inra
【 Abstract 】

The gastrointestinal (GI) tract can be affected by various diseases or lesions, such as esophagitis, ulcers, hemorrhoids, and polyps, among others. Some of these, such as polyps, can be precursors of cancer. Endoscopy is the standard procedure for detecting these lesions. The main drawback of this procedure is that the diagnosis depends on the expertise of the doctor, which means that important findings may be missed. In recent years, this problem has been addressed with deep learning (DL) techniques. Endoscopic studies use digital images, and the most widely used DL technique for image processing is the convolutional neural network (CNN), owing to its high accuracy in modeling complex phenomena. Different CNNs are characterized by their architectures. In this article, four architectures are compared: AlexNet, DenseNet-201, Inception-v3, and ResNet-101. To determine which architecture best classifies GI tract lesions, a set of metrics was used: accuracy, precision, sensitivity, specificity, F1-score, and area under the curve (AUC). The architectures were trained and tested on the HyperKvasir dataset, from which a total of 6,792 images corresponding to 10 findings were used. A transfer learning approach and a data augmentation technique were applied. The best-performing architecture was DenseNet-201, which achieved 97.11% accuracy, 96.3% sensitivity, 99.67% specificity, and 95% AUC.
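For reference, the evaluation metrics named in the abstract (accuracy, precision, sensitivity, specificity, F1-score) follow directly from confusion-matrix counts. The sketch below shows the standard binary formulas; the function name and counts are illustrative, not taken from the article.

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)        # positive predictive value
    sensitivity = tp / (tp + fn)      # recall / true-positive rate
    specificity = tn / (tn + fp)      # true-negative rate
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "accuracy": accuracy,
        "precision": precision,
        "sensitivity": sensitivity,
        "specificity": specificity,
        "f1": f1,
    }

# Illustrative counts only -- not results from the article.
m = classification_metrics(tp=90, fp=10, tn=80, fn=20)
```

For a multi-class problem such as the 10 HyperKvasir findings, these per-class values are typically averaged (e.g. macro-averaged) across classes.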

【 License 】

CC BY   

【 Preview 】
Attachment List
File	Size	Format	View
RO202307100002432ZK.pdf	14,281 KB	PDF	download
Document Metrics
Downloads: 11    Views: 4