[1] ZENG Qing-peng, CUI Peng. Research of COVID-19 CT Image Segmentation Based on PRAU-Net[J]. Computer Technology and Development, 2024, 34(03): 133-139. [doi:10.3969/j.issn.1673-629X.2024.03.020]

Research of COVID-19 CT Image Segmentation Based on PRAU-Net

Computer Technology and Development (《计算机技术与发展》) [ISSN:1006-6977/CN:61-1281/TN]

Volume:
34
Issue:
2024, No. 03
Pages:
133-139
Section:
Artificial Intelligence
Publication Date:
2024-03-10

Article Info

Title:
Research of COVID-19 CT Image Segmentation Based on PRAU-Net
Article ID:
1673-629X(2024)03-0133-07
Author(s):
ZENG Qing-peng (曾庆鹏), CUI Peng (崔鹏)
School of Mathematics and Computer Sciences,Nanchang University,Nanchang 330031,China
Keywords:
COVID-19; medical image segmentation; U-Net; residual structure; attention mechanism
CLC Number:
TP391.41
DOI:
10.3969/j.issn.1673-629X.2024.03.020
Abstract:
Aiming at the problems of small lesion areas, large variation in shape and structure, and noise in CT images of COVID-19, we propose a parallel residual attention U-Net (PRAU-Net) medical image segmentation method based on an encoder-decoder architecture. Firstly, the model extracts features in the encoder with a Residual Inception Attention (RIA) module; RIA adopts a residual structure that combines deeper parallel convolution blocks with a channel attention mechanism to capture richer features. Secondly, features of different scales are fused by skip connections so that the features in the decoder carry richer global information. Lastly, a global attention module in the decoder makes the network focus on relevant features, which effectively reduces the influence of noise in CT images. To verify the effectiveness of the proposed method, experiments were conducted on three datasets: Segmentation dataset nr. 2, CC-CCII and COVID19_1110. The experimental results show that the proposed method is more accurate than classical methods: compared with classical segmentation methods such as U-Net, the Dice coefficient increases by 1.12% to 14.84% and the sensitivity increases by 0.7% to 24.63%. To further improve segmentation performance, Segmentation dataset nr. 2 was extended with a generative adversarial network, and PRAU-Net together with several classical segmentation networks was evaluated on the extended data. The results show that expanding a small-sample dataset can effectively improve segmentation performance: the Dice coefficient of PRAU-Net rises from 0.8364 to 0.8583.
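The abstract describes RIA as pairing parallel convolution blocks with a channel attention mechanism inside a residual structure. As a rough illustration of the channel-attention step alone, here is a minimal NumPy sketch of squeeze-and-excitation-style gating; the function name, weight shapes and reduction ratio are our assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def channel_attention(x: np.ndarray, w1: np.ndarray, w2: np.ndarray) -> np.ndarray:
    """SE-style channel attention on a single feature map.

    x  : feature map of shape (C, H, W)
    w1 : weights of shape (C, C // r) for the bottleneck (squeeze) layer
    w2 : weights of shape (C // r, C) for the excitation layer
    Returns x with each channel rescaled by a learned gate in (0, 1).
    """
    c = x.shape[0]
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    s = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP with ReLU, then a sigmoid gate per channel
    h = np.maximum(s @ w1, 0.0)
    g = 1.0 / (1.0 + np.exp(-(h @ w2)))
    # Reweight each channel of the input feature map by its gate
    return x * g.reshape(c, 1, 1)
```

In an SE-style design this gating would typically be applied to the output of the convolution branches before the residual addition; the sketch only shows the per-channel reweighting, not the parallel Inception branches or the residual path.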

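The reported gains are stated in terms of the Dice coefficient and sensitivity. Both are standard overlap metrics on binary masks and can be computed as follows (a minimal NumPy sketch; the function names and the smoothing constant are ours, not from the paper):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2*|P & T| / (|P| + |T|) for binary segmentation masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def sensitivity(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Sensitivity (recall) = TP / (TP + FN): fraction of lesion pixels found."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    tp = np.logical_and(pred, target).sum()
    fn = np.logical_and(~pred, target).sum()
    return (tp + eps) / (tp + fn + eps)
```

The small `eps` term keeps both ratios defined when a slice contains no lesion pixels, a common situation in COVID-19 CT volumes where many slices are lesion-free.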

Last Update: 2024-03-10