Attention Network with Forced Recall Feature for Image Segmentation
Author:
Affiliation:

Author biography:

Corresponding author:

Funding:

National Key Research and Development Program of China (2018YFB1105600); National Natural Science Foundation of China (61671440)

Ethical statement:



    Abstract:

    To address the severe foreground-background imbalance in medical images and the difficulty of segmenting small objects, we propose an attention network based on a Gaussian image pyramid that fuses spatial information with abstract information in the feature decoding stage. In addition, a feature recaller is designed to force the encoder to avoid missing features of the region of interest. Finally, a hybrid loss function composed of a classification accuracy term and a global region-overlap term is employed to handle the severe foreground-background imbalance. The proposed method was validated on a knee articular cartilage dataset and a COVID-19 chest CT dataset, in which the foreground proportions are 2.08% and 10.73%, respectively. Compared with U-Net and its state-of-the-art variants, the proposed method achieves the highest Dice coefficients on both datasets: 0.884±0.032 and 0.831±0.072, respectively.
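The abstract does not give the exact form of the hybrid loss, but a common way to combine a pixel-wise classification term with a global region-overlap term is binary cross-entropy plus soft Dice. The sketch below illustrates that idea in NumPy; the weighting `alpha` and the BCE/Dice formulation are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def hybrid_loss(pred, target, alpha=0.5, eps=1e-7):
    """Illustrative hybrid loss for imbalanced segmentation.

    Combines a pixel-wise classification term (binary cross-entropy)
    with a global overlap term (1 - soft Dice), so that a tiny
    foreground region still contributes strongly to the loss.

    pred   : predicted foreground probabilities in (0, 1)
    target : binary ground-truth mask, same shape as pred
    alpha  : assumed weight between the two terms
    """
    pred = np.clip(pred, eps, 1.0 - eps)
    # Classification accuracy term: mean binary cross-entropy per pixel.
    bce = -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))
    # Global region-overlap term: one minus the soft Dice coefficient.
    intersection = np.sum(pred * target)
    dice = (2.0 * intersection + eps) / (np.sum(pred) + np.sum(target) + eps)
    return alpha * bce + (1.0 - alpha) * (1.0 - dice)
```

Because the Dice term is computed over the whole mask rather than per pixel, it is insensitive to the background dominating the pixel count, which is why such overlap terms are popular when the foreground occupies only a few percent of the image.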


Citing format
WEI Jianhua, LI Jiaying, HUANG Chengjian, et al. Attention Network with Forced Recall Feature for Image Segmentation[J]. Journal of Integration Technology,2020,9(6):59-70

History
  • Received:
  • Revised:
  • Accepted:
  • Online: 2020-11-24
  • Published: