Contrastive learning–guided multi-meta attention network for breast ultrasound video diagnosis

Xiaoyang Huang, Zhi Lin, Shaohui Huang, Fu Lee Wang, Moon Tong Chan, Liansheng Wang

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Breast cancer is the most common cause of cancer death in women. Early screening and treatment can substantially improve treatment success rates. Ultrasound imaging, as the preferred modality for breast cancer screening, provides an essential reference for early diagnosis. Existing computer-aided ultrasound diagnostic techniques mainly rely on selected key frames for breast cancer lesion diagnosis. In this paper, we first collected and annotated a dataset of ultrasound video sequences covering 268 cases of breast lesions. Moreover, we propose a contrastive learning–guided multi-meta attention network (CLMAN) that combines a deformed feature extraction module and a multi-meta attention module to address breast lesion diagnosis in ultrasound sequences. The proposed feature extraction module autonomously acquires key information of the feature map in the spatial dimension, whereas the designed multi-meta attention module is dedicated to effective information aggregation in the temporal dimension. In addition, we utilize a contrastive learning strategy to alleviate the problem of high imaging variability within ultrasound lesion videos. Experimental results on our collected dataset show that CLMAN significantly outperforms existing advanced video classification methods.
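The abstract does not spell out the contrastive objective used in CLMAN; a common choice for mitigating intra-class appearance variability is an InfoNCE-style loss that pulls paired embeddings (e.g. two clips of the same lesion video) together while pushing apart embeddings from other videos in the batch. The sketch below is a minimal, hypothetical NumPy illustration of that generic loss, not the paper's actual implementation; the function name and temperature value are assumptions.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Illustrative InfoNCE-style contrastive loss (not CLMAN's exact loss).

    z1, z2: (N, D) arrays of paired embeddings; row i of z1 and row i of z2
    are a positive pair, all other rows in the batch act as negatives.
    """
    # L2-normalize so the dot product becomes cosine similarity
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature  # (N, N) similarity matrix

    # Cross-entropy with the positives on the diagonal
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Minimizing this loss encourages embeddings of the same lesion to agree despite imaging variability, which is the role the abstract attributes to the contrastive learning strategy.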

Original language: English
Article number: 952457
Journal: Frontiers in Oncology
Volume: 12
DOIs
Publication status: Published - 24 Oct 2022

Keywords

  • breast lesion
  • contrastive learning
  • multi-meta attention network
  • ultrasound sequence
  • video classification

