DI-UMONS: Institutional Repository of the University of Mons

2019-06-29 - Article / In a peer-reviewed journal - English - 12 page(s)

El Adoui Mohammed, Mahmoudi Sidi, Larhmam Mohamed Amine, Benjelloun Mohammed, "MRI Breast Tumor Segmentation Using Different Encoder and Decoder CNN Architectures" in Computers, 8, 3, 52-63, https://doi.org/10.3390/computers8030052

  • Publisher: MDPI AG
  • CREF codes: Imaging techniques and image processing (DI2770), Artificial intelligence (DI1180), Information theory (DI1161), Mathematical computer science (DI1160), Medical imaging, radiology, tomography (DI3243)
  • UMONS research units: Computer Science (F114)
  • UMONS institutes: Institut de Recherche en Technologies de l’Information et Sciences de l’Informatique (InforTech)
Full text:

Abstract:

(English) Breast tumor segmentation in medical images is a decisive step for diagnosis and treatment follow-up. Automating this challenging task helps radiologists reduce the high manual workload of breast cancer analysis. In this paper, we propose two deep learning approaches to automate breast tumor segmentation in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) by building two fully convolutional neural networks (CNNs) based on SegNet and U-Net. The obtained models can handle both detection and segmentation on each single DCE-MRI slice. In this study, we used a dataset of 86 DCE-MRIs of 43 patients with locally advanced breast cancer, acquired before and after two cycles of chemotherapy; a total of 5452 slices were used to train and validate the proposed models. The data were annotated manually by an experienced radiologist. To reduce the training time, a high-performance architecture composed of graphic processing units was used. The model was trained and validated on 85% and 15% of the data, respectively. A mean intersection over union (IoU) of 68.88% was achieved using the SegNet architecture and 76.14% using U-Net.
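
The reported scores are mean intersection over union (IoU) values over segmented DCE-MRI slices. The following is a minimal illustrative sketch of that metric only, not the authors' published code; the function names and the NumPy-based formulation are assumptions for illustration.

    import numpy as np

    def slice_iou(pred_mask, gt_mask, eps=1e-7):
        """IoU between a predicted and a ground-truth binary mask for one slice."""
        pred = pred_mask.astype(bool)
        gt = gt_mask.astype(bool)
        intersection = np.logical_and(pred, gt).sum()
        union = np.logical_or(pred, gt).sum()
        # eps avoids division by zero on slices where both masks are empty
        return (intersection + eps) / (union + eps)

    def mean_iou(pred_masks, gt_masks):
        """Mean IoU over a set of slices (e.g., a held-out validation split)."""
        return float(np.mean([slice_iou(p, g) for p, g in zip(pred_masks, gt_masks)]))

Under this convention, a mean IoU of 76.14% would correspond to mean_iou returning roughly 0.7614 on the validation slices.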