Breast mass segmentation based on ultrasonic entropy maps and attention gated U-Net

by Michał Byra et al.
Polish Academy of Sciences

We propose a novel deep learning based approach to breast mass segmentation in ultrasound (US) imaging. In contrast to commonly applied segmentation methods, which use US images, our approach is based on quantitative entropy parametric maps. To segment the breast masses we utilized an attention gated U-Net convolutional neural network. US images and entropy maps were generated based on raw US signals collected from 269 breast masses. The segmentation networks were developed separately using US images and entropy maps, and evaluated on a test set of 81 breast masses. The attention U-Net trained on entropy maps achieved an average Dice score of 0.60 (median 0.71), while the model trained on US images obtained an average Dice score of 0.53 (median 0.59). Our work demonstrates the feasibility of using quantitative US parametric maps for breast mass segmentation. The obtained results suggest that US parametric maps, which provide information about local tissue scattering properties, might be more suitable for the development of breast mass segmentation methods than regular US images.





I Introduction

Breast cancer is the most common invasive cancer in women worldwide [2]. Ultrasound (US) scanning has been widely used for breast mass diagnosis in clinics. Breast mass segmentation is an important part of breast computer aided diagnosis (CAD) systems. Automatic and accurate mass segmentation enables extraction of handcrafted features for the differentiation of malignant and benign breast masses [14]. However, breast mass segmentation in US images is considered difficult due to speckle noise and blurred breast mass boundaries. Recently, deep learning methods based on convolutional neural networks (CNNs) have been proposed for breast mass detection and segmentation [16, 15, 7]. These data driven machine learning methods can automatically process US images to determine the segmentation mask.

In US imaging, the appearance of tissues depends on the applied US image reconstruction method. Quantitative US techniques use raw US data (collected before US image reconstruction) to generate parametric maps illustrating local physical properties of tissues [8]. Parametric maps can provide information about tissues that is not present in regular US images [8]. In this work, we investigate the feasibility of developing deep learning segmentation methods based on entropy parametric maps, which visualize information associated with tissue micro-structure [13]. Entropy imaging has been successfully used for breast mass characterization [13]. Since the parametric maps are quantitative, we hypothesize that segmentation based on parametric maps may perform better than segmentation based on US images. To segment the breast masses we develop U-Net CNNs equipped with attention gates.

II Materials and Methods

II-A Dataset

To develop and evaluate deep learning models we used raw US data (collected before US image reconstruction) from 269 breast masses. 123 masses were malignant and 146 masses were benign. Malignant masses were histologically assessed by core needle biopsy; benign masses were assessed either by biopsy or by a two-year observation. This retrospective study was approved by the Institutional Review Board. The data were acquired during routine scanning performed at the Maria Sklodowska-Curie Memorial Cancer Centre and Institute of Oncology in Warsaw. Raw US data were collected using the Ultrasonix SonixTouch Research US scanner equipped with the L14-5/38 transducer operating at a center imaging frequency of 9 MHz. For each mass two perpendicular scans were performed. Each signal frame consisted of 256 scan lines sampled at 40 MHz. US images were reconstructed based on the raw data. First, the Hilbert transform was applied to calculate US signal amplitudes based on the raw US signals. Second, amplitude samples were logarithmically compressed and mapped to 8 bits at a dynamic range of 50 dB. Next, the reconstructed US images were used by an experienced radiologist to outline regions of interest (ROIs) indicating breast mass areas. More information about the imaging protocol can be found in our previous papers [4, 9].
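The image reconstruction steps above (envelope detection via the Hilbert transform, logarithmic compression, and mapping to 8 bits at a 50 dB dynamic range) can be sketched as follows. This is an illustrative numpy/scipy sketch, not the authors' exact implementation:

```python
import numpy as np
from scipy.signal import hilbert


def reconstruct_bmode(rf, dynamic_range_db=50.0):
    """Reconstruct a B-mode image from raw RF data (scan lines in columns).

    Steps follow the paper: Hilbert transform -> envelope detection,
    logarithmic compression, mapping to 8 bits over a 50 dB dynamic range.
    """
    # Envelope detection along the axial (fast-time) axis.
    envelope = np.abs(hilbert(rf, axis=0))
    # Log-compress relative to the maximum amplitude (0 dB at the peak).
    env_db = 20.0 * np.log10(envelope / envelope.max() + 1e-12)
    # Clip to the dynamic range and map linearly to [0, 255].
    env_db = np.clip(env_db, -dynamic_range_db, 0.0)
    img = (env_db + dynamic_range_db) / dynamic_range_db * 255.0
    return img.astype(np.uint8)
```

Scan conversion and any scanner-specific filtering are omitted; the sketch only illustrates the amplitude processing chain described above.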

II-B Entropy imaging

Entropy parametric maps were generated using amplitude samples calculated with the Hilbert transform based on raw US signals. To generate the maps we used the sliding window technique proposed by Tsui et al. [13]. A square window of size equal to 2 wavelengths (100x14 pixels) was used to collect local amplitude samples, estimate the probability density function and calculate the entropy value using the following equation:

H = -\int w(y) \log_2 [w(y)] \, dy

where y refers to the signal amplitude and w(y) is the amplitude probability density function. US images and entropy maps generated for two breast masses are presented in Fig. 2.
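As an illustration, the sliding-window entropy estimation can be sketched as follows. This is a simplified numpy sketch: the local density is approximated with a histogram (so the integral becomes a discrete Shannon entropy), and the window size and bin count are illustrative parameters, not necessarily those used in the paper:

```python
import numpy as np


def entropy_map(envelope, win=(100, 14), n_bins=100):
    """Small-window entropy imaging (after Tsui et al. [13]): slide a window
    over the envelope image and compute the Shannon entropy of the local
    amplitude distribution, estimated here with a histogram."""
    h, w = envelope.shape
    wh, ww = win
    out = np.zeros((h - wh + 1, w - ww + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = envelope[i:i + wh, j:j + ww].ravel()
            # Histogram-based estimate of the amplitude distribution.
            counts, _ = np.histogram(patch, bins=n_bins)
            p = counts / counts.sum()
            p = p[p > 0]  # skip empty bins (0 * log 0 := 0)
            out[i, j] = -np.sum(p * np.log2(p))
    return out
```

In practice the window would be advanced with a stride (and the result interpolated back to the image grid) to keep the computation tractable; the dense loop above is kept for clarity.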

Fig. 1: Pipeline describing the generation of ultrasound images and entropy maps based on raw data collected from breast masses.

II-C Segmentation method

We used an attention gated U-Net CNN to segment breast masses. The scheme of our method is shown in Fig. 3. In comparison to the standard U-Net architecture, the attention U-Net includes attention gates that process feature maps propagated through the skip connections. The aim of the attention gates is to filter the feature maps to improve the network's ability to focus on important regions in the image [10, 5]. Additionally, to improve the performance we applied the following transfer learning technique. The first two convolutional blocks of the attention U-Net CNN were initialized with the weights of the VGG19 network pre-trained on the ImageNet dataset [6, 11]. The VGG19 was originally developed to classify objects from the ImageNet dataset. The first convolutional blocks in a CNN are commonly responsible for the recognition of edges and blobs; thanks to the applied transfer learning technique, our U-Net CNN could therefore recognize low level image features related to local image patterns from the very beginning of the training. To enable transfer learning, US images and entropy maps were resized to the default VGG19 input size of 224x224. The VGG19 CNN was developed for RGB images, but the US images and entropy maps are gray scale. To convert those images to RGB we used the matching layer method [3]. The aim of this layer is to rescale pixel intensities and convert gray scale images to the RGB color space.
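The attention gate computation can be illustrated with a simplified numpy sketch of additive attention in the style of [10]. Batch and resampling details (the gating signal normally comes from a coarser level and is upsampled) are omitted, and the weight shapes are illustrative:

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def attention_gate(x, g, Wx, Wg, psi):
    """Additive attention gate, simplified numpy sketch.

    x:  skip-connection feature map, shape (H, W, Cx)
    g:  gating signal (already resampled to match x), shape (H, W, Cg)
    Wx, Wg: 1x1-convolution weights projecting x and g to an intermediate
            dimension Ci, shapes (Cx, Ci) and (Cg, Ci)
    psi: 1x1-convolution weights producing a scalar attention map, (Ci, 1)
    """
    # A 1x1 convolution is a per-pixel matrix multiply.
    q = np.maximum(x @ Wx + g @ Wg, 0.0)   # additive fusion + ReLU
    alpha = sigmoid(q @ psi)               # attention coefficients in (0, 1)
    return x * alpha                       # rescale the skip features
```

Because the coefficients lie in (0, 1), the gate can only attenuate skip features, which is how it suppresses responses from irrelevant background regions.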

Fig. 2: Ultrasound (US) images and entropy maps generated for sample malignant and benign breast masses, and the corresponding manual segmentations determined based on US images by the radiologist. Entropy parametric maps generated based on raw US signals are commonly less noisy than US images.
Fig. 3: The architecture of the attention U-Net CNN used for the breast mass segmentation. Convolutional blocks extracted from the VGG19 network are indicated with gray color. AL – attention layer, BN – batch normalization, Conv – 2D convolutional block, MP – max pooling operation, Up – up sampling with a 2D transposed convolutional block (kernel size of 2x2, stride of 2x2). Each convolutional block, except for the first and the last block, utilized the rectified linear unit (ReLU) activation function and 3x3 convolutional filters. The first convolutional block utilized 1x1 convolutional filters (matching layer) without an activation function. The sigmoid activation function was applied in the last convolutional block.

II-D Training and evaluation

We developed two separate deep learning models. The first CNN was trained using US images, while the second one was trained using entropy maps. The dataset of 269 breast masses was randomly divided into training, validation and test sets with a 147/41/81 split (55%, 15%, 30%). The ratio of malignant and benign breast masses was approximately the same for each set. The test set contained data from 38 malignant and 43 benign breast masses. We applied data augmentation to improve the training: all US images and entropy maps were horizontally flipped.
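The horizontal-flip augmentation can be sketched as follows (a numpy sketch, assuming images and masks are stacked as (N, H, W) arrays):

```python
import numpy as np


def augment_flip(images, masks):
    """Double the training set by adding horizontally mirrored copies of
    each image together with its correspondingly mirrored mask."""
    flipped_imgs = images[:, :, ::-1]   # mirror along the width axis
    flipped_masks = masks[:, :, ::-1]   # masks must be flipped identically
    return (np.concatenate([images, flipped_imgs], axis=0),
            np.concatenate([masks, flipped_masks], axis=0))
```

Flipping the mask together with the image keeps each augmented pair consistent, which is essential for segmentation training.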

The attention U-Net CNNs were trained to maximize a Dice score based cost function, with the radiologist's ROIs serving as the ground truth. The Dice coefficient is commonly used for the assessment of segmentation performance, therefore the maximization of this score is desirable. Moreover, the Dice score based cost function is a good choice for imbalanced data, i.e. when the objects to be segmented, like the breast masses in our case, vary in size [12].
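A minimal sketch of such a Dice score based cost function is given below (a soft Dice loss with a smoothing term; the exact formulation used for training may differ):

```python
import numpy as np


def dice_loss(pred, target, smooth=1.0):
    """Soft Dice loss: 1 - Dice score.

    pred:   predicted mask probabilities in [0, 1] (e.g. sigmoid outputs)
    target: binary ground-truth mask
    smooth: smoothing term that avoids division by zero and stabilizes
            gradients for empty masks
    """
    intersection = np.sum(pred * target)
    dice = (2.0 * intersection + smooth) / (np.sum(pred) + np.sum(target) + smooth)
    return 1.0 - dice
```

Minimizing this loss maximizes the Dice overlap directly, and because the score is a ratio of overlap to total foreground, it is insensitive to the large, varying background area surrounding masses of different sizes.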

During the training, we monitored the Dice score on the validation set. Each attention U-Net was trained using the back-propagation algorithm with the Adam optimizer. The batch size was equal to 16. The learning rate and the momentum were set to 0.0005 and 0.9, respectively. The learning rate was exponentially decreased every 4 epochs by a drop factor of 0.5 if no improvement was observed on the validation set. The training was stopped after 20 epochs without improvement in the Dice score based cost function on the validation set. After the training was stopped, we selected the better performing model checkpoints with respect to the validation set for the evaluation. The automatic segmentations calculated for the test set were additionally processed using a morphological closing operation with a disk of radius equal to 3 pixels.
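The closing post-processing step can be sketched with scipy (the calculations in the paper used Matlab; the disk structuring element of radius 3 pixels is as described above):

```python
import numpy as np
from scipy import ndimage


def postprocess(mask, radius=3):
    """Morphological closing of a binary segmentation mask with a disk
    structuring element, smoothing boundaries and filling small holes."""
    # Build a disk-shaped structuring element of the given radius.
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    disk = x**2 + y**2 <= radius**2
    # Closing = dilation followed by erosion with the same element.
    return ndimage.binary_closing(mask.astype(bool), structure=disk)
```

Closing removes small gaps and holes in the predicted mask without shrinking the mass outline, which suits the smooth boundaries of the manual ROIs.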

Fig. 4: Segmentation results obtained for the attention U-Net convolutional neural networks developed using ultrasound (US) images and entropy parametric maps. ROI - region of interest.

To evaluate the segmentation performance we calculated the Dice and Jaccard scores using the test set. The Wilcoxon rank sum test at a significance level of 0.05 was applied to determine whether there were statistically significant differences between the obtained Dice scores. Additionally, we investigated whether there was a difference in performance between the segmentation of malignant and benign breast masses. All calculations were done in Matlab (MathWorks, Inc., USA) and in Python. The networks were implemented in Keras with the TensorFlow backend [1]. The experiments were performed on a computer equipped with an NVIDIA Titan RTX graphics card.
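The evaluation metrics can be sketched as follows (a numpy sketch for binary masks; `scipy.stats.ranksums` is one implementation of the Wilcoxon rank sum test that could be applied to the per-mass score samples):

```python
import numpy as np
from scipy import stats


def dice_jaccard(pred, target):
    """Dice and Jaccard scores for a pair of binary segmentation masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    dice = 2.0 * inter / (pred.sum() + target.sum())
    jaccard = inter / union
    return dice, jaccard


# Hypothetical comparison of two per-mass Dice score samples at the 0.05
# significance level (variable names are illustrative):
# stat, p = stats.ranksums(dice_scores_entropy, dice_scores_us)
# significant = p < 0.05
```

Note that the two scores are monotonically related (Dice = 2J / (1 + J)), so they rank the models identically; reporting both simply eases comparison with other studies.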

III Results

Score    Method     Benign             Malignant          Benign and malignant
Dice     US image   0.51 (0.58, 0.29)  0.54 (0.60, 0.24)  0.53 (0.59, 0.27)
Dice     Entropy    0.60 (0.72, 0.31)  0.61 (0.70, 0.26)  0.60 (0.71, 0.29)
Jaccard  US image   0.39 (0.41, 0.25)  0.41 (0.43, 0.21)  0.39 (0.42, 0.23)
Jaccard  Entropy    0.49 (0.56, 0.28)  0.48 (0.54, 0.25)  0.49 (0.54, 0.27)

Table 1: Breast mass segmentation performance achieved by the attention U-Net convolutional neural networks developed using ultrasound (US) images and entropy parametric maps. Values are reported as average score (median, standard deviation), calculated based on the test set of 38 malignant and 43 benign breast masses.

Breast mass segmentation performance achieved by the deep learning models is summarized in Table 1. The attention U-Net developed using entropy maps achieved a significantly higher median Dice test score (0.71) than the network developed using US images (0.59). There were no significant differences in segmentation performance between the benign and malignant breast masses; both deep learning models achieved similar scores in this case. Fig. 4 shows sample automatic segmentations obtained for benign and malignant masses. Boundaries of masses were more visible in the parametric maps, resulting in better automatic segmentations calculated by the model.

IV Discussion

The proposed segmentation method based on the attention U-Net CNN achieved good performance. Our work, for the first time, shows the feasibility of using statistical US parametric maps for breast mass segmentation. Our results suggest that the parametric maps might be more suitable for the development of segmentation networks than regular US images. For instance, boundaries of breast masses may be more visible in the parametric maps than in US images, making the development of the segmentation method easier. In our study, the attention U-Net trained using entropy maps achieved a significantly higher average Dice score (0.60) than the network developed based on US images (0.53).

There are several limitations of our study. First, the ROIs used to train the networks were prepared based on US images. It would be interesting to ask the radiologist to outline the ROIs using entropy maps, and compare the results with the US image based ROIs. Second, the better performance of the network developed using entropy maps could be due to the applied transfer learning technique, which may have favoured the entropy maps. Third, a research US scanner is required to collect the raw US data necessary for the generation of US parametric maps, which limits the applicability of the proposed method.

In the future we would like to investigate the usefulness of other US parametric maps for breast mass segmentation. For instance, it would be interesting to generate attenuation coefficient or Nakagami parameter maps, and use those to train the segmentation CNNs. Moreover, we plan to develop a deep learning model that jointly processes different parametric maps to output segmentation ROIs. It would also be interesting to use raw US data directly to develop the segmentation method.


Acknowledgments

The authors acknowledge grant support from the National Science Center, Poland (2014/13/B/ST7/01271, 2016/23/B/ST8/03391).

Conflict of interest statement

The authors do not have any conflicts of interest.


  • [1] M. Abadi, P. Barham, J. Chen, Z. Chen, A. Davis, J. Dean, M. Devin, S. Ghemawat, G. Irving, M. Isard, et al. (2016) TensorFlow: a system for large-scale machine learning. In 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), pp. 265–283.
  • [2] F. Bray, J. Ferlay, I. Soerjomataram, R. L. Siegel, L. A. Torre, and A. Jemal (2018) Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA: A Cancer Journal for Clinicians 68 (6), pp. 394–424.
  • [3] M. Byra, M. Galperin, H. Ojeda-Fournier, L. Olson, M. O’Boyle, C. Comstock, and M. Andre (2019) Breast mass classification in sonography with transfer learning using a deep convolutional neural network and color conversion. Medical Physics 46 (2), pp. 746–755.
  • [4] M. Byra, A. Nowicki, H. Wróblewska-Piotrzkowska, and K. Dobruch-Sobczak (2016) Classification of breast lesions using segmented quantitative ultrasound maps of homodyned K distribution parameters. Medical Physics 43 (10), pp. 5561–5569.
  • [5] M. Byra, M. Wu, X. Zhang, H. Jang, Y. Ma, E. Y. Chang, S. Shah, and J. Du. Knee menisci segmentation and relaxometry of 3D ultrashort echo time cones MR imaging using attention U-Net with transfer learning. Magnetic Resonance in Medicine.
  • [6] J. Deng, W. Dong, R. Socher, L. Li, K. Li, and L. Fei-Fei (2009) ImageNet: a large-scale hierarchical image database. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2009), pp. 248–255.
  • [7] Y. Hu, Y. Guo, Y. Wang, J. Yu, J. Li, S. Zhou, and C. Chang (2019) Automatic tumor segmentation in breast ultrasound images using a dilated fully convolutional network combined with an active contour model. Medical Physics 46 (1), pp. 215–228.
  • [8] M. L. Oelze and J. Mamou (2016) Review of quantitative ultrasound: envelope statistics and backscatter coefficient imaging and contributions to diagnostic ultrasound. IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control 63 (2), pp. 336–351.
  • [9] H. Piotrzkowska-Wróblewska, K. Dobruch-Sobczak, M. Byra, and A. Nowicki (2017) Open access database of raw ultrasonic signals acquired from malignant and benign breast lesions. Medical Physics 44 (11), pp. 6105–6109.
  • [10] J. Schlemper, O. Oktay, M. Schaap, M. Heinrich, B. Kainz, B. Glocker, and D. Rueckert (2019) Attention gated networks: learning to leverage salient regions in medical images. Medical Image Analysis 53, pp. 197–207.
  • [11] K. Simonyan and A. Zisserman (2014) Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556.
  • [12] C. H. Sudre, W. Li, T. Vercauteren, S. Ourselin, and M. J. Cardoso (2017) Generalised Dice overlap as a deep learning loss function for highly unbalanced segmentations. In Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support, pp. 240–248.
  • [13] P. Tsui, C. Chen, W. Kuo, K. Chang, J. Fang, H. Ma, and D. Chou (2017) Small-window parametric imaging based on information entropy for ultrasound tissue characterization. Scientific Reports 7, pp. 41004.
  • [14] G. Wu, L. Zhou, J. Xu, J. Wang, Q. Wei, Y. Deng, X. Cui, and C. F. Dietrich (2019) Artificial intelligence in breast ultrasound. World Journal of Radiology 11 (2), pp. 19.
  • [15] M. H. Yap, M. Goyal, F. M. Osman, R. Martí, E. Denton, A. Juette, and R. Zwiggelaar (2018) Breast ultrasound lesions recognition: end-to-end deep learning approaches. Journal of Medical Imaging 6 (1), pp. 1–8.
  • [16] M. H. Yap, G. Pons, J. Martí, S. Ganau, M. Sentís, R. Zwiggelaar, A. K. Davison, and R. Martí (2017) Automated breast ultrasound lesions detection using convolutional neural networks. IEEE Journal of Biomedical and Health Informatics.