A generic deep learning framework to classify thyroid and breast lesions in ultrasound images

Journal article


Zhu, Y.C., AlZoubi, A., Jassim, S., Jiang, Q., Zhang, Y., Wang, Y.B., Ye, X.D. and Du, H. 2021. A generic deep learning framework to classify thyroid and breast lesions in ultrasound images. Ultrasonics. 110, pp. 1-8. https://doi.org/10.1016/j.ultras.2020.106300
Authors: Zhu, Y.C., AlZoubi, A., Jassim, S., Jiang, Q., Zhang, Y., Wang, Y.B., Ye, X.D. and Du, H.
Abstract

Breast and thyroid cancers are two of the most common cancers affecting women worldwide. Ultrasonography (US) is a widely used non-invasive imaging modality for detecting breast and thyroid cancers, but its clinical diagnostic accuracy for these cancers remains controversial. Thyroid and breast cancers share several similar high-frequency ultrasound characteristics, such as a taller-than-wide shape ratio, hypo-echogenicity, and ill-defined margins. This study aims to develop an automatic scheme for classifying thyroid and breast lesions in ultrasound images using deep convolutional neural networks (DCNN). In particular, we propose a generic DCNN architecture with transfer learning and the same architectural parameter settings to train models for thyroid and breast cancers (TNet and BNet) respectively, and test the viability of such a generic approach on ultrasound images collected from clinical practice. In addition, we investigate the potential of the thyroid model to learn features common to both cancers, and its performance in classifying both breast and thyroid lesions. A retrospective dataset of 719 thyroid and 672 breast images, captured on US machines of different makes between October 2016 and December 2018, is used in this study. Test results show that both TNet and BNet, built on the same DCNN architecture, achieve good classification results (86.5% average accuracy for TNet and 89% for BNet). Furthermore, when TNet is used to classify breast lesions, it achieves a sensitivity of 86.6% and a specificity of 87.1%, indicating its capability to learn features commonly shared by thyroid and breast lesions. We further tested the diagnostic performance of the TNet model against that of three radiologists. The area under the curve (AUC) for thyroid nodule classification is 0.861 (95% CI: 0.792–0.929) for the TNet model and 0.757–0.854 (95% CI: 0.658–0.934) for the three radiologists. The AUC for breast cancer classification is 0.875 (95% CI: 0.804–0.947) for the TNet model and 0.698–0.777 (95% CI: 0.593–0.872) for the radiologists, indicating the model's potential to classify both breast and thyroid cancers with a higher level of accuracy than that of radiologists.
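The abstract compares the model and the radiologists by the area under the ROC curve (AUC). As a minimal illustration of what this metric measures, the sketch below computes AUC via its rank-statistic (Mann–Whitney U) interpretation: the probability that a randomly chosen malignant case receives a higher score than a randomly chosen benign case. The labels and scores are hypothetical toy data, not values from the paper.

```python
def auc(labels, scores):
    """AUC as the probability that a random positive outscores a
    random negative (ties count as half a win)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative case")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = malignant, 0 = benign; scores are model outputs.
print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

An AUC of 0.5 corresponds to random guessing and 1.0 to perfect separation, which is why the reported 0.861–0.875 for TNet versus 0.698–0.854 for the radiologists supports the paper's conclusion.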

Keywords: Thyroid Cancer; Breast Cancer; Ultrasonography; Cancer Recognition; Deep Convolutional Neural Network
Year: 2021
Journal: Ultrasonics
Journal citation: 110, pp. 1-8
Publisher: Elsevier
ISSN: 0041-624X
Digital Object Identifier (DOI): https://doi.org/10.1016/j.ultras.2020.106300
Web address (URL): https://bear.buckingham.ac.uk/495/
https://www.sciencedirect.com/science/article/pii/S0041624X20302377
Output status: Published
Publication dates
Online: 12 Nov 2020
Publication process dates
Accepted: 05 Nov 2020
Deposited: 25 Jul 2024
Permalink:

https://repository.derby.ac.uk/item/q77yw/a-generic-deep-learning-framework-to-classify-thyroid-and-breast-lesions-in-ultrasound-images


Related outputs

Explainable DCNN Decision Framework for Breast Lesion Classification from Ultrasound Images Based on Cancer Characteristics
AlZoubi, A., Eskandari, A., Yu, H. and Du, H. 2024. Explainable DCNN Decision Framework for Breast Lesion Classification from Ultrasound Images Based on Cancer Characteristics. Bioengineering. 11 (5), pp. 1-23. https://doi.org/10.3390/bioengineering11050453
Automatic Bi-LSTM Architecture Search Using Bayesian Optimisation for Vehicle Activity Recognition
AlZoubi, A. and Radhakrishnan, R. 2023. Automatic Bi-LSTM Architecture Search Using Bayesian Optimisation for Vehicle Activity Recognition. in: A. Augusto de Sousa, Kurt Debattista, Alexis Paljic, Mounia Ziat, Christophe Hurter, Helen Purchase, Giovanni Maria Farinella, Petia Radeva and Kadi Bouatouch (ed.) Computer Vision, Imaging and Computer Graphics Theory and Applications. New York: Springer. pp. 108–134
ENAS-B: Combining ENAS with Bayesian Optimisation for Automatic Design of Optimal CNN Architectures for Breast Lesion Classification from Ultrasound Images
Ahmed, M., Du, H. and AlZoubi, A. 2023. ENAS-B: Combining ENAS with Bayesian Optimisation for Automatic Design of Optimal CNN Architectures for Breast Lesion Classification from Ultrasound Images. Ultrasonic Imaging. https://doi.org/10.1177/01617346231208709
Automatic Detection of Thyroid Nodule Characteristics From 2D Ultrasound Images
Han, D., Ibrahim, N., Lu, F., Zhu, Y., Du, H. and AlZoubi, A. 2023. Automatic Detection of Thyroid Nodule Characteristics From 2D Ultrasound Images. Ultrasonic Imaging. pp. 1-18. https://doi.org/10.1177/01617346231200804
Classification of breast lesions in ultrasound images using deep convolutional neural networks: transfer learning versus automatic architecture design
AlZoubi, A., Lu, F., Zhu, Y., Ying, T., Ahmed, M. and Du, H. 2023. Classification of breast lesions in ultrasound images using deep convolutional neural networks: transfer learning versus automatic architecture design. Medical & Biological Engineering & Computing. pp. 1-15. https://doi.org/10.1007/s11517-023-02922-y
Machine Learning Assisted Doppler Features for Enhancing Thyroid Cancer Diagnosis
Zhu, Y., Du, H., Jiang, Q., Zhang, T., Huang, X., Zhang, Y., Shi, X., Shan, J. and AlZoubi, A. 2021. Machine Learning Assisted Doppler Features for Enhancing Thyroid Cancer Diagnosis. Journal of Ultrasound in Medicine. 41 (8), pp. 1961-1974. https://doi.org/10.1002/jum.15873