Deep ultrasound: how artificial intelligence could impact sonography

4 October, 2019

Author: Jackie Matthew, Research Sonographer, Intelligent Fetal Imaging and Diagnosis (iFIND) Project, King’s College London

Abstract

Advances in artificial intelligence (AI) will change all our roles. Significantly. And sooner than you might think: trials using AI-enabled software are already underway in the field of mammography.1

If we were to believe all the hype regarding advances in AI within medical imaging, some of us would be running for the hills and planning a career change. Of course, I don’t believe this drastic action is necessary. However, I do believe that there is a need for all of us to be fully aware of impending developments as a minimum, and to be proactive leaders in the development, validation and translation of this emerging technology into clinical practice.

The UK government is investing heavily in AI-driven technology and innovation2, positioning the UK as a future world leader in this field. In the healthcare sector, most of the discussion about AI in radiology has been from a radiologist's perspective, yet it is very possible that the initial impact will be felt just as significantly by radiographers and sonographers.

Potential uses of AI tools in the ultrasound imaging pipeline include:

  • Patient scheduling and preparation: prioritising worklists and identifying high-risk patients using relevant priors and clinical information.

  • Standardising imaging protocols and acquisition: pre-processing tasks such as reconstruction, registration and segmentation; quality optimisation; image navigation; and multi-transducer technologies, eg novel 3D ultrasound (3DUS) imaging methodologies.

  • Image and data interpretation: real-time biometrics and anomaly detection, offline 3D applications, and new computer interface designs for AI application plug-ins.

  • Reporting and recommendations: automated structured reporting versus narrative reporting, and decision support for follow-up recommendations where abnormalities are found.

I highlight this issue from my own sphere of reference because, as a research sonographer working within a broad clinical and scientific team that aims to improve the second-trimester fetal screening examination, I have seen first-hand some eyebrow-raising deep learning research outputs. Some of these are poised to 'disrupt' the way we screen for fetal anomalies. I use the term 'disrupt' in the technological innovation sense: we could see a fundamental shift in the way we deliver this pregnancy screening examination, a scan that is highly valued by patients but which currently has variable detection rates for some serious anomalies.

There are now tools that can automatically find the required Fetal Anomaly Screening Programme3 standard ultrasound views and then measure them accurately with minimal sonographer interaction4,5. I have seen variations in fetal skull shape quickly assessed as a 3D model and overlaid with statistical shape modelling heatmaps6 (see figure 1), potentially applicable to detecting craniosynostosis (premature fusion of the fetal skull sutures, a rare condition often poorly detected both antenatally and postnatally7).
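The idea behind a shape modelling heatmap can be sketched very simply: given a population of skull surfaces represented as corresponding landmark points, colour each point of a new case by how far it sits from the population mean relative to the normal variation at that point. This is a deliberately simplified illustration of the concept, not the method used in the research cited above:

```python
import numpy as np

def shape_heatmap(training_shapes, case_shape):
    """Per-landmark deviation scores for one case against a population.

    training_shapes: (N, P, 3) array -- N example surfaces, each with P
    corresponding 3D landmark points. case_shape: (P, 3) array for the
    case under review. Returns a (P,) array of z-score-like values
    (distance from the mean shape, scaled by the local point-wise
    spread), ready to be mapped onto colours on a 3D model.
    """
    mean_shape = training_shapes.mean(axis=0)                     # (P, 3)
    dists = np.linalg.norm(training_shapes - mean_shape, axis=2)  # (N, P)
    spread = dists.std(axis=0) + 1e-9                             # avoid /0
    case_dist = np.linalg.norm(case_shape - mean_shape, axis=1)   # (P,)
    return case_dist / spread
```

On synthetic data, a skull narrowed along one axis 'lights up' at the landmarks displaced furthest from the population mean, which is essentially what the heatmaps in figure 1 convey for dolichocephalic cases.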

Having used these tools in validation studies, I know that their use could slash examination times, potentially by half, and change our screening methodology to include new biometrics.
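As an illustration of what an automated biometric involves, the snippet below estimates a head circumference from a binary segmentation of the skull outline: the mask's second-order moments give the semi-axes of an equivalent ellipse, and Ramanujan's approximation gives its perimeter. This is a simplified sketch of the general idea, not the pipeline described in the cited papers:

```python
import numpy as np

def ellipse_axes_from_mask(mask):
    """Semi-axes (a >= b) of the equivalent ellipse of a solid binary mask.

    For a uniformly filled ellipse, the variance of pixel coordinates
    along a principal axis equals (semi-axis)**2 / 4, so the axes can be
    recovered from the eigenvalues of the coordinate covariance matrix.
    """
    ys, xs = np.nonzero(mask)
    coords = np.stack([xs - xs.mean(), ys - ys.mean()])
    evals = np.linalg.eigvalsh(np.cov(coords))  # ascending order
    b, a = 2.0 * np.sqrt(evals)
    return a, b

def head_circumference(a, b):
    """Ramanujan's first approximation to the perimeter of an ellipse."""
    return np.pi * (3.0 * (a + b) - np.sqrt((3.0 * a + b) * (a + 3.0 * b)))
```

Given a reliable automated segmentation, a measurement like this takes milliseconds and is perfectly repeatable, which is where much of the potential time saving comes from; the hard part, and the part the cited research addresses, is producing that segmentation robustly from freehand ultrasound.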

Machine learning of this kind could be applied to such a range of problems across our field that we urgently need to be contributing to the conversation. This will require collaboration with colleagues from computer science, biomedical engineering, physics, clinical and service-user backgrounds, enabling us to steer attention towards the problems where solutions will have the most significant impact on patient outcomes and clinical services.

Being front and centre of these exciting developments will further cement our value as healthcare professionals, ensuring that this rapidly evolving technology is evidence-based and validated in real-world clinical settings, with the focus firmly on improving the experience of our patients.

Figure 1: 3D fetal skull with statistical shape modelling. Heatmaps of the 3D fetal cranium for cases 1 and 2, with severe and moderate dolichocephaly (ie a narrow head shape) respectively.





References

  1. TopTalk, The Society of Radiographers (2019). New AI tools for breast screening services. Issue 127. SCoR. Accessed 20 September 2019.
  2. A. Mari (2019). UK leads in government AI readiness. Accessed 20 September 2019.
  3. Public Health England (2018). Fetal Anomaly Screening Programme Standards 2015-16. PHE.
  4. C. F. Baumgartner, K. Kamnitsas, J. Matthew, T. P. Fletcher, S. Smith, L. M. Koch, B. Kainz and D. Rueckert (2017). SonoNet: real-time detection and localisation of fetal standard scan planes in freehand ultrasound. IEEE Transactions on Medical Imaging.
  5. M. Sinclair, C. Baumgartner, J. Matthew, W. Bai, J. J. Cerrolaza, Y. Li, S. Smith, C. Knight, B. Kainz, J. Hajnal, A. King and D. Rueckert (2018). Human-level performance on automatic head biometrics in fetal ultrasound using fully convolutional neural networks. Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS [online].
  6. J. J. Cerrolaza, Y. Li, C. Biffi, A. Gomez, J. Matthew, M. Sinclair, C. Gupta, C. L. Knight and D. Rueckert (2018). Fetal skull reconstruction using deep convolutional autoencoders. Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS [online].
  7. Headlines Craniofacial Support Charity. Top ten research questions in craniosynostosis. Accessed September 2019.

Recommended reading

  • E. Pakdemirli. A preliminary glossary of artificial intelligence in radiology. Acta Radiol Open. 2019;8(7) doi:10.1177/2058460119863379.
  • Eric Topol (2019) Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again. Ingram Publisher Services, New York, United States.
  • Yuval Noah Harari (2017) Homo Deus: A Brief History of Tomorrow. Vintage Publishing, London, United Kingdom.
