
The Future of Sonography: Understanding the Limitations of AI in Ultrasound

Sophie Wieser, MD

Fri, 14/07/2023

The advent of AI technology in the field of medical imaging has sparked considerable debate about the future of specialists in the area.

While concerns about the potential redundancy of human professionals persist, it is vital to keep abreast of AI developments and their possible implementations in ultrasound.

This article sums up the unique challenges that stand in the way of a complete replacement of human professionals by AI, while emphasizing the need to overcome these obstacles to fully harness ultrasound's potential in clinical practice. Ignorance breeds fear! So stay informed and confidently navigate the impact of AI technology on your professional landscape.

Small white robot with a stethoscope in hand.

AI has already taken its place in CT and MRI

AI technology has already made significant advances in most areas of medical imaging, particularly in cross-sectional imaging such as CT and MRI. Computer-assisted approaches date back to the 1980s, and deep-learning systems, which rely on large datasets ("big data"), have since found success in radiology thanks to the reproducibility and high resolution of images generated by CT scans, MRI scans, and nuclear medicine imaging. AI is used for tasks such as computer-assisted detection (or computer-assisted diagnosis, CAD), lesion segmentation, disease process monitoring, and even automated diagnosis. The high sensitivity of these systems was expected to improve diagnostic accuracy, which was achieved to some extent. However, their low specificity can lead to high false-positive rates, especially in the detection of malignant tumors, resulting in unnecessary additional diagnostic steps. It is therefore crucial to keep human professionals at the center of the diagnostic process, with the algorithms serving as complementary tools that help avoid overlooked pathologies.

AI in Ultrasound – Unique challenges

Ultrasound presents distinct challenges that make the integration of AI more complex. Unlike in other imaging modalities, the operator must choose which areas and organs to scan, leading to variations in the completeness and density of the available data. If a specific area of interest is not scanned, AI cannot compensate for the missing data. Ultrasound exams also involve an interactive component: sonographers work with patients to enhance organ visualization (through inspiration, positioning, and various maneuvers). Additionally, ultrasound exams encompass clinical dimensions beyond image production, so integrating clinical information into deep-learning algorithms would be essential for their effectiveness. Perhaps the greatest challenges for AI in ultrasound are operator dependency and image variability. Both image generation and diagnosis are highly operator-dependent, and images vary not only between operators but also between ultrasound manufacturers and machine settings.

Furthermore, the interpretation of ultrasound images can vary significantly between operators, introducing potential biases and errors into the "ground-truth" data and expert labels needed to train deep-learning models. This lack of standardization becomes a bottleneck for AI implementation and represents a major obstacle to developing accurate deep-learning algorithms for clinical applications in ultrasound.

The need for “big data”

AI performance depends not only on image quality but also on image quantity. When it comes to ultrasound, however, data are notably scarce compared to other modalities, which contributes to the slower progress of AI-powered ultrasound. The training of deep-learning algorithms often relies on retrospective, limited datasets from individual medical centers or vendors. Acquiring comprehensive datasets can be challenging due to proprietary restrictions and ethical concerns surrounding large patient datasets. The prospective creation of large, validated imaging datasets is therefore crucial.

In the meantime, transfer learning and fine-tuning techniques allow pre-trained deep-learning systems to be adapted to new ultrasound image inputs, reducing the amount of labeled ultrasound data required.
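As a toy illustration of this idea, the sketch below adapts an ImageNet-pre-trained network to a hypothetical two-class ultrasound task in PyTorch. The model choice, the binary normal/pathological labels, and the training step are illustrative assumptions, not a description of any system mentioned in this article.

```python
# Minimal transfer-learning sketch: adapt a pre-trained CNN to ultrasound
# frames. The two-class task (normal vs. pathological) is a hypothetical
# example, not taken from the article.
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pre-trained on natural images (ImageNet).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained feature extractor so only the new head learns.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for the new ultrasound task.
model.fc = nn.Linear(model.fc.in_features, 2)

# Fine-tune only the new head on labeled ultrasound images.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def training_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One gradient step on a batch of ultrasound frames (N, 3, 224, 224)."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone and training only the new head is the simplest form of fine-tuning; with more labeled ultrasound data, deeper layers can be unfrozen and trained as well.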

What lies ahead?

Although AI technology has the potential to reshape the field of ultrasound, the unique challenges and operator dependency of ultrasound exams present barriers to full automation. Current applications focus on operator guidance to standardize image acquisition, often with real-time feedback on image quality. AI tools can also automatically select images from previously acquired loops, for example in speckle-tracking analysis. Another example in echocardiography is AI-assisted quantification of ejection fraction (auto-EF), which is already available in many ultrasound machines with relatively high accuracy. In general, much remains to be done to ensure maximal standardization of ultrasound imaging data. In addition, large, well-curated image databases shared across vendors and the PACS (Picture Archiving and Communication Systems) of different healthcare facilities might greatly improve the accuracy of deep-learning algorithms.

The future integration of AI should be seen as a tool that enhances the capabilities of sonographers rather than a replacement for their expertise. Collaborative efforts are needed to standardize ultrasound examinations, to improve operator qualifications through training that includes AI-based feedback, and to develop accurate AI algorithms that provide a second opinion on pathology detection and diagnosis, so that AI and human professionals work together synergistically in the future of sonography.

A human hand and robot hand touch fingers.

Do not hesitate to explore the realm of AI in ultrasound. It is wise to stay informed, engage in critical thinking, and dive into the possibilities that lie ahead rather than turning a blind eye to them.

In the next parts, we will take a closer look at current and future applications, discuss how AI impacts education, and, last but not least, examine the ethical aspects of its use and misuse.


Yours,

Sophie Wieser, Thomas Binder & the team of 123sonography