How researchers are bringing AI-enabled radiology to rural India

Providing quality health services and screening to rural populations in a nation as large as India is extremely challenging:

  • 67 percent of the Indian population resides in rural areas;
  • 90 percent of medical imaging facilities are in cities;
  • there are only three radiologists per million people in the country.

Educational health programmes have raised awareness of the importance of medical screening across India. This, in turn, has increased workloads for the few available radiologists, who are already overburdened. As a result, radiologists have even less time to perform detailed diagnoses. The medical profession is in a race against time to improve the outlook for all Indians, and Artificial Intelligence (AI) may be the breakthrough it needs.

Enhancing screening with deep neural networks

Project MIRIAD, led by assistant professor Debdoot Sheet, is exploring ways in which deep neural networks (DNNs) can enhance AI-enabled radiological screening techniques to save lives and improve healthcare across the whole of India.

The challenge, Debdoot says, is handling the large-scale diversity across the variety of medical images, including x-rays, Computed Tomography (CT), Magnetic Resonance Imaging (MRI) and Whole Slide Images of Histopathology (WSI). There are also domain-specific characteristics of the image data to contend with, as well as the modality- and organ-specific appearance of lesions and disease. “The number of channels in these images is not always restricted to 1 (greyscale) or 3 (RGB colour). These factors complicate the object detection problem,” says Debdoot.

Despite these challenges, Project MIRIAD’s first major achievement has been the development of a deep neural compression engine for mammograms. The team is also in the advanced stages of benchmarking existing DNNs for extension to mammography and chest x-ray screening.

Deep learning-based compression

Inspired by deep learning-based compression for natural images, Debdoot’s team designed and trained a fully convolutional autoencoder-like model for diagnostically relevant feature-preserving lossy compression of mammograms.
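To make the idea concrete, here is a minimal sketch of a fully convolutional autoencoder of the kind described above, written in PyTorch (the framework the team used). The layer widths, latent depth and input size are illustrative assumptions, not the team’s actual architecture; the encoder’s output feature map is the compact “code” that would subsequently be entropy-coded.

```python
# Minimal sketch of a fully convolutional autoencoder for lossy image
# compression. Layer sizes and latent depth are illustrative assumptions.
import torch
import torch.nn as nn

class CompressionAutoencoder(nn.Module):
    def __init__(self, latent_channels: int = 4):
        super().__init__()
        # Encoder: strided convolutions downsample the image 8x per side,
        # producing a compact feature map (the "code") to be entropy-coded.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, latent_channels, kernel_size=4, stride=2, padding=1),
        )
        # Decoder: transposed convolutions reconstruct the full-resolution image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent_channels, 64, kernel_size=4,
                               stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, kernel_size=4,
                               stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, kernel_size=4,
                               stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

model = CompressionAutoencoder()
x = torch.rand(1, 1, 256, 256)          # one greyscale mammogram patch
reconstruction, code = model(x)
```

Because the network is fully convolutional, the same weights apply to images of any size; training would minimise a reconstruction loss (e.g. mean squared error) between input and output, which is where feature-preserving objectives for diagnostically relevant regions can be introduced.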

Using arithmetic coding to exploit the high degree of spatial redundancy in the features for further high-density code packing (yielding variable bit lengths), the team demonstrated compression factors of >300× (0.04 bpp) on two publicly available digital mammography datasets, evaluated using peak signal-to-noise ratio (pSNR), the structural similarity (SSIM) index and domain-adaptability tests between the datasets.
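The figures quoted above can be unpacked with two simple calculations: bits-per-pixel (bpp) is the size of the compressed bitstream divided by the pixel count, and pSNR measures reconstruction fidelity against the dynamic range of the image. The image dimensions and bitstream size below are made-up example numbers, used only to show the arithmetic.

```python
# Illustrative computation of bits-per-pixel (bpp) and peak
# signal-to-noise ratio (pSNR). Image size and compressed size
# are hypothetical example values, not the paper's data.
import numpy as np

def bits_per_pixel(compressed_bytes: int, height: int, width: int) -> float:
    return compressed_bytes * 8 / (height * width)

def psnr(original: np.ndarray, reconstruction: np.ndarray,
         max_value: float = 255.0) -> float:
    mse = np.mean((original.astype(np.float64)
                   - reconstruction.astype(np.float64)) ** 2)
    return 20 * np.log10(max_value) - 10 * np.log10(mse)

# A hypothetical 4096 x 3328 mammogram squeezed into ~68 KB:
bpp = bits_per_pixel(68_157, 4096, 3328)   # ≈ 0.04 bpp
factor = 12 / bpp                          # ≈ 300x, assuming a 12-bit source

# pSNR on toy 8-bit data (uniform error of 10 grey levels):
orig = np.full((64, 64), 100, dtype=np.uint8)
recon = np.full((64, 64), 110, dtype=np.uint8)
quality = psnr(orig, recon)                # ≈ 28.13 dB
```

Note that the >300× factor is relative to the bit depth of the uncompressed source; digital mammograms are typically stored at more than 8 bits per pixel, which is why 0.04 bpp corresponds to a factor in the hundreds.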

The figure below shows the details of the architecture of the DNN the team utilised for the high-density compression of mammograms.

Figure 1. Technique for high-density compression of mammograms.

Supporting technologies

For training the DNNs, Debdoot’s team used the Intel® AI DevCloud running on the Intel® Xeon® Platinum 8160 and Intel® Xeon® Gold 6128 processors. The networks were implemented on PyTorch* with Intel® Math Kernel Library (Intel® MKL) and Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN) bindings and Intel® Distribution for Python* 3.5.

Intel engineers helped the team fully optimise their code, accelerating the training of the DNNs with mixed precision. Access to a range of software resources and a supportive network of developers was critical to this success.

For developers embarking on similar projects to Project MIRIAD, Getting started with Intel AI DevCloud provides a good introduction to Intel’s optimised software tools and reference platforms.

Opportunities for developers

The architectures and methods for learning-based radiological image compression and efficient radiological image screening have the potential to improve the outlook for all patients taking part in mass screening programmes. Consequently, there are many great opportunities for developers who can utilise AI to aid manufacturers of medical imaging and computer-aided detection and diagnostic (CADx) equipment.
