When she started her practice about two decades ago, Dr Darshana Sanghvi, a radiologist at Mumbai’s Kokilaben Dhirubhai Ambani Hospital, had around 12 to 14 assistants, each of whom did everything manually. Today, her work and the way she does it have completely changed. “I did not even have digital information then,” she gasps. “Mostly, the images that came from the MRI would be captured on films, which would then be put up on the physical film box, and not on a computer. So if there were a hundred images, which would be the case on any given day, someone would have to do everything manually―right from printing the films to collecting and segregating them and putting them on the view box. Then I would see them and dictate the report to a typist. The report would be printed and dispatched. There were at least 12-14 people to do all of this. But now, it has come down to just four of them as everything is done by artificial intelligence (AI).”
Now all they have to do is tell the machine verbally that they need to do an MRI, say, of the knee for trauma. “From there on, everything is automated,” says Sanghvi. “The images are acquired by deep learning algorithms, which, in turn, send them to the servers for post-processing of the data, which was done by us in the past and was extremely time-consuming. This is a step prior to interpretation.”
And it is in the next step―interpretation―that AI truly scores. Essentially, AI interprets and analyses the information and presents the big data in an easy and quick-to-understand format, such as colour codes. For instance, a patient with tuberculosis of the lungs had 50 small lesions in the lungs before treatment. It would be a labour-intensive task for the radiologist to figure out whether the lesions had reduced with treatment. In comes AI, helping radiologists interpret the data quickly and understand if the treatment is working. Moreover, the discrepancies that earlier crept in between individual doctors’ post-processing techniques have now been eliminated.
Sanghvi also recalls how, earlier, in case of emergencies, she had to travel to the hospital at night to dictate the report to the typist. “But now, with all the data being stored on the cloud, we can do everything at home,” she says. “All doctors now have their own voice recognition passwords. I verbally dictate the report to the AI from home, and it captures everything and relays it back to the treating doctor in the hospital.” Sanghvi says radiology is the branch of medicine most affected by AI. “The reason is that the information is already in a digital form, so it helps us―right from image acquisition and post-processing to image interpretation and making the report,” she says.
In the ever-evolving landscape of health care, AI shows immense promise. AI refers to computer systems that mimic human cognitive functions such as learning and problem-solving, with or without human supervision. From diagnostics to surgical precision, it is catalysing a transformation across the entire spectrum of medical care. Machine learning (ML), a subfield of AI, enables machines to learn and make predictions by recognising patterns, supporting rational human decision-making, and it is increasingly being applied in medicine. Deep learning, meanwhile, is an AI method that teaches computers to process data in a way inspired by the human brain. Deep learning models can recognise complex patterns in pictures, text, sounds and other data to produce accurate insights and predictions.
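For readers who want to see what this looks like in practice, the short Python sketch below shows the basic machine-learning loop described above: a model learns patterns from labelled examples and then makes predictions on unseen cases. It uses scikit-learn’s bundled breast-cancer dataset purely as a stand-in and is not any of the clinical systems discussed in this article.

```python
# Minimal illustration of the machine-learning idea described above: a model
# learns patterns from labelled examples, then predicts on unseen cases.
# scikit-learn's bundled breast-cancer dataset is used purely as a stand-in;
# this is not any of the clinical systems discussed in the article.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)            # 30 numeric features per case
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)                            # "learn" patterns from labelled data

predictions = model.predict(X_test)                    # predict on cases the model has not seen
print(f"Held-out accuracy: {accuracy_score(y_test, predictions):.2f}")
```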
Diabetic retinopathy (DR) is one area where AI is being used as a screening tool. According to Dr Gopal Pillai, professor and head of the department of ophthalmology at Amrita Institute of Medical Sciences in Kochi, approximately one third of people with diabetes in India will have DR, and about 10-12 per cent of those will have vision-threatening DR, a stage where the disease has seriously affected the patient’s vision and where failure to treat it in time will result in irreversible vision loss. What hampers early detection is that DR is asymptomatic until the person loses vision completely. “One might be driving, reading, watching TV and doing their activities without even knowing that a time bomb is ticking inside the eye. And because the patient would often come in late, there was no way of early diagnosis,” says Pillai, who is leading a government-sponsored clinical trial network.
Five thousand of the 7,000 patients in a multi-centric registry across India have DR. “Usually 50 per cent of those [5,000] people come to the hospital with loss of vision because there is only a very small window during which we can do something to prevent permanent blindness,” says Pillai. “This is where AI comes into play, and the first is screening.”
Normally, a patient would be required to visit an ophthalmologist, who would dilate the pupils and examine the eye to confirm the presence of DR. “But now, we can eliminate the pupil dilation and simply take a photograph of the eye, and AI can detect whether there is DR or not,” says Pillai. “This would reduce screening time by a huge margin. Currently, we have 25,000 ophthalmologists in India. Of these, about 15,000 will be able to screen the retina for DR. And that is grossly inadequate, because a huge percentage of people are diabetic.”
Diagnosis is another area where AI is playing a key role. “Based on morphological markers of the patient, it tells us exactly what kind of retinopathy the patient suffers from. It also provides us with the grading of DR,” says Pillai. “For instance, proliferative DR would be classified as vision-threatening and require urgent treatment, as against mild DR. Also, deep learning or machine learning can pick up a lot of things that ordinary eyes cannot, especially when there is large data.” As of now, Pillai and his team are trying to explore the possibilities of predicting heart attacks and strokes using the retina.
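To illustrate the kind of grading Pillai describes, here is a minimal Python sketch of a deep-learning classifier that assigns a fundus photograph to one of five DR grades. The network, its weights and the input are generic, untrained stand-ins (a real screening tool would be trained and validated on large sets of graded retinal images); this is not the software used at Amrita or in the trial network.

```python
# A deliberately simplified sketch of deep-learning DR grading. The backbone,
# weights and input below are untrained stand-ins; a real tool would be
# trained and validated on large sets of graded fundus photographs.
import torch
from torchvision import models

GRADES = ["no DR", "mild", "moderate", "severe", "proliferative (vision-threatening)"]

model = models.resnet18(weights=None)                           # generic, untrained backbone
model.fc = torch.nn.Linear(model.fc.in_features, len(GRADES))   # 5-class grading head
model.eval()

# Stand-in for a preprocessed 224x224 RGB fundus photograph.
fundus = torch.rand(1, 3, 224, 224)

with torch.no_grad():
    probabilities = torch.softmax(model(fundus), dim=1).squeeze()

grade = GRADES[int(probabilities.argmax())]
print(f"Predicted grade: {grade} (p={probabilities.max().item():.2f})")
```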
Recently, research published in the British Journal of Ophthalmology described how AI-enabled imaging of the retina’s network of veins and arteries can accurately predict cardiovascular disease and death, without the need for blood tests or blood pressure measurement. The research team built a fully automated AI-enabled algorithm―QUantitative Analysis of Retinal vessels Topology and siZe, or QUARTZ―and used it to develop models that assess the potential of retinal vasculature imaging, combined with known risk factors, to predict vascular health and death. “AI-enabled vasculometry risk prediction is fully automated, low cost, non-invasive and has the potential for reaching a higher proportion of the population in the community because of ‘high street’ availability and because blood sampling or [blood pressure measurement] are not needed,” said the report. Another significant aspect of AI, says Pillai, is picking up systemic diseases from the retina, that is, predicting the occurrence of a disease even 10 or 20 years before it actually occurs.
Likewise, in surgical domains, including the intricate field of spine surgery, a notable advancement is the introduction of robotic assistants designed to work alongside surgeons, says Dr S. Vidyadhara, chairman and head, spine surgery, and consultant, robotic spine surgery, Manipal Hospital Old Airport Road. When combined with augmented reality, CT/MRI images can be superimposed in real time during the surgery, giving surgeons an in-depth 3D view of the surgical site and enhancing their ability to make informed decisions. “This is being used in complex cancer surgeries, where the tumour is hidden inside the organ. With the help of AI, a surgeon can get a clear picture and feedback about the clear margins of the cancerous tumour to be removed,” says Dr Saurabh Patil, consultant urologist and robotic surgeon, Fortis Hospital, Mumbai. “AI also helps in the training of new surgeons by comparing their steps with those of expert surgeons and providing real-time feedback to a novice surgeon.”
Ultimately, the patient benefits are huge when it comes to AI. Dr Aditya Khemka, orthopaedic surgeon at Mumbai’s Bhatia Hospital, talks about an 80-year-old patient with a fracture dislocation of the ankle who visited the hospital on a Sunday morning. The patient was advised surgery by the doctor on call, but the octogenarian dismissed the advice and returned home. “He [the patient] called me in the evening and sent across the X-rays; it was a disastrous picture,” recalls Khemka. “The fracture was pressing on an artery, entirely stopping blood flow to his foot at the time. We then got him admitted and operated on him. He went home two days later. I wish there was AI for diagnosis and planning the treatment to make it easy for the patient to understand and be convinced, and not waste time contemplating surgery. That way the surgery, which he got late at night, would have been over by 2pm, thereby cutting down on chances of further injury. Also, AI can be used more efficiently in surgical decision-making and eliminating risk factors and human-driven errors.” AI-enabled robotics has revolutionised surgery, including knee replacements in orthopaedics, says Khemka.
So, which AI software do hospitals in India use? “None of these software have been officially adopted by any of the hospitals,” says Khemka. “These are available as beta versions for individuals. Once we have a system adopted by hospitals, all doctors can then move to centralised clinical decision-making.”
In its National Strategy for Artificial Intelligence #AIForAll report published in June 2018, the NITI Aayog highlighted how AI in health care can help the country address issues of logistics, accessibility and availability of health care and the shortage of health care professionals. “With most of the private facilities concentrated in and around tier 1 and tier 2 cities, patients have to travel substantial distances for basic and advanced health care services,” the report reads. “The problem is further accentuated by the lack of consistent quality in health care across India. Most of the services provided are individual-driven rather than institution-driven, and less than 2 per cent of hospitals in India are accredited.” This is why health care in the country is primed for intervention by AI-driven solutions, adds the report, as evidenced by the increasing activity from large corporates and startups alike in developing AI-focused health care solutions.
One such company is Siemens Healthineers, which boasts more than 800 AI-related patents. “Siemens Healthineers uses AI in Automatic Landmarking and Parsing of Human Anatomy (ALPHA), organs-at-risk (OAR) contouring in radiation therapy and multi-modality imaging decision support,” says Dileep Mangsuli, executive director and head, Siemens Healthineers Development Centre. “Moreover, the Sherlock supercomputer at our innovation centre in Princeton, New Jersey, carries out more than 1,200 AI experiments every day. This data is also widely used in the Indian market.”
The OAR contouring that Mangsuli mentions is based on deep learning. Organs at risk are healthy tissues or organs close to the target area that needs intervention; irradiating them could potentially affect the treatment plan. The advanced contouring solutions offer reliable and consistent results for treatment planning: more than 95 per cent of the contouring results are clinically usable or require only minor edits. ALPHA is a system designed to recognise anatomical patterns and consists of an algorithm for detecting anatomical landmarks. Another AI-driven augmented workflow solution by Healthineers is AI-Rad Companion, which lessens the load of repetitive tasks while potentially enhancing the accuracy of medical image interpretation. These solutions streamline the daily workflow by automating repetitive tasks and handling high case volumes.
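As a rough illustration of how deep-learning contouring works, the sketch below runs a generic, untrained segmentation network over a stand-in scan slice and assigns each pixel to background or to one of a hypothetical set of organs at risk. It is only an outline of the technique, not Siemens Healthineers’ OAR contouring product.

```python
# A rough outline of deep-learning organ-at-risk contouring: a segmentation
# network labels every pixel of a scan slice as background or as one of the
# organs at risk. The network is generic and untrained, and the organ list and
# input are invented stand-ins; this is not Siemens Healthineers' software.
import torch
from torchvision.models.segmentation import fcn_resnet50

OAR_LABELS = ["background", "spinal cord", "parotid gland"]     # hypothetical organ set

model = fcn_resnet50(weights=None, weights_backbone=None, num_classes=len(OAR_LABELS))
model.eval()

# Stand-in for one preprocessed CT slice, replicated to three channels.
ct_slice = torch.rand(1, 3, 256, 256)

with torch.no_grad():
    logits = model(ct_slice)["out"]          # shape: (1, num_classes, H, W)
    contour_mask = logits.argmax(dim=1)      # per-pixel organ label

for idx, name in enumerate(OAR_LABELS):
    print(f"{name}: {int((contour_mask == idx).sum())} pixels")
```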
“We have 18 cancer speciality hospitals in India where we utilise a range of AI tools for the early detection of cancer,” says Mangsuli. “These tools play a crucial role in patient care by assisting in cancer diagnosis and assessing other critical aspects, such as the patient’s risk of cardiac issues or stroke.”
In cancer care, CHAVI (Comprehensive Archive of Imaging) is key in a country like India that sees more than a million new cases every year. CHAVI is India’s first fully annotated, relational, de-identified cancer image bank, developed jointly by the Tata Medical Center, Kolkata, and IIT Kharagpur. Oncology is a discipline that is heavily dependent on imaging, not only for diagnosis but also for therapy and follow-up. Quantitative analysis of these images may provide additional insights about disease biology beyond the information available from clinical and pathological data. Image banks give researchers access to a large number of images that can be used for such analytical research. Currently, the project covers head and neck cancers, but it will eventually be made open source and available to experts from other institutions.
Earlier, experts from India would have to access datasets from abroad to get a deeper understanding of the disease and the trends therein. This was true for genomics, pathology and radiology. With cancer, the challenge was that the types of cancer those datasets focused on, and the equipment used to collect the data, did not work well for India. Also, there were not enough open-access datasets available for, say, oral or head and neck cancers caused by excessive tobacco chewing, which are more common in India. “So we felt that there was a need to create a dataset in India which is suited to our kind of population and issues,” says Dr Swapnil Rane, associate professor, Advanced Centre for Treatment Research and Education in Cancer, Tata Memorial Centre, Navi Mumbai.
The algorithms the team is developing will use AI on images, particularly in pathology and radiology. In radiology, a lot of data is already digitised but needs to be archived, and even large hospitals do not have a data centre to store it. Pathology, meanwhile, is yet to go fully digital. So the AI algorithms being developed will look at image data and integrate it with clinical, treatment and follow-up data. “So there are two ways now that we can use AI―one is to try and replicate the human expert. So if the pathologist says this is grade 3 or 4 cancer, you take that as the gold standard and tell AI to replicate that result, and if it is 90 per cent accurate, then easy cases can be completely given to AI with some oversight from the pathologist,” says Amit Sethi, professor, department of electrical engineering, IIT Bombay.
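The workflow Sethi outlines can be sketched in a few lines of Python: treat the pathologist’s grade as the gold standard, train a model to replicate it, and let only high-confidence “easy” cases be auto-graded while the rest go back to the pathologist. The data and the 0.90 confidence cut-off below are synthetic, illustrative choices, not the Tata/IIT Bombay pipeline.

```python
# Synthetic sketch of "replicate the human expert, with oversight": train on
# pathologist-assigned grades (the gold standard), auto-grade only the cases
# the model is confident about, and send the rest back to the pathologist.
# Features, labels and the 0.90 confidence cut-off are illustrative choices.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 16))                           # stand-in slide-image features
pathologist_grade = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(features, pathologist_grade, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)             # learn to mimic the pathologist
confidence = model.predict_proba(X_te).max(axis=1)

auto = confidence >= 0.90                                        # "easy" cases handled by AI
agreement = (model.predict(X_te)[auto] == y_te[auto]).mean()
print(f"{auto.mean():.0%} of cases auto-graded; agreement with pathologist on those: {agreement:.0%}")
```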
But the team wants to go beyond that, he says. “We have a wealth of historical data at Tata Medical Center of several lakhs of patients. What we are doing now is to train AI to predict what will happen in three or five or 10 years,” he explains. “So if we apply it to patients today it will give us prognostic information―it will tell us whether treatment is likely to be successful or not. That will give us the insight to apply new treatment protocols, if need be.”
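A simplified way to picture this prognostic use is below: on historical records where the 3-, 5- and 10-year outcomes are already known, train one model per time horizon and apply them to a new patient to estimate the chance of an event at each horizon. The data and models are synthetic stand-ins, not the Tata Medical Center system.

```python
# Synthetic sketch of horizon-based prognosis: on historical records where the
# outcome is already known, train one model per time horizon (3, 5, 10 years)
# and apply them to a new patient. Data and models are illustrative stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
history = rng.normal(size=(2000, 10))                 # stand-in clinical/imaging features
risk = history[:, 0] + 0.5 * history[:, 1]            # hidden "true" risk driving outcomes

horizons = {"3-year": 1.5, "5-year": 1.0, "10-year": 0.5}
models = {}
for name, cutoff in horizons.items():
    outcome = (risk > cutoff).astype(int)             # outcome observed in historical data
    models[name] = LogisticRegression().fit(history, outcome)

new_patient = rng.normal(size=(1, 10))
for name, model in models.items():
    p = model.predict_proba(new_patient)[0, 1]
    print(f"Estimated {name} event probability: {p:.0%}")
```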
AI also helps in initiating treatment as early as possible after the patient is first diagnosed. “Suppose a patient comes in with a diagnosis of lung cancer, we start with a CT scan, biopsy, genomic testing, and then the treatment is decided. [All of this] takes about four to five weeks before the actual treatment begins,” says Rane. “But with AI, we can really [bring it down] to three to four days.”
Rane and Sethi are also working to ensure that AI specifically understands Indian markers. For instance, Rane says carbon pigment is mostly present in lung tumours of Indian patients, owing to pollution and smoking, but not so much in tumour samples of patients from the west. But given that data is at the core of AI, India is at a loss because of its dismal data collection and storage capacity. “Electronic medical records are present in only 15 per cent of large cancer hospitals in India,” says Rane. “Many have no electronic record and do not have the infrastructure to digitise radiology and pathology. So then how do we apply AI in these hospitals [when there is] a lack of data? We are still behind the west in terms of adopting electronic records. We need large Graphics Processing Unit clusters but the investment required for that is [not coming]. We also need higher funding for innovation in India.”
As per a World Economic Forum report published in October 2022, AI expenditure in India is expected to reach $11.78 billion by 2025. The AI in health care market is projected to grow from $14.6 billion in 2023 to $102.7 billion by 2028. The report further states that NITI Aayog has been testing AI applications in primary care for the early detection of diabetes complications, and is currently validating the use of AI as a screening tool in eye care by comparing its diagnostic accuracy with that of retina specialists.
As per the report, in cardiovascular health care, Microsoft’s AI Network for Healthcare and Apollo Hospitals are developing a machine learning model to better predict heart attack risk. Using clinical and lab data from more than 4 lakh patients, the AI solution can identify new risk factors and provide a heart risk score to patients without a detailed health checkup, enabling early disease detection.
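The general shape of such a risk model can be sketched as follows: fit a classifier on routine clinical and lab variables, turn its predicted probability into a 0-100 heart-risk score, and inspect which variables the model relies on most, a rough proxy for “identifying risk factors”. The variables, data and model below are invented for illustration and are not the Apollo-Microsoft solution.

```python
# Synthetic sketch of a tabular heart-risk model: fit a classifier on routine
# clinical/lab variables, convert its predicted probability into a 0-100 risk
# score, and list which variables it leans on. Variable names, data and model
# are invented for illustration; this is not the Apollo-Microsoft solution.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
FACTORS = ["age", "systolic_bp", "hdl", "ldl", "hba1c", "bmi"]
records = rng.normal(size=(5000, len(FACTORS)))                  # stand-in patient records
had_event = (records[:, 0] + records[:, 1] - records[:, 2] > 1).astype(int)

model = RandomForestClassifier(n_estimators=300, random_state=7)
model.fit(records, had_event)

patient = rng.normal(size=(1, len(FACTORS)))
risk_score = int(round(100 * model.predict_proba(patient)[0, 1]))
print(f"Heart risk score: {risk_score}/100")

# Feature importances as a rough proxy for "identifying risk factors".
for name, importance in sorted(zip(FACTORS, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {importance:.2f}")
```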
Apollo Hospitals has also taken initiatives such as AI-based disease progression and risk score models for non-communicable diseases such as cardiovascular disease, diabetes, kidney failure and liver fibrosis. AI, machine learning and deep learning are being used to identify clinically significant abnormalities and active pulmonary tuberculosis from images such as chest X-rays. Apollo EARS (Empirical Antibiotic Recommendation System) leverages AI to give doctors recommendations at the point of care, which, in turn, helps in tackling antibiotic resistance in the long run. Its Clinical Intelligence Engine, built by doctors for doctors, is an intuitive clinical assistant. Its ProHealth programme, powered by a predictive algorithm, assesses a person’s current state of health, forecasts possible health risks and customises each health check.
According to Dr K. Hariprasad, president, hospitals division, Apollo Hospitals Enterprise Ltd, AI in health care is a multifaceted issue. There are many challenges and limitations in building clinical AI solutions that can have a real impact on health care. “The requirements and preferences of end users, including physicians, patients and policymakers, are not always met by AI solutions,” he says. “Low adoption rates and ineffective solutions may arise from this. There is frequently a lack of integration between AI solutions and current workflows, standards and health care systems. This may limit the impact of AI solutions and create implementation barriers.”
Despite these issues, there is no denying that AI can work wonders in health care, especially in India. But that does not mean it will replace the human touch. AI can be viewed, at best, as a companion doctor, offering an additional opinion based on data-driven insights. “It is a tool that every medical professional should embrace and receive training in,” says Vidyadhara of Manipal Hospital. “Rather than replacing human expertise, AI serves as a collaborative partner that unlocks possibilities beyond our imagination. Imagine administering drugs with nanogram precision or nanobots targeting tumours! The future of health care is one where technology and human expertise join forces to achieve remarkable feats.”