Machine learning, big data, artificial intelligence: not long ago you would have encountered these terms only at IT conferences, but today you are just as likely to hear them from doctors and nurses. Healthcare became one of the first non-IT domains to adopt these technologies to analyze patient data effectively, streamline disease diagnostics, optimize medical services, and improve standards of care.
Convolutional neural networks, support vector machines, and discriminant analysis algorithms are helping doctors predict and treat even the deadliest diseases of the modern world. Beyond diagnostics, machine learning in healthcare helps optimize other medical activities such as administrative workflows, remote patient consulting (so-called telehealth), and the structuring of healthcare data.
Even though we are only witnessing the first steps of AI implementation in healthcare, the numbers prove its astonishing prospects. According to recent industry research, Big Data-powered AI systems will generate around $6.6 billion in global healthcare industry revenue in 2021, a tenfold increase from 2014. What’s more, specialists from Accenture expect AI and machine learning in healthcare to save the U.S. economy at least $150 billion annually by 2026.
AI health market growth dynamics 2014-2021. Source: Accenture analysis
So, let’s have a look at four important healthcare operations that AI can already handle better than human doctors.
Medical image analysis still remains the most effective method to identify cancer, strokes, lung diseases, and more. However, according to Medical Futurist, more than half of the world’s population currently has no access to medical imaging. That’s because the specialized equipment is expensive and requires a team of highly trained professionals to operate it and interpret the results.
Moreover, manually analyzing thousands of medical images is no easy task for human radiologists and pathologists. Spotting a single telltale pixel among thousands of 40X-magnified MRI/CT scans and X-rays still comes down to a doctor’s skill and attention.
Recent research shows that specially trained deep learning algorithms can detect cancer cells with an average accuracy of 96%, exceeding the results of top human pathologists.
Unlike humans, neural networks can quickly learn from enormous image datasets. For example, IBM acquired more than 30 billion medical images to train its custom IBM Watson technology for early cancer identification.
Nowadays, the combined power of DL algorithms and pathologists’ expertise raises the effectiveness of medical image analysis to 99.5%. By leveraging deep learning and computer vision, AI-powered healthcare systems can enable effective screening and virtually guarantee that no diseased cells are missed.
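The combination described here is essentially a triage pipeline: the model pre-screens every slide, and a pathologist reviews anything the model flags. A minimal sketch of that logic is below; the scores, threshold, and slide IDs are invented for illustration, and a real system would replace the scoring stub with CNN inference.

```python
# Hypothetical triage sketch: an AI model pre-screens slides and a
# pathologist reviews anything the model flags. All values below are
# made-up illustration data, not output of a real model.

def ai_score(slide):
    """Stand-in for a deep learning model's malignancy probability."""
    return slide["model_prob"]  # in a real system: CNN inference on the image

def triage(slides, threshold=0.2):
    """Split slides into a 'pathologist review' queue and a 'routine' queue.

    A deliberately low threshold trades specificity for sensitivity,
    so suspicious slides are rarely screened out automatically.
    """
    review, routine = [], []
    for slide in slides:
        (review if ai_score(slide) >= threshold else routine).append(slide["id"])
    return review, routine

slides = [
    {"id": "scan-001", "model_prob": 0.93},  # clearly suspicious
    {"id": "scan-002", "model_prob": 0.05},  # clearly benign
    {"id": "scan-003", "model_prob": 0.31},  # borderline -> human review
]
review, routine = triage(slides)
print(review)   # ['scan-001', 'scan-003']
print(routine)  # ['scan-002']
```

The low threshold is the point: the machine's job is to never miss, while the human's job is to confirm, which is how the combined sensitivity can exceed either working alone.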
Medical image analysis systems are also far more cost-effective than a team of highly qualified radiologists, which makes effective medical image analysis available even in the poorest regions.
Let’s have a look at a few Human vs. Machine cases proving that AI-enabled algorithms have already outstripped human doctors on their own playing field.
Robot-assisted surgery seems to be the most promising AI-powered technology in the healthcare business: according to a recent Accenture report, it will save the US healthcare industry around $40 billion annually by 2026.
Potential annual benefits for Top 10 AI applications by 2026. Source: Accenture analysis
Surgeons have been using remote robotic systems to facilitate complex surgeries for quite a while. For example, the best-known robotic system, da Vinci®, is installed in more than 4,000 institutions around the world and has been used to save millions of lives. According to Accenture’s analysis, robot-assisted minimally invasive surgery reduces the length of hospital stays by 21%.
But what picture pops into your head when you hear “robotic surgery system”? Maybe you imagine a robot operating on a patient autonomously while human surgeons play mahjong, drink coffee, and occasionally glance at the monitor. Unfortunately, this level of automation is yet to be achieved. Powerful as it is, the da Vinci Surgical System is essentially a surgeon’s mechanical arm, not a fully fledged medical robot.
As researchers note, a fully autonomous surgery system will require additional crucial steps. To become even semi-autonomous, it must leverage the power of big data, ML/DL models, and computer vision. And that’s exactly what AI engineers and data scientists have been working on for a long time.
The idea of supervised autonomy (human surgeons delegate some tasks to the robot and supervise their fulfillment) was first fully realized by a group of researchers in 2014. They created the Smart Tissue Autonomous Robot (STAR), a semi-autonomous ML-powered robotic surgery system that immediately became a true star of laboratories and operating rooms around the world. STAR was the first of its kind, but does this robotic surgery system have any advantages over professional flesh-and-blood surgeons?
STAR leverages an integrated system of pre-trained ML algorithms, CV-enabled NIR/RGB cameras, and a set of electrosurgical tools to find a tumor, excise it accurately, and stitch the cuts. In last year’s experiments on artificial tissue and an anesthetized pig, STAR not only performed the surgery flawlessly but also outperformed human surgeons: it made more precise cuts, damaged less surrounding tissue, and produced stitches that were more regular and leak-resistant.
STAR is still not fully autonomous: it needs the tumor to be marked with infrared markers. But STAR’s inventors promise that in just a few years robotic surgery systems will extract all the information they need from CT, MRI, and other medical images. Robots obviously cannot replace human surgeons entirely in the near future, but they will soon evolve from a “mechanical hand” into fully fledged surgical assistants.
We live in the age of sophisticated healthcare devices, from patient monitors and smart wearables to CV-powered scanners and internal biomonitors. All these devices continuously generate a flow of patient data, 99% of which goes unused because it cannot be processed or analyzed with standard software. Finding a way to make this data work for the patient’s benefit would help us control chronic diseases and prevent many deadly conditions, such as heart attacks and strokes.
Machine learning, cloud computing, and big data are the three keystones of advanced patient data analytics in healthcare. The virtually unlimited resources of cloud computing enable the effective storage and non-stop processing of terabytes of patient data generated by remote monitoring devices.
Big Data algorithms structure and unify the data for further ML analysis. Then machine learning takes the stage: it filters the data to generate important insights in real time and automatically sends the most relevant information to the doctor. Moreover, ML/DL models help personalize treatment by monitoring a patient’s health and carefully calibrating medication dosages when necessary.
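The filtering step can be pictured as a rolling-baseline check on each vital sign: most readings are discarded as normal, and only meaningful deviations reach the clinician. The sketch below is a deliberately simple stand-in for a trained model; the window size, 25% deviation threshold, and the heart-rate stream are illustration values only.

```python
from collections import deque

# Hypothetical monitoring sketch: keep a short rolling window per vital
# sign and raise an alert when a new reading drifts far from the recent
# baseline. A production system would use trained ML models instead of
# this fixed-threshold rule.

class VitalsMonitor:
    def __init__(self, window=5, deviation=0.25):
        self.window = window
        self.deviation = deviation   # 25% drift from baseline triggers an alert
        self.history = {}            # vital name -> recent readings

    def ingest(self, vital, value):
        """Return an alert string, or None if the reading looks normal."""
        readings = self.history.setdefault(vital, deque(maxlen=self.window))
        alert = None
        if len(readings) == self.window:
            baseline = sum(readings) / self.window
            if abs(value - baseline) > self.deviation * baseline:
                alert = f"{vital}: {value} deviates from baseline {baseline:.1f}"
        readings.append(value)
        return alert

monitor = VitalsMonitor()
stream = [("heart_rate", v) for v in (72, 74, 71, 73, 75, 118)]
alerts = [a for a in (monitor.ingest(name, v) for name, v in stream) if a]
print(alerts)  # the jump to 118 bpm is the only reading that gets through
```

Everything else in the stream is silently absorbed into the baseline, which is exactly the "filter out data, forward only relevant insights" behavior described above.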
Let’s witness another round of the Human vs. Machine battle to understand how machine learning solutions in healthcare help optimize 24/7 patient monitoring in the intensive care unit, the most critical part of the hospital.
The intensive care unit (ICU) is a place where human lives heavily depend on real-time data analysis. Today, ICU equipment works in isolation: data sources and flows are not integrated, so all calibrations and adjustments must be made manually. Effective treatment therefore demands the ICU team’s undivided attention, yet the staff simply cannot keep watch over every patient around the clock.
The Autonomous Healthcare company developed a solution that integrates machine learning, big data algorithms, and mathematical modeling to analyze data from all the monitors and draw a comprehensive picture of a patient’s health. To keep the patient in optimal condition, the ML-powered software can also autonomously adjust medical equipment and medication dosages.
Basic workflow of Autonomous Healthcare app. Source: www.autonomoushealthcare.com
According to recent healthcare industry studies, doctors and nurses spend around 37% and 71% of their working time, respectively, on non-patient-care activities.
Many doctors and nurses still handwrite chart notes, prescriptions, clinical notes, and medical journals, and then have to transcribe the handwriting into a machine-readable format. By automating these processes, we could dramatically improve clinicians’ efficiency, optimize their workflow, and reduce the costs of patient care.
Medical personnel in some hospitals still use stickers and blackboards for planning
Specialists from Accenture say that virtual personal healthcare assistants (VPHAs) and telehealth solutions may be the answer. VPHAs and administrative workflow assistants are predicted to save the US healthcare industry at least $38 billion each year by 2026.
The standard workflow for virtual assistants is the following:
AI-powered virtual nursing assistants are able to give patients round-the-clock access to medical support and health monitoring. According to Syneos Health Communications, 64 percent of patients reported they would be comfortable with AI virtual healthcare assistants.
Another way to optimize administrative work in hospitals is to implement NLP algorithms. Voice-to-text and handwriting-to-text healthcare mobile apps can automate non-patient-care activities like writing chart notes, prescribing medications, and ordering tests.
Doctors and nurses use NLP-powered healthcare apps to transform all types of clinical notes into machine-readable data for later ML analysis. This enriches actionable healthcare data and provides medical specialists with unprecedented time-saving capabilities.
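At its simplest, "transforming clinical notes into machine-readable data" means pulling structured fields out of free text. The toy sketch below uses a regular expression as a stand-in for a trained NLP pipeline; the pattern, drug names, and note text are all hypothetical illustration data.

```python
import re

# Hypothetical sketch of turning a free-text clinical note into structured,
# machine-readable records. Real systems use trained NLP models; here a
# regular expression stands in for medication/dosage extraction.

DOSE_PATTERN = re.compile(
    r"(?P<drug>[A-Z][a-z]+)\s+(?P<amount>\d+)\s*(?P<unit>mg|ml)"
)

def structure_note(note):
    """Extract {drug, amount, unit} records from a dictated note."""
    return [
        {"drug": m["drug"], "amount": int(m["amount"]), "unit": m["unit"]}
        for m in DOSE_PATTERN.finditer(note)
    ]

note = "Patient stable. Continue Metformin 500 mg twice daily; start Lisinopril 10 mg."
print(structure_note(note))
# [{'drug': 'Metformin', 'amount': 500, 'unit': 'mg'},
#  {'drug': 'Lisinopril', 'amount': 10, 'unit': 'mg'}]
```

Once notes are reduced to records like these, they can be stored, queried, and fed into the ML analysis the paragraph above describes.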
Let’s have a look at a few Human vs. Machine cases proving the tremendous progress of AI in this field.
Computers and mobile apps won’t replace doctors in the near future, because healthcare operations demand common sense, intuition, and experience to interpret data appropriately. Moreover, human doctors can analyze non-verbal patient data, such as behavior, breath odor, and the like.
However, computers already deliver great value in some healthcare fields, like medical image analysis and patient data processing. Just five years ago, even the most sophisticated medical equipment could serve only as a tool for doctors to operate. The implementation of AI technologies, big data algorithms, and machine learning has shifted that paradigm. And mark our words: in another five years, human doctors will entrust some major healthcare operations to AI-powered assistants.
Since 2005, Oxagile has built profound expertise in AI, machine learning, and the healthcare domain. We empower medical solutions with ML to enable high performance, ultimate data protection, and compliance with industry standards like HIPAA, HITECH, DICOM, and more. If you face a challenging healthcare project, let’s discuss how we can help.