To improve patient care while reducing costs, healthcare organizations must treat big data as a strategic asset that allows for better decision-making and performance.
However, to rely on this asset, care providers need to adopt and master the software tools and techniques that can turn these large data sets into meaningful insights.
In this post, we will discuss the potential and characteristics of big data analytics in healthcare, describe the steps for implementing big data solutions, and explore the success stories of data-driven healthcare organizations.
The Promise of Healthcare Big Data Analytics
McKinsey estimates that big data analytics accounts for $450 billion, or 17% of total US healthcare spending, powering the industry at every level, from a physician’s office to an entire health system. According to IBM, here are just a few of the potential benefits of enhanced data and analytics for healthcare organizations:
- Identifying inefficient treatments and processes, and unveiling better alternatives
- Recognizing patterns such as patients at risk of readmission, or the most resource-intensive diseases
- Driving engagement by providing patients with the insights they need to make better decisions and thus manage their health more effectively
- Identifying environmental and lifestyle factors that influence readmission rates
- Examining vitals by gathering data from wearables and at-home health monitors to detect abnormalities in time
- Aggregating clinical, financial, and operational data to analyze the performance and outcomes of a department, a clinic, or an entire health system
Understanding the 4 Vs of Big Data in Healthcare
To better understand medical big data, let’s take a closer look at health information in the following four dimensions.
Volume. According to an EMC report, the amount of worldwide healthcare data totaled 153 exabytes in 2013 and is projected to reach 2,300 exabytes by 2020, growing at roughly 48% annually.
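As a quick sanity check on that projection, compounding 153 exabytes at 48% per year over the seven years from 2013 to 2020 lands close to the quoted 2,300-exabyte figure:

```python
# Sanity-check the EMC projection: 153 EB in 2013 growing ~48% per year through 2020.
start_eb = 153
growth = 1.48
years = 2020 - 2013

projected = start_eb * growth ** years
print(round(projected))  # ~2380 EB, in line with the ~2,300 EB projection
```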
Variety. Health data comes structured, semi-structured, and unstructured, challenging organizations to discover value through a combination of these data forms. Existing healthcare data includes medical records, handwritten nurse and doctor notes, paper prescriptions, radiology images, and biometric sensor readings, to name just a few.
Velocity. Velocity stands for the speed at which data is generated and analyzed, which constantly increases with the proliferation of regular at-home monitoring, including daily glucose, blood pressure, EKG and other measurements, as well as in-hospital real-time data collected from bedside heart monitors, operating room monitors, and more.
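A minimal sketch of what real-time screening of such a feed might look like, assuming a hypothetical stream of timestamped heart-rate readings from a bedside monitor and textbook resting-rate bounds:

```python
# Minimal sketch of real-time vitals screening, assuming a hypothetical
# stream of (timestamp, heart_rate) readings from a bedside monitor.
NORMAL_HR = (60, 100)  # typical resting heart-rate bounds, beats per minute

def flag_abnormal(readings, low=NORMAL_HR[0], high=NORMAL_HR[1]):
    """Yield readings that fall outside the normal range as they arrive."""
    for timestamp, hr in readings:
        if hr < low or hr > high:
            yield timestamp, hr

stream = [("09:00", 72), ("09:01", 118), ("09:02", 64), ("09:03", 45)]
print(list(flag_abnormal(stream)))  # [('09:01', 118), ('09:03', 45)]
```

Production systems layer far more sophisticated models on top, but the velocity challenge is exactly this: the check must keep pace with the stream, reading by reading.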
Veracity. This characteristic sets the standard for healthcare data quality – that is, how credible and accurate data is – so that it helps professionals to improve decision-making while avoiding medical errors. Think of inaccurately transcribed prescription handwriting.
Now, since in healthcare the 4 Vs are closely interconnected, you can picture the challenge: enormous amounts of disparate data are to be collected and analyzed correctly in real time. Experts, however, have little consensus on how dramatic the challenge really is. For example, Rajib Ghosh, who focuses on IT-enabled sustainable healthcare delivery in the US, believes that healthcare data is yet to become a big data domain:
Do digitized healthcare data sets have those attributes [the 4 Vs]? In most cases the answer is no… Healthcare is very episodic in nature and therefore relatively low in volume and velocity. When a patient visits a doctor, a new encounter record gets created. Patient’s vital signs are recorded; allergies, symptoms and prescriptions are created. Once the episode is over, a billing record is generated, and if the patient is insured a claim is sent to the clearinghouse for submission to the patient’s insurance company. If a patient does not come back to see the doctor for the rest of the year or get admitted to a hospital for disease exacerbation, no more data for the patient gets added to any data set.
Actionable Methodology for Big Data Analytics Adoption
Scholars W. and V. Raghupathi outlined a practical methodology for the adoption of big data analytics in a healthcare organization. There are four important stages to consider.
Step 1. Concept statement. The organization identifies the need for big data analytics and describes the project goals.
Step 2. Proposal. Based on the approved concept statement, the organization’s stakeholders sit down with the data analytics team to discuss the problem and decide whether a big data analytics approach is viable, given that it is costlier than traditional business intelligence methods. At this stage, the organization should clearly see the trade-offs in terms of costs, alternatives, and scalability.
Step 3. Methodology. Using the concept statement, the project team develops propositions, identifies variables and data sources, and decides on the data analytics tools and techniques to be applied.
Step 4. Deployment. Once the system is implemented, the team tests and validates the models as well as the findings they bring forth, and presents them to the stakeholders for evaluation and action. Based on the feedback, the team goes on to fine-tune the software to improve accuracy and minimize the risk of failure.
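The validation in Step 4 can be pictured with a toy hold-out check: keep back a slice of labeled historical data that the model never saw, and measure how often its predictions match what actually happened. The threshold model and records below are hypothetical:

```python
# Toy validation sketch for Step 4: score a hypothetical model against
# held-out (risk_score, actually_readmitted) records it was not trained on.
def accuracy(model, holdout):
    """Fraction of held-out (features, label) pairs the model predicts correctly."""
    correct = sum(1 for features, label in holdout if model(features) == label)
    return correct / len(holdout)

model = lambda score: score >= 0.5  # hypothetical threshold classifier
holdout = [(0.9, True), (0.2, False), (0.7, True), (0.6, False)]
print(accuracy(model, holdout))  # 0.75
```

Results like this feed the stakeholder review: if the accuracy is too low, the team loops back to fine-tune the models before acting on their findings.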
Healthcare organizations are already implementing big data analytics to gain deep and accurate insights into clinical performance, population health, and care effectiveness. IBM’s white paper describes several inspiring examples of health data analytics in action.
Premier is the largest US healthcare alliance with over 2,700 hospitals and health systems, 90,000 non-acute care facilities, and 400,000 physicians. The alliance built the largest clinical, financial, supply chain and operational comparative database that provides information on clinical outcomes, resource utilization, and transaction-level costs. By improving processes and outcomes through informed strategic decisions, Premier has saved over 29,000 lives and reduced spending by almost $7 billion.
North York General Hospital is a 450-bed community teaching hospital in Canada. It adopted a scalable, real-time analytics solution that provides full visibility into the organization’s clinical, administrative and financial performance, consolidating data from over 50 collection points across multiple internal systems. The hospital now boasts a better understanding of its operations and improved patient outcomes.
Columbia University Medical Center is using advanced analytics of complex physiological data to proactively treat brain-injured patients. This helps detect severe complications 48 hours earlier than traditional, reactive methods for patients who have suffered a bleeding stroke from a ruptured brain aneurysm.
In this post, we saw how powerful big data can be in helping healthcare organizations reduce costs and improve outcomes through better-informed decisions.
We also unveiled the 4 Vs that describe the challenge behind big data in healthcare: volume, variety, velocity, and veracity.
Finally, we laid out a methodology for implementing a big data solution and discussed several successful implementations in action. Now, whether we are health professionals or big data experts, our job is to translate these enormous health data sets into big gains, thus building a smarter and better healthcare industry.