
Adapting New Data Management To Health Care

There is obviously a lot of talk about Big Data – data with relatively high Volume, Velocity and Variety. In health care management, the need to handle big data is acute, and it is exacerbated by the Veracity of data – the amount of historical truth about patients and procedures that must be retained over time.

In this article, Charles Boicey explains how UC Irvine Medical Center is using Hadoop as an “adjunctive environment” to its existing enterprise data warehouse. The goal was a near real-time analytics data store, rather than waiting up to 24 hours for the extract-transform-load (ETL) processes to run before data became available in the enterprise data warehouse.

What is interesting is the variety of data they want to access at one time – everything from nurses’ notes in the electronic medical record to lab results arriving from multiple internal and external sources via HL7. Traditional information architects would have to put a lot of thought into how to model that data for a traditional data warehouse using tables and SQL – especially to get optimal performance for load times, retrieval, sharing, merging, mastering, and query efficiency and effectiveness.
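To make that contrast concrete, here is a minimal sketch (my own illustration, not UC Irvine’s actual pipeline) of how two differently shaped records, a nurse’s note and an HL7-derived lab result, could be written as-is to a document store such as MongoDB. The connection string, database, collection, and field names are assumptions made purely for illustration.

    # Illustrative sketch only: two differently shaped clinical records land in
    # one collection with no upfront table design. Connection string, database,
    # collection, and field names are all hypothetical.
    from datetime import datetime, timezone
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    records = client["clinical"]["records"]

    # A free-text nursing note pulled from the electronic medical record
    records.insert_one({
        "type": "nursing_note",
        "patient_id": "12345",
        "recorded_at": datetime.now(timezone.utc),
        "text": "Patient resting comfortably; pain 2/10 after medication.",
    })

    # A lab result parsed from an inbound HL7 feed: a different shape, same store
    records.insert_one({
        "type": "lab_result",
        "patient_id": "12345",
        "recorded_at": datetime.now(timezone.utc),
        "test": "Hemoglobin A1c",
        "value": 6.1,
        "units": "%",
        "source": "external_lab_hl7",
    })

The point is not the specific library but that nothing about the second record’s shape had to be decided before the first one was loaded.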

I don’t think this story by itself is unique – there are lots of interesting use cases for Hadoop. What really caught my attention was a comment by one of the architects that the primary reason for the evolution of their MongoDB/Hadoop data store strategy was to avoid the need for data modeling. I suspect it was also much easier not to deal with all the process around extract/transform/load logic, security, and metadata management. Does this mean the traditional IT approach was a hindrance to the business need? Or was thought given to canonical models and user access security that would have benefited from the collective experience of IT data management?
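For a sense of what “self service” access might look like against such a store, here is an equally hedged sketch of a query that pulls everything recorded for one patient over the last 24 hours, regardless of record type. It reuses the hypothetical records collection and field names from the sketch above.

    # Illustrative query: notes and lab results come back together without joins
    # or a predefined star schema. Field names remain hypothetical.
    from datetime import datetime, timedelta, timezone

    since = datetime.now(timezone.utc) - timedelta(hours=24)
    recent = records.find(
        {"patient_id": "12345", "recorded_at": {"$gte": since}}
    ).sort("recorded_at", -1)

    for doc in recent:
        print(doc["type"], doc["recorded_at"], doc.get("test", ""))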

I think what it says is that Information Management professionals have to embrace the “self service” analysis capabilities now available to business users, and work with them to deliver the business value they need while helping them understand the risks of exposing these rich data stores to many potentially less sophisticated users. At a minimum, everyone stands to gain from a security and data governance strategy focused on accommodating new models for information delivery rather than stifling innovation.

Can we adapt 20 years of information management process to the new paradigm without spoiling all the cool stuff? I think so, and I especially look forward to solving lots of interesting business problems we couldn’t touch in the past.
