Saturday, November 23, 2013

What's next for EHRs

On Nov 14, the HIT Policy Committee sent recommendations to the HIT Standards Committee on three key areas for future EHR capability:  query for a patient record, provider directory management, and data portability and migration.  An article on the recommendations can be found here.

These recommendations were the result of many months of deliberation by the Information Exchange Working Group, of which I have the privilege of being Chair.  These three functional capabilities are very important because they address needs that are critical to health care delivery but won't be adequately met by the market on its own.

query for a patient record
Meaningful use has approached interoperability in a deliberate and methodical fashion.  Stage 1 focused on adoption of EHRs, and routinizing use of HIE capabilities that already existed in the market (primarily eprescribing and lab results delivery).  Stage 2 took it one step further to move the market to adoption of "push" capabilities among providers and between providers and patients.  The new recommendation on "query" takes the final step toward enabling "pull" or "query" functions among providers.  While it will still take many years for the market to create the business practices and infrastructure needed to support seamless inter-connectivity among all providers and patients, the "query" requirement will finally make EHRs fully capable of serving as the building blocks of that larger interoperability vision.
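The push/pull distinction is easy to sketch in code.  Everything below is an invented illustration -- the stand-in classes (FakeTransport, FakeEndpoint) are not any real Direct or query standard, just the minimal shapes of the two interaction patterns:

```python
class FakeTransport:
    """Stand-in for a Direct-style secure messaging channel (invented)."""
    def __init__(self):
        self.outbox = []

    def send(self, to, payload):
        self.outbox.append((to, payload))


class FakeEndpoint:
    """Stand-in for a remote EHR that can answer record queries (invented)."""
    def __init__(self, records):
        self._records = records

    def query(self, demographics):
        # Return indices of records matching the requested patient.
        return [i for i, r in enumerate(self._records)
                if r["patient"] == demographics["name"]]

    def retrieve(self, index):
        return self._records[index]


def push_summary(document, recipient_address, transport):
    # "Push": the sender already knows the recipient and initiates the transfer.
    transport.send(to=recipient_address, payload=document)


def pull_records(demographics, endpoint):
    # "Pull": the requester initiates, asking the remote system
    # "do you have records for this patient?" and fetching any matches.
    return [endpoint.retrieve(i) for i in endpoint.query(demographics)]


transport = FakeTransport()
push_summary({"type": "discharge summary"}, "dr.jones@direct.example.org", transport)

endpoint = FakeEndpoint([{"patient": "Pat Doe", "type": "med list"}])
found = pull_records({"name": "Pat Doe"}, endpoint)
```

The asymmetry is the whole point: push requires the sender to know where to send, while pull requires infrastructure that can locate and release records on request -- which is why "query" is the harder, later step.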

provider directories
This is mostly a "clean-up" recommendation from Stage 2.  As the HISP market starts to take shape based on the Direct requirements of Stage 2 MU, a clear obstacle to more seamless integration of HISPs is the lack of standards for provider directory transactions -- being able to look up a provider and his/her security credentials from one system to another.  This recommendation will enable one EHR system to discover a provider, their routing address, and their security credentials, and will also enable EHR systems (or standalone provider directories) to respond to such electronic provider directory searches.
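To make the transaction concrete, here is a toy in-memory sketch of what a directory lookup returns -- a provider's routing address and security credentials.  The class, field names, and in-memory design are all my own invention for illustration; the actual recommendation concerns standardized, federated electronic queries between systems, not any particular data structure:

```python
from dataclasses import dataclass


@dataclass
class ProviderEntry:
    name: str
    direct_address: str   # routing address for Direct messages
    cert_pem: str         # public certificate (the security credentials)


class ProviderDirectory:
    """Toy in-memory directory; a real deployment would be a federated,
    standards-based service that other EHRs and HISPs query electronically."""

    def __init__(self):
        self._by_address = {}

    def register(self, entry):
        self._by_address[entry.direct_address] = entry

    def lookup_by_name(self, name):
        # Discover a provider's routing address and credentials by name.
        return [e for e in self._by_address.values() if e.name == name]


directory = ProviderDirectory()
directory.register(ProviderEntry(
    name="Dr. Jane Smith",
    direct_address="jsmith@direct.examplehospital.org",
    cert_pem="-----BEGIN CERTIFICATE-----..."))

matches = directory.lookup_by_name("Dr. Jane Smith")
```

The point of standardization is that the lookup and the response format are the same no matter which EHR or standalone directory is answering.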

data portability and migration
I personally know of someone whose doctor changed EHR systems only to find that the medical records got matched to the wrong patients during the migration.  Imagine the disastrous consequences that could result from such errors!  Current market predictions are that 20-30% of providers will be changing their EHR systems in the next few years, for a variety of reasons.  Data migration -- the ability to transfer data from one EHR to another -- will thus become an increasingly important issue in the market.  As EHR systems and EHR users enhance their ability to apply quality and decision support tools to clinical data, there are important safety and quality risks to having incomplete and/or error-filled data migrations from one system to another.  Data portability refers to the transfer of data from one EHR system to another to support a patient's desire to change physicians, for example.  It is similar in many ways to the data migration need, which is why we included this use case in our recommendation.
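A minimal sketch of the kind of safeguard a migration tool might apply to catch the wrong-patient problem described above.  The matching rule and field names here are my own invention, not part of the recommendation -- real patient matching is far more sophisticated -- but the principle is the same: flag doubtful linkages for human review instead of silently attaching records to the wrong patient:

```python
def _norm(s):
    """Normalize a name for comparison: lowercase, strip all whitespace."""
    return "".join(s.lower().split())


def demographics_match(source, target):
    # Conservative linkage rule (invented for illustration): require an exact
    # date-of-birth match plus a normalized last-name match before attaching
    # a migrated record to a patient in the new system.
    if source["dob"] != target["dob"]:
        return False
    return _norm(source["last_name"]) == _norm(target["last_name"])


def audit_migration(migrated_pairs):
    # Return the (source, target) pairs that fail the check and need human
    # review, rather than letting them through silently.
    return [(s, t) for s, t in migrated_pairs if not demographics_match(s, t)]


pairs = [
    ({"last_name": "Doe", "dob": "1970-01-01"},
     {"last_name": "Doe", "dob": "1970-01-01"}),   # clean match
    ({"last_name": "Doe", "dob": "1970-01-01"},
     {"last_name": "Roe", "dob": "1970-01-01"}),   # suspect linkage
]
suspect = audit_migration(pairs)
```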

There is a balance that needs to be struck between the scope and specificity of government regulation, on the one hand, and the strong desire and need for market flexibility and innovation, on the other.  We already have examples of where this can go awry.  I believe that these recommendations are judicious in covering only areas that are important to society and that also won't get fixed by the market on its own.

Monday, November 11, 2013

More tales of health care cost and quality

An interesting article in the Wall Street Journal goes through the pros and cons of concierge medicine.  I’ll admit, my view of concierge medicine has been largely negative up until now.  That probably has to do with my being as confused as society is about whether we should treat physicians as if they have a special calling (saint-like) or as if they are business people with special skills (pro athlete-like).  Anyway, this article does convincingly make the case that concierge medicine isn’t just rent-seeking behavior but is actually value-creating in the economist’s sense for some people.  And ironically ObamaCare may have actually shored up the business case for concierge medicine.

The more interesting piece of the article was actually a tangential reference they made to a study from North Carolina State University.  The study apparently reviewed the expenditures of the patients of a particular concierge practice and found that the longer office visits allowed by the concierge business model led to health improvements which ultimately decreased out-of-pocket payments (presumably through lower utilization of something – office visits?  medications?)!

Now, I realize that there is plenty of literature out there on PCMH savings, but the evidence still seems to be spotty and it’s still unclear from most of the studies whether the savings are net or gross savings, which is important because PCMH requires considerable investment to make the model tick.  And of course there’s the ground-breaking work of the Alternative Quality Contract in Massachusetts, which showed small but significant savings compared with controls, but again, the savings clearly are gross savings, not net of investments by Blue Cross and/or the participating providers.

The intriguing point about the WSJ reference is that the result suggests net value to both patient and provider.  Concierge practices are supposed to improve the quality of care – that’s why people join them.  In terms of who gains what, the common wisdom is that patients benefit through higher quality of care, and providers benefit from higher income.  In this case, both patients and providers seem to have gotten a financial gain -- the patient saved 12% out-of-pocket, and the provider has a profitable business and so is presumably better off than before.  It also seems that not only did quality of care improve, it improved by a lot because it resulted in lower utilization of some type of medical service.  I don’t think the WSJ reporter even had a clue that such a finding would be somewhat of a blockbuster if confirmed.  

I’ve contacted NC State to get a copy of the study – hopefully they’ll respond.

Monday, April 29, 2013

The double-edged sword of losing our privacy

Today's New York Times had a fascinating pair of articles that nicely, though seemingly without the editors' intending it, shows some of the pros and cons of applying data mining to publicly available private information.

"I was discovered by an algorithm", the lead story in the business section, is about a headhunter start-up company that aggregates information from a variety of public sources to identify high-end programming and development talent.  They use this data to supplement the standard information that an employer would receive (e.g., degrees, schools, awards, work history, etc) and identify high-potential candidates whose talents don't always come through in a typical resume or CV.  The article describes how "big data" techniques allow employers to utilize a richer array of variables to identify and evaluate prospective job candidates, and highlights the case of an individual who received a lucrative programming job but who would otherwise not have even passed a standard recruiting screen due to poor high school performance and lack of a college degree.  Would prospective recruits feel violated by this black-box search and evaluation process conducted without their permission or awareness?  Both the individual and his employer say no.  Score one for lack of privacy being a good thing.

"When your data wanders to places you've never been", buried inside the business section, tells the tale of a woman who gets targeted by pharma direct marketers who have mistakenly identified her as a multiple sclerosis patient based on "big data" searches of publicly available information on the web.  She ends up feeling violated and, worse, too daunted by the complex chain of data brokers and marketing companies behind the error to do anything about it.  Score one for lack of privacy being a bad thing.

It's interesting that neither of these articles really dealt with the obvious flip sides of each situation.  Information gleaned from outside of a traditional recruiting process can be used to discriminate just as easily as it can be used to create new job opportunities.  And my health and demographic information can just as easily lead me to valuable treatments and support communities as it can to subject me to unwanted marketing and possible discrimination.

A common thread in each of these articles is that neither was a case of collection or use of illicitly obtained data (such as SSN, DOB, etc); rather, the data mining leveraged information that was voluntarily provided by the individuals in question, albeit for other purposes.  Though the information was available in the clear on the internet and was not illegally obtained, the individuals probably thought of it as perhaps not private, but at least shielded or too isolated to be useful through random or targeted public searches.  In both cases they were wrong, one pleasantly and the other not so pleasantly.

The "big data" privacy issue is not so much about what a bad actor would do if they could get rare data gems like my SSN or my bank account, it's about the inferential mosaic that could be assembled by good, neutral, and bad actors alike from the many small pebbles of information that I myself have strewn across the web, such as what I say on an affinity user site or a web-based survey or an Amazon review or a Yelp comment (or a public blog).

I'm reminded of the story of an app called "Girls Around Me" that matched location data from Foursquare with profile data from Facebook to pinpoint women in a particular location and automatically stalk their Facebook pages to get pictures, background information, and messaging capability.  Not what either the women or Foursquare or Facebook had intended when they opened up their data and their APIs.
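Mechanically, that kind of mash-up is little more than a join on a shared identifier.  A hypothetical sketch -- the field names are invented, and neither Foursquare's nor Facebook's actual API looked like this -- but it shows how little code the "app" really needs once two datasets share a key:

```python
# Invented, simplified records standing in for two services' public data.
checkins = [
    {"user_id": "u123", "venue": "Cafe Luna", "time": "2012-03-30T20:15"},
]
profiles = {
    "u123": {"name": "Alice P.", "photo": "https://example.com/alice.jpg"},
}

# The entire "app" is a join on the shared identifier: enrich each nearby
# check-in with whatever profile data the other service exposes.
nearby_people = [
    {**checkin, **profiles.get(checkin["user_id"], {})}
    for checkin in checkins
]
```

Neither dataset is alarming on its own; the join is what turns a venue check-in plus a profile page into a stalking tool.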

What's scary is not that there are unintended consequences, it's that there are unintended AND unpredictable consequences.  In health care, Latanya Sweeney has launched an interesting project to show how individual health information routinely and legally diffuses through a broad array of companies and websites.  Patients probably know bits and pieces of it, but probably not the scale and scope of it, as shown below.

This chart is most interesting for what it doesn't show, rather than what it shows:  It doesn't include the patient-generated data behind the NYT articles noted above.  As big data advances in scale and scope, it is the information that we voluntarily share -- like on PatientsLikeMe and CureTogether and SmartPatients -- that will eventually get fed into "big data" black-boxes and used in ways both good and bad that we are unable to foresee right now.