Could sharing health data with Big Tech be a good thing?
Nov 15, 2019

Deven McGraw, a former HIPAA enforcer, explains how Google might make some analytics work better.

The world learned this week that Google is amassing health data from millions of Americans via a contract with the huge health care system Ascension. With trust in tech companies continuing to ebb, concerns about the “Project Nightingale” contract seemed inevitable.

While this data sharing may be completely legal under the health privacy law known as HIPAA, lawmakers have opened an inquiry. But maybe this data gathering isn’t something we have to feel too freaked out about. There could even be some advantages.

To understand how that may be true, I spoke with Deven McGraw, an attorney who was formerly a HIPAA enforcer at the U.S. Department of Health and Human Services. The following is an edited transcript of our conversation.

Deven McGraw: Let’s say they do some predictive analysis of the data, and they find that of all of the people with diabetes being treated in the Ascension health care system, only 50% are receiving all of the preventive exams they’re supposed to be getting. If Google can quickly identify those patients for Ascension and let Ascension intervene to get them in for their preventive exams, it improves outcomes for those diabetic patients and makes sure they don’t fall through the cracks. That’s a pretty simple example, and it’s the kind of analytics work that lots of companies in the health care space do already. The power of Google and the amount of data it could potentially bring to the table might actually make those analytics work better and faster.
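McGraw’s example is, at heart, a cohort query: find the diabetic patients whose preventive care is overdue and hand that list back to the health system. Here is a minimal sketch of that logic in Python; the table, the column names and the single eye-exam measure are all invented for illustration and say nothing about how Ascension or Google actually implement this.

```python
import pandas as pd

# Hypothetical patient data; the columns and the single "eye exam"
# measure are invented for illustration only.
patients = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "has_diabetes": [True, True, True, False],
    "last_eye_exam": pd.to_datetime(
        ["2018-01-10", "2019-09-02", None, "2019-05-20"]
    ),
})

# Flag diabetic patients with no eye exam in the past 12 months
# so the health system can reach out and schedule one.
cutoff = pd.Timestamp("2019-11-15") - pd.DateOffset(months=12)
overdue = patients[
    patients["has_diabetes"]
    & (patients["last_eye_exam"].isna() | (patients["last_eye_exam"] < cutoff))
]
print(overdue["patient_id"].tolist())  # patients to contact: [1, 3]
```

The analytics McGraw describes differ mainly in scale: the same kind of flagging run across millions of records and many care guidelines at once.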

Jack Stewart: And there are laws in place. You’ve worked on the enforcement of some of them to protect our health care records and the privacy around them. Are they enough in these situations to protect us?

McGraw: It certainly is comforting to know that at least for this particular project, Google is going to be regulated by HIPAA — Google and any other companies that are stepping into the health care space in this way and gathering identifiable data. But that may not be enough, frankly, to enable people to trust the company. One big potential loophole is the ability of a contractor in health care, which is technically called a business associate, to be able to take the identifiable health information they’re collecting when they’re providing services for a health care institution and then to deidentify it. There are standards in the law about how to deidentify data, but they don’t require that data to be anonymized to a degree of zero risk of reidentification. There’s still some reidentification risk there, and when that data is in the hands of a large tech company, like Google, that has so much other data that it could use to reidentify patients, I think the reidentification risks may not be well captured by current law.

Once that data is considered to be deidentified under HIPAA standards, whether it could be reidentified or not, the data just doesn’t have those kinds of protections and restrictions that HIPAA would otherwise provide. I think that’s a big concern when you’re talking about a tech company like a Google or a Facebook, or an Amazon or Microsoft, where, again, the amount of data that they’re collecting through their other lines of business makes it that much more plausible that they could reidentify data or successfully link it to a patient and then use that data in their other lines of business. I understand that Google is talking about commitments it has made not to do that, and I think that’s really important, but those are contractual commitments, not necessarily commitments for which a regulator could hold the company legally accountable.

Stewart: Ultimately, privacy is always the thing that comes up, particularly in media reports around these issues. But what’s the real risk? What does it matter if some of our data is shared more widely?

McGraw: Potentially, people could be harmed by this data, like if the data are used to deny them benefits or to treat them differently, either from an economic perspective or a social perspective. I think the greater risk, which is often underplayed, is the risk that people will not go seek health care for sensitive conditions because they’re worried about what’s going to happen with their data. When people have privacy concerns that cause them to not go get health care, that has ripple effects. It doesn’t just impact them individually, but it impacts whether our data sets are accurate about the health of our populations and whether people are getting the care that they need and the services that they need. This is where I think privacy becomes critically important.

Stewart: On balance, do you think that this is a good development in health care analytics or not?

McGraw: I think it’s a positive development. I would call myself cautious, probably a little more cautiously optimistic than some other privacy advocates. I think there’s so much good that can come out of Big Tech coming into this space, but it has to be done in a responsible way. I’m not entirely sure that our regulations are quite up to the task, so I can completely understand why there’s been so much outcry about this.

Related links: More insight from Jack Stewart

The Wall Street Journal was first with the Project Nightingale story and has the details. As it says, “The data involved in the initiative encompasses lab results, doctor diagnoses and hospitalization records, among other categories, and amounts to a complete health history, including patient names and dates of birth.”

And it’s that level of personal data “sharing” that certainly sounds scary. As Bloomberg Law says, “Facebook Inc., Google Inc. and other technology giants face an uphill battle convincing patients their online health products will keep their private information secure and won’t yield targeted ads.”

Deven McGraw talked about the idea of anonymized data — that’s the reassurance we’re often given that any data shared is handed over as a big aggregate data dump, stripped of things like names and addresses. It’s still useful, as big data, to spot trends and patterns. The trouble is, it’s becoming increasingly clear that it’s very easy to reidentify individuals in this data. Researchers in Europe managed to identify 99.98% of people in anonymized data sets using just 15 demographic attributes. And we give those away when we add our ages, genders or marital statuses to our online profiles. Then-FBI director James Comey said in 2017 that “there is no such thing as absolute privacy in America.”
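That 99.98% figure is fundamentally about uniqueness: attach enough attributes to a record and almost no two people share the same combination. A toy sketch, with invented data and only four quasi-identifiers (not the European researchers’ actual method), shows how quickly “anonymous” rows become unique:

```python
import pandas as pd

# Toy "anonymized" table: direct identifiers (name, address) stripped,
# but quasi-identifiers remain. All values are invented.
df = pd.DataFrame({
    "zip3":    ["941", "941", "941", "100", "100"],
    "age":     [34, 34, 51, 34, 29],
    "gender":  ["F", "F", "F", "M", "F"],
    "marital": ["single", "single", "married", "single", "single"],
})

# Count how many rows share each combination of quasi-identifiers.
# A combination that appears exactly once pins down one person for
# anyone who happens to know those few facts about them.
counts = df.groupby(["zip3", "age", "gender", "marital"]).size()
unique_share = (counts == 1).sum() / len(df)
print(f"{unique_share:.0%} of records are unique on just 4 attributes")  # 60%
```

Scale the same idea up to 15 attributes and a realistic population, and you approach the near-total uniqueness the researchers reported.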
