Authors

  1. Wielawski, Irene M.

Abstract

Partnerships between tech companies and health systems challenge privacy expectations and laws.

 

Article Content

Personal health information has become a hot commodity. Researchers, health care organizations, and, most recently, tech giant Google are aggressively looking for ways to amass patient health data for purposes that range from advancing science and improving health system performance to developing new moneymaking products.

  

The impetus is emerging technology, such as artificial intelligence (AI), that has the potential to improve diagnosis and treatment, fix flaws in existing digital systems (such as electronic health records [EHRs]), and help patients manage their health. But the degree to which patient privacy may be sacrificed to these goals has long worried ethicists and government regulators.

 

SHARING PATIENTS' RECORDS

Worry exploded into outcry in November 2019, when the Wall Street Journal exposed a secret deal between Google and the St. Louis-based health system Ascension that gave the tech company access to 10 million patient records without notifying the patients or clinical staff. Code-named "Project Nightingale," the collaboration aims to use AI and machine learning tools to analyze patient data, identify disease patterns, and help clinicians develop treatment plans. Ascension is one of the nation's largest health care systems, with 2,600 hospitals, physician practices, and other facilities in 20 states and Washington, DC.

 

Ascension and Google quickly put out public statements asserting that their arrangement complied with the requirements of the Health Insurance Portability and Accountability Act (HIPAA). Nevertheless, the deal struck a nerve. On Twitter, Senator Richard Blumenthal of Connecticut decried it as "beyond shameful," saying Google had demonstrated a "blatant disregard for privacy." Investigations were quickly launched by the U.S. Department of Health and Human Services' Office for Civil Rights and the U.S. House of Representatives' Committee on Energy and Commerce.

 

GROWING CONCERNS ABOUT PRIVACY

Lawmakers and government regulators have been on edge for some time about tech companies collecting personal information on users of their products, including details about the way they shop, search, and use maps, location services, and chat platforms like Facebook. This information is then sold to advertisers, retailers, and other business partners. Although people may realize their identities and other information are up for grabs in these commercial interactions, the expectation is the opposite in health care: based on long-standing clinical practice, people expect health care providers to keep their information private.

 

This explains the heat behind the reaction to news of the Ascension-Google deal, although the deal isn't unique. In September 2019, the Mayo Clinic in Rochester, Minnesota, announced a similar arrangement with Google but emphasized that Mayo would retain control over access to patient data, which it said would not be used in a way that could identify individual patients.

 

Still, that places a great deal of trust solely in the word of these private companies. Although these arrangements include promises to work only with anonymized or deidentified medical data (removing, for example, names, addresses, and other identifying information before the data are analyzed), there is no industry or government standard to guide this process and no oversight to ensure it's done properly. This is what makes regulators and clinicians, not to mention the public, uneasy. Most people have had enough experience with hacks and other security breaches of digitally stored information to be skeptical of such promises, especially from the tech industry.
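For readers curious about what deidentification can look like in practice, the brief Python sketch below is an illustration only, using hypothetical field names ("name," "mrn," "zip," and so on): direct identifiers are dropped, the record number is replaced with a salted one-way hash so a patient's records can still be linked without revealing identity, and geographic detail is coarsened. It is not a certified or standard method; as noted above, no industry or government standard currently governs this step.

# Illustrative sketch of deidentifying a patient record before analysis.
# Field names ("name", "mrn", "zip", etc.) are hypothetical; real
# deidentification follows a formal approach such as HIPAA's Safe Harbor
# provisions plus expert review, which this sketch does not provide.
import hashlib

DIRECT_IDENTIFIERS = {"name", "address", "dob", "mrn", "phone", "email"}

def deidentify(record: dict, salt: str) -> dict:
    # Drop direct identifiers outright.
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Replace the medical record number with a salted one-way hash so the
    # same patient's records can still be linked without exposing identity.
    cleaned["patient_key"] = hashlib.sha256((salt + record["mrn"]).encode()).hexdigest()
    # Coarsen geographic detail: keep only the first three digits of the ZIP code.
    if "zip" in cleaned:
        cleaned["zip3"] = str(cleaned.pop("zip"))[:3]
    return cleaned

if __name__ == "__main__":
    patient = {"name": "Jane Doe", "address": "12 Elm St", "dob": "1957-03-02",
               "mrn": "A123456", "zip": "55905", "diagnosis": "type 2 diabetes"}
    print(deidentify(patient, salt="example-salt"))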

 

Indeed, since Facebook was implicated in the Cambridge Analytica scandal surrounding the 2016 election, in which raw data from 87 million Facebook users were shared with the Trump campaign, there have been numerous reports of commercial tech companies surreptitiously selling personal information, including medical data. Several health-oriented smartphone apps, including one that helps women track their ovulatory cycles and another for people monitoring their heart rate, secretly sent users' information to Facebook, which in turn sold it to advertisers. An April 2019 study in JAMA Network Open found that 29 of 36 smartphone apps for depression and smoking cessation transmitted users' personal and identifying data to Google and Facebook for advertising and marketing purposes.

 

PARTNERSHIP BENEFITS

Nevertheless, researchers say big data collaborations like Google's with the Mayo Clinic and Ascension are essential for medical science to advance and for health care delivery to improve. And, despite the ongoing scrutiny of the Ascension deal, government health officials largely agree. The National Institutes of Health (NIH) has been working since 2016 to amass a similar pool of biomedical and behavioral health data through its "All of Us" research program. Unlike the Mayo and Ascension deals, however, the NIH program collects health data that participants provide voluntarily.

 

Another argument in favor of such partnerships is the urgent need for greater technological sophistication in health care organizations to manage not only data but also operational tasks such as communications, care delivery, staffing, and finance. On data management alone, the health care industry continues to lag behind banks and retail companies in cybersecurity investment, leaving hospitals and other clinical facilities vulnerable to hacks, data theft, and even manipulation of medical device software. On the functional side, EHRs urgently need overhauling to address the data entry burdens and inefficiencies that clinicians have long complained about.

 

PREVENTING HARM

The question is how tech and health system collaborations can proceed without harming patients. Although Google and other tech companies (Apple and Microsoft are pursuing similar collaborations) have leaned heavily on claims of HIPAA compliance in their public statements, policymakers widely view HIPAA as inadequate for such ventures. Google didn't even exist when Congress wrote the law in 1996, nor is HIPAA expressly aimed at protecting patient privacy. The HIPAA Privacy Rule came into play only in 2003, in response to concerns about the exposure of patients' medical details through electronic billing.

 

And, though HIPAA carries strong penalties for unauthorized disclosure of personal health information, the law is replete with exceptions that permit health care organizations to use medical record details without patients' knowledge or permission. These include billing and insurance transactions, administrative and oversight activities, research, cooperation with law enforcement and criminal investigations, and the broad catchall category "health care operations." HIPAA has come to be seen by the public and many clinicians as a guarantee of privacy outside these exceptions, but its statutory language is milder, requiring health care organizations only to provide "reasonable safeguards."

 

Lawmakers are aware of the disconnect between public expectations and statutory protections. According to the National Conference of State Legislatures, bills or draft proposals to regulate the "privacy practices of commercial entities" have been introduced in 25 states and Puerto Rico. The proposals target a wide range of issues regarding the collection, storage, and reuse of consumer data, including biometric information of the sort contained in people's medical records. At the federal level, a bipartisan push in Congress in 2019 to pass a national data privacy law lost steam amid debate over how to enforce it and whether states should be free to pass their own rules. Versions are pending in House and Senate committees.

 

Besides public clamor, two developments are spurring the federal effort. One was California's passage in 2018 of comprehensive privacy legislation. The California Consumer Privacy Act, which took effect on January 1, 2020, ensures that consumers can review the personal data companies have collected on them, request that the information be deleted, and bar companies from selling it. The impact of this legislation is being closely monitored by federal and state regulators, as well as by tech industry lobbying groups, which tried unsuccessfully to carve out exemptions to California's law.

 

Also influential is the European Union's sweeping 2018 General Data Protection Regulation, which gives people control over their personal information. Businesses that process this information must ensure its security and take steps to prevent identification of individual subjects in data sets. Subjects may also revoke consent at any time. Several proposals in state legislatures have borrowed language from this regulation.

-Irene M. Wielawski