In 2018, the tech industry found itself in the spotlight amid a scandal involving a company called Cambridge Analytica, which had collected and used the data of millions of Facebook users, apparently without their consent. It caused a public outcry, congressional hearings, a $5 billion fine, and permanently changed the narrative about how social media companies use data.

Following the Supreme Court decision overturning Roe v. Wade, health data privacy may be having its own Cambridge Analytica moment.

With viral calls to delete period-tracking apps and fears that health records could be used to prosecute people seeking abortions in states where it is now illegal, several experts have invoked Cambridge Analytica as a benchmark for the scrutiny that health data practices might rightly draw. They expressed hope that health care companies would review their practices, but also said lasting change will require federal policy changes and the teeth to enforce them.

“We’ve seen Facebook do better in terms of privacy policies and in terms of clarity and in terms of individuals being able to dictate what data is shared and what isn’t,” said Megan Ranney, a professor at Brown University and director of the Brown-Lifespan Center for Digital Health. “And I think maybe it’s time for us in healthcare to do that kind of math. I would say it’s overdue, but maybe it’s time.”

The 26-year-old Health Insurance Portability and Accountability Act, or HIPAA, generally governs how healthcare organizations can and cannot use patient data, and how they must secure it. But experts say HIPAA has glaring weaknesses in the modern world of medical data. Kristen Rosati, a lawyer and former president of the American Health Law Association, noted that the law does not protect sensitive data from disclosure to law enforcement, nor does it cover the countless health devices, apps, and online research efforts through which people unwittingly share intimate health information that can be shared, sold, or aggregated.

“It’s a really chaotic environment,” said Lisa Bari, CEO of Civitas Networks for Health. “And the lack of a national data privacy law hurts everything. It hurts people’s health. It hurts people’s privacy.”

Consumer data is currently regulated by a hodgepodge of state laws as well as the Federal Trade Commission, which has rules prohibiting misleading claims. A bipartisan bill currently moving through Congress would help establish consumer consent rules for data collected by health apps. The Department of Health and Human Services’ Office for Civil Rights also oversees the use of HIPAA-covered health data and has previously issued guidance on when reproductive health data may be disclosed to law enforcement.

The Wild West of healthcare data has produced many smaller-scale scandals over the years: Google’s efforts to grow its healthcare business with millions of Ascension patient records drew criticism. Flo, a period-tracking app now back in the spotlight, last year settled Federal Trade Commission allegations that it misled users about their data privacy. And just weeks ago, an investigation found that major hospital systems were sending health information to Facebook, which experts say may violate HIPAA.

But never before has health data privacy been tied so tightly, or so publicly, to an event of such scope.

It’s critical that companies that collect data or manufacture health care products carefully consider the potential consequences, said Christine Lemke, co-CEO of Evidation Health, which helps companies conduct consumer health research. She said there are analogs in healthcare to the way Facebook’s algorithms prioritized inflammatory content that may have contained misinformation during the last national election.

“This will lead to unintended secondary consequences: someone using data from a pregnancy app to put a woman in jail,” she said. Today, the issue is reproductive health, but in the future, information that leaks from health apps or services could be used in other discriminatory ways. “We should be mindful of these things.”

Andrea Downing, president and co-founder of the Light Collective, put it more bluntly. She said companies collecting data should immediately ask, “How can this be used as a weapon against someone? How can it be used in the hands of the most evil person who does the worst things with it? … And then when we design something, whether it’s a study or a technology, we need to be in a place where we protect ourselves from that damage from the start.”

The fact that companies don’t think more often about how their data or products could be misused may be attributable to naivety, negligence, or commercial motives. For some companies in the space, the bulk of their business comes from selling data to pharmaceutical or adtech companies for marketing purposes.

Absent a change in federal policy, experts said organizations could take immediate steps to improve privacy practices. For some, the bare minimum would be ensuring that the developers they work with do not unwittingly send private health information to third parties.

Even companies that aren’t covered by HIPAA can voluntarily submit to its rules, Ranney suggested. Rosati said consumer apps could give users a way to immediately delete their data, an option already required by some state-level privacy laws and likely to be included in any nationwide law. Apps that send data to third parties could carefully consider whether doing so is necessary and scrutinize any partners they share data with. Some services could keep all data on users’ devices or protect it with user-controlled end-to-end encryption. Some sources suggested that voluntary privacy certifications from a trusted nonprofit, or Food and Drug Administration guidelines, could help encourage better behavior.

Michelle Dennedy, who has worked on privacy at technology companies for decades and is now developing PrivacyCode, a startup that helps companies manage these issues, said that in light of the legal liability data creates, healthcare organizations should “proactively invest in a few projects” to investigate what data they collect, how they use it, and what is really needed for compliance and for achieving positive health outcomes.

There’s already evidence that the gears of voluntary change may be turning: Flo announced it would offer an “anonymous mode” designed to protect user data.

But while public pressure may prompt some companies to act, privacy experts said many will simply choose the path of least resistance, which means flouting privacy until they can’t.

“Whenever it comes to their bottom line, they don’t do the right thing,” Downing said. “We don’t need regulation for good players. We need regulation when no one is watching.”

Even with a national privacy law in the works, experts say there is still a long way to go. Enforcement of existing laws remains light, and HIPAA still needs substantial reforms to reflect the advent of big data, information sharing, and research that the lawmakers who drafted it simply did not foresee.

Lucia Savage, head of privacy and regulatory affairs at Omada Health, said that although consumer interest in privacy has swelled since the Cambridge Analytica scandal, such discussions are generally relegated to Beltway arguments over privacy. If more people start raising their fears directly with their elected officials, however, it could drive lasting change.

“Members of Congress love nothing better than a constituent who tells them a personal story,” she said. “It’s literally the most precious thing that can happen.”