The private sector is taking steps to protect online health privacy, but critics say it can’t be trusted

Most people have at least a vague sense that someone, somewhere, is doing something with the data footprints created by their online activities: Maybe their use of an app allows that company to build a profile of their habits, or maybe they keep getting followed around by creepy ads.

It’s more than just a feeling. Many companies in the health tech sector – which provides services ranging from mental health counseling to mail-order attention-deficit/hyperactivity disorder pills – have strikingly leaky privacy practices.

A guide released this month by the Mozilla Foundation found that 26 of 32 mental health apps had lax protections. The foundation’s analysts documented numerous weaknesses in their privacy practices.

Jane Caltrider, the leader of the Mozilla project, said the privacy policies of the apps she used to practice drumming were scarcely different from the policies of the mental health apps the foundation reviewed – despite the far greater sensitivity of what the latter record.

“I don’t care if someone knows I practice drums twice a week, but I do care if someone knows I see a therapist twice a week,” she said. “This personal information is another pot of gold for them, for their investors.”

The stakes have become increasingly urgent in the public’s mind. Apps used by women, such as period trackers and other types of fertility-management technology, are now the focus of concern over the potential overturning of Roe v. Wade. Spurred on by social media, users are urging one another to delete data stored by those apps – a right not always granted to users of health apps – for fear that the information could be used against them.

“I think these big data outfits are looking at a day of reckoning,” said Sen. Ron Wyden (D-Ore.). “They have to decide – will they protect the privacy of the women who do business with them? Or are they basically going to sell to the highest bidder?”

Countering those fears is a movement to better control the use of health information through laws and regulation. While doctors, hospitals, and other health care providers must adhere to the privacy protections of the Health Insurance Portability and Accountability Act, or HIPAA, users in the growing health app sector have only a skimpy shield.

While some privacy advocates hope the federal government might step in after years of work, time is running out for a congressional solution as the midterm elections in November approach.

Enter the private sector. This year, a group of nonprofits and corporations released a proposal calling for a self-regulatory project to protect patients’ data when it is outside the health care system, an approach that critics compare to asking the fox to guard the henhouse.

The initiative, two years in the making, was convened by two groups: the Center for Democracy and Technology and Executives for Health Innovation. Ultimately, such an effort would be administered by the BBB National Program, a nonprofit once affiliated with the Better Business Bureau.

Participating companies might hold a variety of data, from genomic to other information, and work with apps, wearables, or other products. Those companies would agree to audits, spot checks, and other compliance activities in exchange for a sort of certification, or seal of approval. That activity, the drafters maintain, would help patch up the privacy leaks in the current system.

“It’s a real mixed bag – for the average person, for health privacy,” acknowledged Andy Crawford, senior counsel for privacy and data at the Center for Democracy and Technology. “HIPAA has decent privacy protections,” he said. The rest of the ecosystem, however, has gaps.

Still, there is considerable skepticism that the private sector proposal will create an effective system for overseeing health information. Many potential participants – including some of the most powerful companies and constituents of the initiative, such as Apple, Google, and 23andMe – dropped out during the gestation process. (A 23andMe spokesperson cited “bandwidth issues” and mentioned the company’s involvement in publishing genetic privacy principles. The other two companies did not respond to requests for comment.)

Other dropouts felt the project’s ambitions were skewed toward corporate interests. But that view was not necessarily universal – one participant, Laura Hoffman, formerly of the American Medical Association, said the for-profit companies were frustrated by “constraints on profitable business practices that exploit both individuals and communities.”

Broadly, self-regulatory schemes act as a combination of carrot and stick. Membership in the self-regulatory framework “can be a marketing advantage, a competitive advantage,” said Mary Engel, executive vice president of the BBB National Program. Consumers may choose to use apps or products that promise to protect patient privacy.

But if those corporations go astray – touting their privacy practices without truly protecting users – they can be rapped by the Federal Trade Commission. The agency can go after companies that don’t live up to their promises under its authority to police unfair or deceptive trade practices.

But there are a few key problems, said Lucia Savage, a privacy expert at Omada Health, a startup offering digital care for prediabetes and other chronic conditions. Savage previously served as chief privacy officer for the Office of the National Coordinator for Health Information Technology at the U.S. Department of Health and Human Services. “There is no requirement that one self-regulate,” she said. Companies might opt not to join. And consumers might not know to look for a certification of good practices.

“Companies are not going to self-regulate. They just aren’t. It’s up to policymakers,” said Mozilla’s Caltrider. She cited her own experience – emailing the privacy contacts listed by companies in their policies, only to be met with silence, even after three or four emails. One company later claimed that the person responsible for monitoring the email address had left and had not yet been replaced. “I think that’s telling,” she said.

Then there’s enforcement: The FTC’s purview covers businesses, not nonprofits, Savage said. And nonprofits can behave just as badly as any rapacious robber baron. This year, a suicide hotline was embroiled in scandal when Politico reported that it had shared with an artificial intelligence company the online text conversations between users contemplating self-harm and its AI-driven chat service. FTC action can also come late, and Savage wonders whether consumers are truly any better off afterward.

There are also holes in the proposed self-regulatory framework. Some key terms – such as “health information” – are not fully defined.

It’s easy to say some data – like genomic data – is health data. It’s thornier for other kinds of information. Researchers are repurposing seemingly ordinary data – such as the tone of one’s voice – as an indicator of one’s health. So setting the right definition is likely to be a tricky task for any regulator.

For now, the discussion – whether in the private sector or in government – is just that. Some express optimism that Congress might enact comprehensive privacy legislation. “Americans want a national privacy law,” Kent Walker, Google’s chief legal officer, said at a recent event hosted by the R Street Institute, a pro-free-market think tank. “We’ve got Congress very close to passing something.”

That might be a tonic for critics of self-regulation – depending on the details. But several specifics, such as who should enforce the potential law’s provisions, remain unresolved.

The self-regulatory initiative is seeking startup funding, potentially from philanthropies, beyond whatever dues or fees would sustain it. Still, Engel of the BBB National Program said the effort is urgent: “No one knows when legislation will pass. We can’t wait for that. There’s so much of this data being collected and not being protected.”

KHN reporter Victoria Knight contributed to this article.
