Why Your Private Health Data and History Aren’t Actually Private

Almost all of us can report those eerie “coincidences”—targeted ads on social media for, say, open-toed wedge sandals when you’d swear you were just daydreaming about a vacation someplace warm. Something in your search history or shopping patterns told the Facebook gods that you’d click on an image of shoes. It’s weird, you may think, but no big deal.

Companies are increasingly discovering, however, that there’s gold to be mined in a topic much closer to home and with more potential to harm you than footwear preferences: your health. Amassing personal medical information has become a huge business in recent years, with medical centers and data brokers selling or sharing everything from your diagnoses to lifestyle habits and activities that could raise your risk for future diseases.

In 2017 alone, medical data accounted for an astonishing 700 billion gigabytes, according to the California market research firm BIS Research. And some health insurers are piecing together pictures of us with information going back at least a decade, a tapestry woven from numerous threads like prescriptions we fill, products we buy, topics we do searches on, info we record on apps, and many others.

By tracking and linking data from a range of digital sources, companies can make shockingly accurate guesses about what worries you or what actually ails you, says David T. Grande, M.D., an associate professor of medicine at the University of Pennsylvania who has researched privacy issues.

This isn’t always bad. The more certain companies learn about us, the better they can create drugs, devices, and products that match our needs or tailor advertising to us so we see stuff we might care about. But there is little regulation in the U.S., so it’s anyone’s guess which businesses are spying on us in ways we wouldn’t knowingly allow—not to mention our not knowing what the heck they’re doing with the facts they uncover.

Data-privacy experts especially fear potential real-world consequences, such as people being prevented from purchasing life or disability insurance or being charged exorbitant rates for it, or the possibility that an employer or a landlord could refuse to hire or rent to someone based on something learned online.

Security is not necessarily the same as privacy.

Putting aside for the moment the very real issue of illegal data breaches of medical practices or pharmacies, your health data may well be quite secure. But secure is not necessarily the same thing as private. We may give consent for our data to be used for various purposes without fully understanding the terms or their implications.

Once consent is given, many health care providers, hospitals, and the like are legally allowed to share and sell that data, says Elaine Kasket, author of All the Ghosts in the Machine. Privacy, she says, is best understood as the right to control what others know: “The problem is, we overestimate how much control we have.”


And the Health Insurance Portability and Accountability Act (HIPAA) doesn’t fully protect privacy. Because of HIPAA, the massive troves of information housed at your doctors’ offices, hospitals, pharmacies, and other health care providers cannot be linked to you by name without your permission. But HIPAA doesn’t keep that data from being shared or sold; it only says your name and other identifying information must be stripped away first.

When enough anonymized bits and pieces are shared, data brokers using powerful computers can join those with publicly available data to knit your identity back together, says Christopher L. Dore, a partner in the Chicago plaintiff law firm Edelson P.C., which files class-action lawsuits against entities that overshare.

Research bears this out: Massachusetts Institute of Technology scientists looked at anonymous data collected from cell phone records and transportation smart cards and found that if someone had access to one month’s worth of that data, more than half of the people whose data it was would be identifiable. Medical data, of course, is particularly sensitive. “People would be really frightened to know how much of their medical data gets passed around in supposedly anonymous form,” Dore says, since it can so easily be reconstructed. “HIPAA is woefully inadequate for today’s world,” he adds.
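The re-identification technique the experts describe can be illustrated with a toy sketch. The data, names, and fields below are all invented for the example; this is only a minimal demonstration of how joining “anonymous” records with a public list on shared traits (quasi-identifiers such as ZIP code, birth date, and sex) can put names back on stripped records.

```python
# Synthetic "anonymized" medical records: names removed, but
# quasi-identifiers (ZIP, birth date, sex) left in place.
anonymized_medical = [
    {"zip": "02139", "birth": "1964-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth": "1970-01-15", "sex": "M", "diagnosis": "diabetes"},
]

# Synthetic publicly available records (think of a voter roll),
# which carry the same traits plus names.
public_roll = [
    {"name": "A. Example", "zip": "02139", "birth": "1964-07-31", "sex": "F"},
    {"name": "B. Sample",  "zip": "02139", "birth": "1970-01-15", "sex": "M"},
]

def reidentify(medical, roll):
    """Link each 'anonymous' medical record to every public record
    that shares the same quasi-identifiers."""
    key = lambda r: (r["zip"], r["birth"], r["sex"])
    names_by_key = {}
    for person in roll:
        names_by_key.setdefault(key(person), []).append(person["name"])
    return [
        {"names": names_by_key.get(key(rec), []), "diagnosis": rec["diagnosis"]}
        for rec in medical
    ]

matches = reidentify(anonymized_medical, public_roll)
# When a combination of traits is unique, each match holds exactly one name,
# and the "anonymous" diagnosis is now tied to a person.
```

With only two records the match is trivial, but the same join scales: the more data points that are linked, the fewer people share any given combination, which is why even a month of location data can single out most individuals.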

Even prestigious medical institutions share data. A lawsuit filed by Dore’s firm in 2019 asserted that the University of Chicago Medical Center shared medical records of potentially hundreds of thousands of patients with Google, which intends to develop an artificial intelligence business. As a medical provider subject to HIPAA, the university removed all identifying information. But it did not seek patient approval or give anyone the option to say no, says Dore.

Businesses that have nothing to do with health also share your data.

And HIPAA doesn’t apply to the myriad businesses that have insight into your lifestyle, which is easily linked to your health. The fact that you’re sedentary for hours as you drive to and from work (info gleaned from tolls on your credit card) or the fact that you’ve been loading up on cupcakes instead of carrots at the grocery (tracked through shopper-loyalty cards) could indicate more about your health than even your blood work or your genetic makeup, says Eric Perakslis, Ph.D., currently a Rubenstein Fellow at Duke University and a former chief information officer and chief scientist (informatics) at the U.S. Food and Drug Administration.

Use a period tracker or a smart thermometer? Get your shut-eye on a smart mattress? How about that “smart” hairbrush you got for Christmas? When you click “agree” on the terms of use for the accompanying apps, you may be saying OK to having your data shared with commercial data brokers, who compile the info and sell it.

Devices that don’t seem at all medical can be used to infer things about your health—if you set a smart thermostat in your home very low, for example, an interested company could deduce that you’re menopausal. “When you drop little breadcrumbs around, they start to pile up, and someone can increasingly create a unique picture of you,” says Quinn Grundy, Ph.D., an assistant professor of nursing at the University of Toronto who has conducted studies on privacy.


This is pretty much as creepy as it sounds. If someone in real life trailed you and took notes on your activities, purchases, and utterances, you would call it stalking. “But in the digital world, that’s common behavior,” a privacy expert told Dr. Grande for a paper he coauthored in JAMA Network Open. Topics you search for online, posts on social media, apps you download, and much more contribute to what Dr. Grande calls your “digital health footprint.”

Assume any device with a camera and microphone is watching and listening and reporting what it learns to commercial brokers who profit by gathering, linking, and selling the information. “The irony is, your doctor can’t talk about your health without your permission, but your credit card company or app manufacturer can,” says Perakslis. And if the data floating around out there is wrong, there’s almost no way to correct it.

Health data is a huge market.

Even companies that don’t sell data may share it widely. In one study, Grundy and her team created phony identities complete with birthdays, health histories, and other info and gave them to 24 health apps, then followed the trail. Fully 79% of the apps shared information, with some of it going to developers (presumably to improve the products) but the highest volume passing to advertising powerhouses including Alphabet (owner of Google), Facebook, and Amazon.

And there are still plenty of old-school ways to spy. For example, the administrator of a private Facebook group for women with the BRCA breast cancer gene realized that everyone who joined had access to the real names and posts of all the other participants. If one of them was, say, a pharmaceutical rep or a data seller, that person could know the cancer status of thousands of women. (She has since moved her group off the platform.)

“People would be really frightened to know how much of their medical data gets passed around.”

Don’t forget illegal actors: Health data is a common target of hacking, with more than 1 million patient records compromised this past summer alone. A single ob/gyn practice last year had half a million patient records accessed. The biggest risk from these breaches seems to be financial; a researcher from Johns Hopkins found that 71% of the more than 1,500 health-data breaches that happened in the past decade yielded enough demographic or financial facts to put people at risk of fraud or identity theft, while only 2% exposed sensitive medical information. But the consequences can be dramatic.

Several years ago, two men in South Carolina were arrested for illegally accessing state prescription records for oxycodone and other restricted drugs, then sharing them with attorneys, potentially handing those lawyers powerful ammunition in a divorce or child custody case. Dore expects health data breaches to become even more widespread, as medical institutions are often lax about security.

How to protect yourself by leaving a smaller digital health footprint

Experts agree that the problem is too big for any individual to solve. “It’s almost impossible to be a modern-day consumer and not leave these footprints behind,” Dr. Grande says. Still, you can somewhat limit the size of that mark. Here’s how.

Be selective about which apps you use.

“If you’re sharing medical information with an app, you’re putting it into hands you don’t really know,” says Dore (and it doesn’t matter whether the app is paid or free, or whether it is from a corporation or a nonprofit).

Perakslis advises asking yourself whether any benefit is worth the risk of having your info out there. If you have cancer in your family or are experiencing a high-risk pregnancy, for instance, a genetic-testing website or a pregnancy-monitoring app might make sense, but Perakslis doesn’t recommend it if you’re just curious.

Read before you click.

Another way to help plug the data dike is to, yes, read the fine print on privacy policies before agreeing to them, even though they’re often hard to decipher, says Chiara Portner, a corporate attorney specializing in data privacy and protection matters at Hopkins & Carley, a law firm in Palo Alto, CA.

One easy thing Portner suggests is looking for the date on the policy; companies that revise their policies annually may take the issue more seriously, she says. And if a site offers you the option to say no to sharing your data, take it.

Be cautious with what you share.

This goes for things like online surveys as well as apps. “If I’m lying in bed feeling depressed and an app asks me to log my symptoms, I’m not going to do that,” says Christine Von Raesfeld, 45, from San Jose, CA.

Von Raesfeld, who herself has had numerous chronic conditions since childhood, has spoken with many women who worry about how widely their diagnoses might be disseminated. This led her to cofound the patient-centric nonprofit People With Empathy, which focuses in part on data-sharing concerns. (She’s right to be vigilant: In 2018, Australian journalists discovered that a popular medical-appointment app was sending data to personal-injury law firms as part of a business-referral program.)

Be aware of “marketing partners.”

Lisa Weiler, 36, received a box of formula samples during her second pregnancy. The Los Angeles blogger hadn’t shared her news beyond close family. She had, however, shared her personal info and due date on a popular pregnancy-tracking app. Had she read its privacy policy, Weiler would have seen that this gave it the right to share the info with retailers. Other pregnant women have been shocked to learn that a discount program they signed up for at a maternity chain also allowed “marketing partners” to know they were expecting.

Ultimately, strengthening protections for consumers will require strong national legislation. “Current privacy regulations put too much responsibility on the consumer, who has little power in terms of transparency around data-sharing practices,” Grundy says. The European Union has much stricter laws than the U.S. (except, now, for California). “But until we have comprehensive regulation that ensures the right to privacy and shifts the responsibility from consumers to those who profit from their data, Big Data is indeed watching.”

This article originally appeared in the March 2021 issue of Prevention.

