Incognito — June 2023: Your Mental Health Data is for Sale
May 25, 2023
Welcome to the June 2023 issue of Incognito, the monthly newsletter from DeleteMe that keeps you posted on all things privacy and security.
Here’s what we’re talking about this month:
Pretty much everyone agrees that mental health data should be private. But not data brokers. Read on to find out how data brokers sell your sensitive mental health data and how they’re getting away with it.
Recommended reads, including “Millions of Android Devices Are Pre-Infected with Malware.”
Q&A: Is there a way for me to reduce my digital footprint after an online job search and sign-ups to numerous company/HR websites and portals?
Imagine if you could buy anonymized information on people who have PTSD or a personality disorder in your zip code. Now imagine if you could get this same data, but this time, it was linked to actual people – their names and all.
Well, you don’t have to imagine. It’s actually happening.
For Sale – Your Mental Health Status
Turns out, there really are no limits to the kind of information data brokers collect and sell about us.
A recent report from Duke University found that data brokers openly market our sensitive mental health information for as little as $0.20 or on a subscription basis and with minimal buyer vetting. Some data brokers even offer “free samples” to convince potential buyers to purchase this data.
Data sold by data brokers can include: mental health conditions (depression, ADHD, etc.), the likelihood of having certain mental health conditions, DNA test data, health plans, connected medical devices, and specific medication uses and prescriptions, like antidepressants.
But also: Some data brokers imply they can pair sensitive mental health data with personal information like names, addresses, incomes, social security numbers, the number of children in the home, and criminal records, among other things.
Where’s This Data Coming From?
Data brokers get your mental health information from many different sources, like the websites you browse and the apps you use.
For example, in 2019, Privacy International published a research study about how mental health websites share users’ data with data brokers, advertisers, and large tech companies, including answers to depression tests. The study looked specifically at Europe, but the same practice likely happens in the US.
Similarly, an investigation into mental health apps by researchers at Mozilla found that many are “letting consumers down in scary ways, tracking and sharing their most intimate information and leaving them incredibly vulnerable.”
Of the apps Mozilla looked at, 59% received Mozilla’s *Privacy Not Included warning label, and 40% had gotten worse since Mozilla’s previous investigation a year earlier.
One app, which offers mental health assessments and online therapy, was found to load 799 trackers during the initial download alone.
Wait…Isn’t That Illegal?!
You might be thinking of the Health Insurance Portability and Accountability Act (HIPAA), a federal law designed to protect health data collected by “covered entities” like healthcare providers and insurance companies.
Unfortunately, most mental health websites and apps fall outside the scope of HIPAA because they don’t qualify as “covered entities” under the law. This means they are free to do more or less whatever they want with users’ data, including selling it to data brokers for a profit. While some mental health apps have been called out for selling user data, we’re unlikely to see major changes in the industry anytime soon.
And, since the data broker industry is also unregulated at the federal level, there’s no telling who has access to your mental health status—or what they’ll do with it. From your credit score to your reputation, having your mental health information out there could affect your life in ways you might not even imagine.
For example, to quote DeleteMe’s Head of Research, John Gilmore, “[People] may not know that their red flag is based on mental health data. Employers may be scoring lower confidence in this person because they’re considering potential mental health risk.”
How to Lock Down Your Mental Health Data
Remove your name from data brokers. Opt out of data broker databases to stop them from selling your sensitive mental health data to third parties. Do this continuously – data brokers relist your information when they find more data on you.
Don’t share your personal information. When signing up for online accounts with websites and apps, use a masked email address and phone number. This will make it harder to link the accounts back to you.
Disable tracking. Any time you download an app, limit how it can track you via your phone’s privacy settings. On a desktop, use anti-tracking software and privacy-friendly browsers.
Our recent favorites to keep you up to date in today’s digital privacy landscape.
Brave Browser Introduces “Forgetful Browsing”
Brave Browser is launching a new feature called “Forgetful Browsing” that tackles first-party tracking (in contrast, most privacy tools focus on third-party tracking). When enabled, this feature will automatically log you out of sites when you close them and stop them from remembering your visits. The setting will be available in Brave version 1.53 on desktops and Brave version 1.54 for Android.
WhatsApp Lets Users Hide Individual Chats
WhatsApp introduced a “Chat Lock” feature that lets users lock a chat and prevent others from being able to see potentially sensitive conversations. When a chat is locked, it’s moved from your inbox and into its own folder that is only accessible with a password or biometrics, like your fingerprint. The chat is also hidden from notifications. WhatsApp said that the feature’s capabilities will be further extended in the near future.
Millions of Android Devices Are Pre-Infected with Malware
Cybercriminals have infected around 8.9 million Android devices with malware, said Trend Micro analysts at Black Hat Asia. The infected devices are mainly budget smartphones but also include entertainment systems, smart TVs, Android TV boxes, and even kids’ smartwatches, and are spread over 180 countries, including the US. The goal of the campaign is to steal information like SMS messages and take over online accounts.
You Asked, We Answered
Here are some of the questions our readers asked us last month.
Q: So technically password managers are safer than just remembering passwords, but I still don’t trust them. Are there any hidden settings or something I can enable to make them safer?
A: Honestly, one of the main reasons password managers aren’t as safe as they could be comes down to how they’re used.
For example, 1 in 4 password manager users reuse their master password for other online accounts. So, if one of those accounts is hacked, their password manager is immediately at risk.
So first thing: create a strong master password. Make it as long as possible, and don’t use any personal information (your date of birth, the city you grew up in, etc.) within it. Consider using a passphrase (a series of phrases or words), and definitely use lowercase and uppercase letters as well as numbers and symbols.
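To make this concrete, here’s a minimal sketch of generating a random passphrase with Python’s built-in secrets module. The word list is a tiny stand-in for illustration only – a real list (like Diceware’s) has thousands of words:

```python
import secrets

# Tiny stand-in word list; a real passphrase list has thousands of words.
WORDS = ["glacier", "mango", "orbit", "velvet", "thunder", "pixel"]

def make_passphrase(n_words: int = 4) -> str:
    """Pick random words, then mix in uppercase, a number, and a symbol."""
    words = [secrets.choice(WORDS) for _ in range(n_words)]
    words[0] = words[0].upper()  # add uppercase letters
    return "-".join(words) + str(secrets.randbelow(100)) + "!"

print(make_passphrase())
```

The key design choice is using secrets (cryptographically secure randomness) rather than the random module, which is predictable and unsuitable for passwords.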
If, even with a strong master password, you’re still worried about “putting all your eggs in one basket,” you can “pepper” your passwords.
Here’s a good guide on how to do that, but what it basically means is adding a password or PIN to all your passwords. This password or PIN doesn’t change between passwords. So, if your “pepper” is “avocado,” you’d add that to the end of all your unique passwords.
The idea is that even if someone somehow got access to your passwords, they still couldn’t log into your accounts because they wouldn’t know your “pepper.”
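As a quick illustration of the idea (all values here are made up), the pepper is just a constant secret you memorize and append yourself – it never goes into the password manager:

```python
# Illustrative "peppering" sketch. The pepper lives only in your head;
# the password manager stores only the unique per-site password.
PEPPER = "avocado"  # memorized, never saved anywhere

def apply_pepper(stored_password: str, pepper: str = PEPPER) -> str:
    """Combine the password-manager entry with the memorized pepper."""
    return stored_password + pepper

# What you actually type into the login form:
print(apply_pepper("x9!Lk#2q"))  # x9!Lk#2qavocado
```

Even if the manager’s vault leaked, the stored entries alone wouldn’t be valid logins.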
Other than that, always enable two-factor authentication.
Q: Is there a way for me to reduce my digital footprint after an online job search and sign-ups to numerous company/HR websites and portals?
A: What a good question.
The answer might be disappointing, though. Unfortunately, there’s no way to completely take back the information you shared, especially since by now it might have been sold on to other third parties.
Still, you can delete any accounts you created but no longer need and make sure your name is removed from data broker databases.
You can also take more precautions next time, like:
Being mindful of the contact details you provide. For example, you could set up an email account specifically for job hunting or use a throwaway email (YourNameHiringCompanyName@mydomain.com) for each site so that if you get spam, you know who’s leaking your information.
Reading the privacy policies of any platforms/companies you sign up to, especially if you’re going to share your resume with them.
Leaving out personal information from your resume, like your exact address.
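If you own a domain with catch-all aliases, the per-site email trick above can be as simple as a naming convention. A hypothetical helper (the names and domain are made up for illustration):

```python
# Hypothetical per-site job-hunt alias builder. Assumes you control a
# domain with catch-all email enabled; names/domain are illustrative.
def job_alias(your_name: str, company: str, domain: str = "mydomain.com") -> str:
    """Build a unique, traceable address for one company's portal."""
    clean = lambda s: "".join(c for c in s.lower() if c.isalnum())
    return f"{clean(your_name)}.{clean(company)}@{domain}"

print(job_alias("Jane Doe", "Acme Corp"))  # janedoe.acmecorp@mydomain.com
```

Because each address names the company it was given to, any spam arriving at it immediately tells you which site shared or leaked your details.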
Back to You
We’d love to hear your thoughts about all things data privacy.
Get in touch with us. We love getting emails from our readers (or tweet us @DeleteMe).
Don’t forget to share! If you know someone who might enjoy learning more about data privacy, feel free to forward them this newsletter. If you’d like to subscribe to the newsletter, use this link.
Let us know. Are there any specific data privacy topics you’d like us to explore in the upcoming issues of Incognito?
That’s it for this issue of Incognito. Stay safe, and we’ll see you in your inbox next month.