Incognito — April 2025: 23andMe Bankruptcy & Data Privacy
Laura Martisiute
Reading time: 12 minutes

Welcome to the April 2025 issue of Incognito, the monthly newsletter from DeleteMe that keeps you posted on all things privacy and security.
Here’s what we’re talking about this month:
- Your genetic identity. As 23andMe enters bankruptcy, genetic data from 15 million people is up for sale.
- Recommended reads, including “Honda Fined $632K for CCPA Violations, Agrees to Privacy Reforms.”
- Q&A: Does dumpster diving still happen?
For Sale: Your Genetic Data
In March, 23andMe filed for Chapter 11 bankruptcy.
A big privacy question mark this month is: What happens to the DNA-derived genetic data 23andMe received?
The short answer is that genetic data given to 23andMe will be sold along with the company to a new owner (per the company’s privacy policy).
When a customer gives their data to a company like 23andMe (i.e., not a medical services provider), that data belongs to 23andMe. It is a business asset.
And just like in any other business bankruptcy case, assets will be sold to new owners. It just so happens that 23andMe’s core saleable asset is genetic data from over 15 million people.
Right now, we have no idea who will buy all this data.
However, 23andMe says not to worry.
According to 23andMe:
- There will be no changes in how your data is stored, managed, or protected, at least for now.
- Any buyer will need to comply with any applicable laws with respect to your data.
- 23andMe is looking for a buyer who shares its commitment to customer data privacy.
Privacy experts disagree. Specifically, experts in genetic data use warn that:
- DNA information is a “blueprint” of your biological identity, and its sale could have “far-reaching consequences.”
- Whoever buys the company can eventually change the privacy policy, including whom your data is shared with.
What you should do now: If you’ve used 23andMe, delete your account. Doing so will ensure your information won’t be used for future research, and your genetic samples will be discarded (if you previously asked the company to store them).
Someone Else’s DNA, Your Data
Never took a 23andMe test? Doesn’t matter – you’re probably still affected.
Thank your (extended) family.
As Professor Carissa Veliz, author of Privacy is Power, said:
“If you gave your data to 23andMe, you also gave the genetic data of your parents, your siblings, your children, and even distant kin who did not consent to that.”
If someone within your family (including uncles, distant cousins, etc.) did a 23andMe test, they’ve inadvertently exposed some of your genetic data, too.
Future Privacy Risks
It can be nice to know that you’re 46% Scottish/Irish and 15% Vietnamese, but uncovering your ancestry is not where genetic data stops.
The truth is that, right now, we don’t fully understand just how much information you can get about a person from their genetic data.
We do, however, know that:
a) Your genome is likely to reveal a huge amount about you.
b) In the future, genetic data will reveal much more than it does today.
Dr. Ryan Davey summed it up well when he said:
“Your genome contains a LOT more information than we currently understand, and what we currently understand is also a lot more than what 23andMe reports back to their customers in the form [of] ancestry, disease predisposition and physical attribute information. What about 10 years from now? Is it possible that genetic analysis could one day offer far deeper and more personal insights into who we are? It is certain to offer greater insight into disease risk (heart disease, mental health, cancer, etc.) and life-expectancy, but possibly also sexual orientation, behavioral tendencies (such as risk tolerance), spirituality, IQ, and more. This makes it a uniquely powerful source of information about you – information that is currently (and legally) used for medical, insurance and law enforcement purposes in the United States and elsewhere. What future applications will arise?”
When you share your genetic data with 23andMe, you don’t quite know who might end up owning it or what they might do with it. Genetic data is also currently under-protected by regulations, especially at the federal level.
There’s also the risk of data breaches, which even the most sophisticated companies struggle with.
Once your genetic data is in the hands of a third party, there’s no way to get it back or make it private again.
23andMe and similar companies say that your genetic data is anonymized. But a growing body of research shows that supposedly “anonymized” genetic data can be re-identified if combined with other data.
Good to remember: “Even if separated from obvious identifiers like name, it [genetic data] is still forever linked to only one person in the world.” – The Electronic Frontier Foundation.
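To make the re-identification risk concrete, here is a minimal, entirely hypothetical sketch of a "linkage attack": records stripped of names can often be matched back to individuals by joining shared quasi-identifiers (ZIP code, birth year, sex) against some other dataset that does include names. All names, records, and markers below are made up for illustration.

```python
# Hypothetical linkage attack: re-identifying "anonymized" records by
# joining quasi-identifiers against a second, named dataset.
# All data below is invented for illustration.

anonymized = [  # names removed, genetic marker retained
    {"zip": "02139", "birth_year": 1984, "sex": "F", "marker": "rs4988235"},
    {"zip": "60601", "birth_year": 1972, "sex": "M", "marker": "rs53576"},
]

public_registry = [  # e.g., a voter roll or leaked customer list
    {"name": "A. Example", "zip": "02139", "birth_year": 1984, "sex": "F"},
    {"name": "B. Example", "zip": "60601", "birth_year": 1972, "sex": "M"},
    {"name": "C. Example", "zip": "60601", "birth_year": 1990, "sex": "F"},
]

def reidentify(anon_rows, registry):
    """Link 'anonymous' rows back to names via quasi-identifiers."""
    matches = []
    for row in anon_rows:
        # Find registry entries sharing the same ZIP, birth year, and sex.
        candidates = [
            p["name"] for p in registry
            if (p["zip"], p["birth_year"], p["sex"])
               == (row["zip"], row["birth_year"], row["sex"])
        ]
        if len(candidates) == 1:  # a unique match means re-identification
            matches.append((candidates[0], row["marker"]))
    return matches

# Both "anonymized" rows match exactly one named person each.
print(reidentify(anonymized, public_registry))
```

In this toy example, three attributes are enough to single out each person, which is why researchers caution that removing names alone does not make genetic data anonymous.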
What Does the Law Say About Genetic Data?
Federally speaking – not a lot.
The Health Insurance Portability and Accountability Act (HIPAA) and the Genetic Information Nondiscrimination Act of 2008 (GINA) are the two laws that deal with genetic information privacy.
However, HIPAA only applies to the results of genetic tests administered by your healthcare provider. It does not extend to direct-to-consumer (DTC) genetic testing companies like 23andMe.
And while GINA protects individuals from being discriminated against by health insurance companies and employers, it doesn’t cover other third parties or other kinds of insurance companies.
Fortunately, there is better protection in a growing number of states. California has had a genetic data protection law since 2021, and more states, including Montana, Tennessee, Texas, and Virginia, have since passed similar laws.
A Wake-Up Call for Personal Data Sharing
Certain risks to genetic data privacy – like an extended family member doing a DNA test – are out of your control.
But you can still take steps to protect your identity.
We give the same advice for protecting genetic-related data as we do for all data:
- Be careful what data you share, and with whom.
Genetic data is dangerous because it is totally unique to you: you cannot change your genetic signature if it is exposed. But genetic data is not the only information with these properties.
Our team strongly advises considering other health data in the same light as genetic data.
Information like your mental health status, voice tone, and heart rate patterns is also very sensitive. Yet we often share it with a range of services like:
- Fitness apps and wearables.
- Therapy apps.
- Mental well-being apps.
- AI chatbots.
- Smart home devices (e.g., voice assistants, connected home devices).
Fitness apps, in particular, often collect extraordinarily detailed health and location data, whereas apps for therapy, meditation, or mental well-being deal with some of our most intimate thoughts and feelings.
What’s worse, these apps frequently share your data with third parties.
23andMe is a reminder that when unalterable data passes from you to a third party, that third party is not necessarily its final destination.
Even if that data pertains to your health or genetics, control over it is no longer in your hands.
Tip: Not sure if you should share your data with a specific product/service? Check out *Privacy Not Included, a buyer’s guide created by the Mozilla Foundation, a non-profit organization behind the Firefox browser. The guide evaluates how products ranging from smart home devices to dating apps collect, use, and share your personal data.
Readers’ Tips
Once in a while, we share tips from our readers.
Remember the awesome privacy guide one of our readers shared last September?
An expanded and updated version is now available: DISENGAGE: Opting Out—and Finding New Options—to Reclaim Your Life from Spammers, Scammers, Intrusive Marketers and Big Tech.
Here’s what’s new:
- New sections or information on corporate news, surveillance pricing, and AI face recognition.
- Information relevant to the new US administration in 2025.
- Edits and additions based on what the author learned using many of the practices since they first wrote the original version two years ago.
- Fun, “Dummies” style boxes for advanced techniques, warnings, and “try it now” simple tasks.
The guide’s author says: “As always, the cost is $0, you don’t have to enter your email and you won’t be tracked. There are no affiliate links. There is absolutely no catch. This was a passion project!”
Check it out. It’s truly an incredible privacy resource.
We’d Love to Hear Your Privacy Stories, Advice and Requests
Do you have any privacy-related dating app experiences you’d like to share? Have you ever been targeted with a romance scam? And how did you spot it for what it was?
Also, do you have any privacy stories you’d like to share or ideas on what you’d like to see in Incognito going forward?
Don’t keep them private!
We’d really love to hear from you this year. Drop me a line at laura.martisiute@joindeleteme.com.
I’m also keen to hear any feedback you have about this newsletter.
Recommended Reads
Our recent favorites to keep you up to date in today’s digital privacy landscape.
Honda Fined $632K for CCPA Violations, Agrees to Privacy Reforms

The automotive manufacturer Honda agreed to pay a $632,500 fine and modify its business practices to settle allegations by the California Privacy Protection Agency (CPPA) that it violated the CCPA, the state’s privacy law. The agency said Honda required consumers to provide excessive personal information to exercise their privacy rights and used an online tool that limited consumer choices.
Mozilla Calls On Tech Giants to Block Surveillance Firm’s Data Scraping

Mozilla is urging tech platforms to block surveillance firm ShadowDragon from scraping user data across 200+ websites, including Reddit, Tinder, Etsy, and Duolingo, for US government surveillance. ShadowDragon’s tool, SocialNet, compiles public data into detailed user profiles, typically without users’ knowledge.
Utah Passes Nation’s First App Store Age Verification Law

Utah passed the “App Store Accountability Act,” the first US law requiring Apple and Google to verify user ages at the app store level instead of at the individual app level. Apple has introduced new age assurance features but opposes collecting sensitive ID data for privacy reasons. The law takes effect May 7, though legal challenges may delay it.
Musk’s xAI Acquires X, Sparks Privacy Concerns

Elon Musk’s AI firm xAI acquired X (formerly Twitter), merging the two companies. X user posts, messages, and images may now directly feed tools like Grok. While some opt-out settings exist, past user data is already in use, with experts urging users to limit sensitive data sharing or leave the platform entirely.
You Asked, We Answered
Here are some of the questions our readers asked us last month.
Q: Been hearing a lot about deepfake calls. Are there any “best” ways to protect against them?
A: Timely question!
In one survey, 1 in 3 US respondents said they received a deepfake call in the past year, and about half of those were successfully scammed.
Many deepfake calls pretend to be from financial institutions, delivery services, accommodation providers, and similar organizations.
Here, the “old” advice still stands: never share personal information (account numbers, passwords, etc.), and hang up and call the organization back using a phone number from an official source (e.g., its website).
However, an increasing number of deepfake calls impersonate family members and colleagues.
According to the Electronic Frontier Foundation (EFF), one of the best things to do here (apart from hanging up and calling the person directly) is to create a “password” to confirm you’re talking to who you think you’re talking to.
The EFF recommends choosing a password that is easy to remember and is not public information.
Q: A common privacy tip I used to see (but don’t really anymore) was to shred documents. Curious, does dumpster diving still happen?
A: Great question!
Dumpster diving is very much alive and well.
For example, in 2023, an Illinois woman was ordered to pay $15,000 for committing identity theft using other people’s information she found while dumpster diving.
And in 2019, a man was caught stealing from mailboxes and trash bins.
On internet forums dedicated to dumpster diving, people likewise report finding tons of sensitive data.
One person said: “I found someone’s original social security card, an original birth certificate and a copy, plus this person’s W-2s for several years, and 2 credit cards. […] From what I could tell from online sources, it looked as if the parents were divorcing and the dad, who lives in my complex, had pitched all of his adult kid’s stuff in the dumpster for some reason. (We also found lots of jewelry, toiletries, clothes, and other less sensitive papers.)”
The moral of the story is that it’s still good practice to shred any sensitive documents you throw out.
Back to You
We’d love to hear your thoughts about all things data privacy.
Get in touch with us. We love getting emails from our readers (or tweet us @DeleteMe).
Don’t forget to share! If you know someone who might enjoy learning more about data privacy, feel free to forward them this newsletter. If you’d like to subscribe to the newsletter, use this link.
Let us know. Are there any specific data privacy topics you’d like us to explore in the upcoming issues of Incognito?
That’s it for this issue of Incognito. Stay safe, and we’ll see you in your inbox next month.

Don’t have the time?
DeleteMe is our premium privacy service that removes you from more than 750 data brokers like Whitepages, Spokeo, BeenVerified, plus many more.
Save 10% on DeleteMe when you use the code BLOG10.