This Week on What the Hack: Facial Recognition Software
A not-entirely innocent moment at a concert set the internet on fire. But what if that was just a symptom of a bigger problem? New York Times journalist Kashmir Hill, author of Your Face Belongs to Us, helps us explore the chilling reality of ambient surveillance, and how facial recognition could be the end of privacy as we know it.
Episode 211
Ep. 210: “The Coldplay Couple and the End of Anonymity”
“What the Hack?” is DeleteMe’s true cybercrime podcast hosted by Beau Friedlander
[00:00:00] Beau: It started with a concert. Coldplay.
[00:00:03] CLIP: [Sound of Coldplay music and crowd cheering]
[00:00:09] Beau: A man and a woman appear on the jumbotron. They’re not kissing, but they’re clearly more than friends, and it’s even more clear that they’ve just been caught. You’ve probably seen it. The woman turns away, the man ducks down, and then Coldplay’s front man Chris Martin makes a joke.
[00:00:29] CLIP: [Chris Martin speaking to the crowd]
[00:00:34] Beau: Then the internet does its thing. The couple is identified. The guy is a guy, I’m not gonna say his name because I believe in privacy, and the woman also, her name is known if you feel like Googling it. He’s the CEO. She’s the head of HR, same company. They’re both married, just not to each other. From there, TikTok goes to town. Google Trends spike. LinkedIn pages are scrubbed, spouses are tagged, bookies take bets on when these people are getting divorced, and the Internet’s anonymity problem—specifically, its AI-powered facial recognition problem—got its 15 minutes of fame yet again. I’m Beau Friedlander, and this is What the Hack, the show that asks, in a world where your data is everywhere, how do you stay safe online?
The Rise of Facial Recognition
[00:01:47] CLIP: We’re going to use our cameras and put some of you on the big screen. This is the way we’re going to get to say hello to some of you.
[00:01:59] Beau: Hey everyone. Welcome back. Today I want to talk about something truly fascinating and very concerning. It was the subject of the book, Your Face Belongs to Us: A Secretive Startup’s Quest to End Privacy as We Know It, written by New York Times reporter Kashmir Hill, a friend of the pod. We had Kashmir on to talk about it. Kashmir’s book focused on a startup called Clearview AI. Now picture this: at the time, two years ago, this company had built a facial recognition library consisting of over 30 billion photographs scraped from the internet. Scraping means they just took ’em. Yeah, billions. The killer app part? Selling access to that massive database to hundreds of law enforcement agencies. The tech is pretty wild. It’s helped identify people from super challenging sources like ATM videos and CCTV surveillance tapes. It’s all real. In one story, agents got an ID from a surveillance photo of a man whose reflection was barely visible in a pair of sunglasses and matched him to a photo from a fitness expo online. That same facial recognition technology trained on billions of public images is now functionally available to anyone with a browser. Maybe not with as many images; hundreds of millions, not billions, but tools like PimEyes let users upload a photo and search the web for matches, even across really obscure websites. No warrant, no just cause, just curiosity. Here’s Kashmir Hill.
[00:03:36] Kashmir Hill: The big breakthrough is until now in society, when you were walking around in a big city, I’m not talking about in a small town where a lot of people know who everyone else is, but in a big city you could walk around anonymously. You can go into a store and buy condoms and kind of assume no one’s going to know who you are and what that means for you, as long as you use cash. Otherwise, lots of people tracking it. You know, you can have a sensitive conversation over dinner and pretty much assume that the strangers won’t be able to connect the dots about what you’re talking about. You can go to a protest and assume the police aren’t gonna take a photo of you and be able to track everyone who was there. You can go to a Planned Parenthood and if you walk out and there’s protesters outside, they’re not going to take a picture of your face and know who you are. So what Clearview AI was doing was taking away that ability to be anonymous because they were making faces searchable in the same way that Google made our names searchable, and kind of organizing all of this online information that has been gathered about us over the last two decades, some of which we’ve put out there ourselves, and just making it available with a click of your face.
[00:04:58] Beau: AI-enabled facial recognition is now just a part of daily life, and that’s how TikTok users identified the Coldplay couple.
The Digital Panopticon
[00:05:12] Beau: The philosopher Jeremy Bentham famously designed a prison he called the panopticon. The building was arranged so that a single guard in a central tower could see every inmate, while the inmates could never tell if they were being watched. The idea was that the possibility of being watched was enough to make prisoners police their own behavior. Now, imagine that but on a global scale, and we all take turns being the guard. We’re not just in a panopticon; we’ve all become both the guards and the prisoners. We the public have become the agents of our own ambient surveillance. This has nothing to do with cheating CEOs or problematic HR situations; it’s just a touchpoint. Let’s zoom out.
[00:06:14] CLIP: No need for your ID because your smiling face is all you might need to get through airport security faster this summer. Our consumer investigator Chris Camero shows us how facial recognition is growing at airports coast to coast, including a brand new rollout today at SFO. Starting this summer, some savvy travelers will jet through special new lines at the airport, and they won’t even have to pull out their IDs. Essentially, your face becomes your ID. United Airlines is rolling out new touchless ID at SFO and several other airports. Here’s how it works. If you have TSA pre-check and check in online, you can upload your passport photo in advance. By doing that, you then get to bypass normal bag drop and security lines. Instead, a TSA camera quickly confirms who you are. That camera is comparing your passport photo, and with facial recognition, that’s how you’re getting through security.
[00:07:11] Beau: That’s not a black-and-white case, right? The past few decades have provided many reasons to impinge on the right to privacy when it comes to air travel safety. Okay, so it’s one thing to use Face ID to speed up a security line—it’s another when that same technology is the plaything of idle curiosity. But even as far as law enforcement goes, it doesn’t always work. Not right, anyway.
Bias in the Algorithm
[00:07:39] Beau: In 2020, a protestor in New York was arrested weeks after attending a Black Lives Matter rally. The New York Police Department used Clearview AI on some security camera footage. There was no warrant. Worse, there was no workflow or contingency plan in place if the facial recognition program was wrong. And that actually matters a lot because the protestor was African American.
In the 1950s, Kodak film famously had a problem capturing images of people with darker skin tones. This was due in large part to something called “Shirley cards,” reference cards featuring light-skinned models that companies used to calibrate their film. Today, AI facial recognition systems have their own version of Shirley cards. They’re trained on datasets overwhelmingly composed of lighter-skinned individuals, mainly white men. Now, privacy is non-partisan. If a training bias means that people from certain demographics aren’t as accurately ID’d, which is the case here, the arrest of this guy after the Black Lives Matter rally should have been a duh moment. But as with all things in our move fast and break things economy, the memo that history had written about the issue was ignored. The same year, Detroit Police misidentified Robert Williams, another Black man, using facial recognition. He was arrested at his home in front of his family for a crime he didn’t commit. Here’s Kashmir Hill again.
[00:09:16] Kashmir Hill: So Robert Williams is a suburban dad in Michigan. Lives outside of Detroit and has two young daughters, a lovely wife. One day he’s at work and he gets a call from a police officer saying, “Come turn yourself in. There’s a warrant for your arrest.” And it’s two days before his birthday, so he thought it was a friend pranking him. Robert Williams was arrested for the crime of looking like someone else. The way that these systems work is they’ll rank a bunch of candidates in the order of what they’re most confident is a good match. So Robert Williams was ninth on that list, and the human analyst thought he was the best match. And the police did not do much more investigating beyond that. They showed his photo in a six-pack photo lineup to the security analyst who had watched the surveillance footage and she said, “Yeah, I think it’s that guy.” And then they pulled pawn shop records and Robert Williams, he wears actually expensive watches, Breitlings. And he had sold one once at a pawn shop, so to them they’re like, “Okay. Facial recognition says this. Eyewitness agrees with the computer that he looks a lot like this suspect, and he pawned a watch once and this person stole a watch.” And so he wound up arrested and held in jail overnight, charged, had to hire a lawyer to fight this case. It’s crazy. It can go really wrong if police aren’t pairing this with a real, robust investigation, because these systems do not always get it right.
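Kashmir’s description of a ranked candidate list, where Robert Williams was ninth but a human analyst picked him anyway, maps onto a standard pattern in face search systems: each face is converted into a numeric embedding vector, and the database is ranked by similarity to the probe image. This is a toy sketch of that ranking step, not Clearview’s actual pipeline; the names and tiny three-dimensional vectors are invented for illustration (real embeddings have hundreds of dimensions).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_candidates(probe, gallery):
    """Rank gallery identities by similarity to the probe embedding.

    Returns (name, score) pairs sorted most-similar first. Systems like the
    one described above return exactly this kind of ranked list; a human
    analyst then decides which match, if any, to pursue -- which is where
    things went wrong in the Williams case.
    """
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical 3-dimensional "embeddings" for illustration only.
gallery = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.3],
    "person_c": [0.2, 0.2, 0.9],
}
probe = [0.85, 0.15, 0.25]  # face pulled from surveillance footage

ranking = rank_candidates(probe, gallery)
print(ranking[0][0])  # top-ranked candidate: person_a
```

The key point the sketch makes concrete: the system never says “this is the person.” It outputs a similarity ordering, and every score is just a number; treating the top (or ninth) entry as an identification is a human policy choice, not a property of the math.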
[00:10:46] Beau: Now, these are not just a couple of outlier tales from history. These aren’t edge cases. According to the ACLU, facial recognition systems misidentify people of color at disproportionately higher rates. And yet the tech is quietly being deployed in airports, malls, concert venues, even churches. Again, there ought to be a law. But there isn’t, so we’re going to get into that in a second. Private companies don’t have any obligation to tell you if your image is being used, and you don’t have the right to opt out in most states, although we’re going to talk about that too. And as Kashmir Hill said to us when we interviewed her, your face doesn’t belong to you. It’s scraped, indexed, stored, and made searchable. And once that happens, it’s not entirely yours.
[00:11:37] Kashmir Hill: For so long, the researchers in this field thought that facial recognition was just a uniquely human intelligence and that a computer would never be able to do this, to recognize that you’re the same person when you’re smiling and when you’re frowning or when you’re wearing a hat or sunglasses. And I was talking to the people who made the breakthroughs that put us on the path to Clearview AI, and I said, “Did you ever think to the future? When this works really well, are you worried about it being used by authoritarian governments? Or were you thinking about race and bias when you were only using photos of white men to test and train your technology?” And they said, “No.” It was hard to imagine how it might be when it got good, so it was this chain of people who were saying, “Someone else is going to think about the downsides of this. But first we just want to make it happen. We want to accomplish this. How far can we take this? What are computers capable of?”
Normalization of Surveillance
[00:12:40] Beau: John Oliver, host of Last Week Tonight, had a pretty good take on it. Here he is.
[00:12:46] Last Week Tonight: If you want a sense of just how terrifying this technology could be if it becomes part of everyday life, just watch as a Russian TV presenter demonstrates an app called FindFace.
[00:12:57] CLIP: If you find yourself in a cafe with an attractive girl and you don’t have the guts to approach her, no problem. All you need is a smartphone and the application FindFace. Find new friends. Take a picture and wait for the result. Now you’re already looking at her profile page.
[00:13:18] Last Week Tonight: Burn it all down. Burn everything down. I realize that this is a sentence that no one involved in creating that app ever once thought, but just imagine that from a woman’s perspective. You’re going about your day, when suddenly you get a random message from a guy you don’t know that says, “Hello, I saw you in a cafe earlier and used the FindFace app to learn your name and contact information. I’ll pick you up from your place at eight. Don’t worry, I already know where you live.”
[00:14:00] Beau: Even if Clearview AI and off-the-rack versions like PimEyes, Amazon Rekognition, and FaceFirst didn’t exist, the Coldplay clip could have still gone viral, because we’ve become part of the algorithm. Part of the machine. We tag, we search, we cross-reference. We know where to find wedding registries, Instagram handles, and Zillow listings. We’re not just observers anymore; we’re spies. Because of that, we’re part of the problem. Now, what does the problem look like? I mean, take your pick. Dozens of media outlets requesting to repost the original TikTok? That was news. There are real problems in the world. Corporate brands hijacking the moment for memes, and now, in my case, a podcast episode. Actual betting markets, right? Gross. Chris Martin has been doing the same bit spotlighting random fans for years. It’s not his fault. The woman who posted the clip, Grace Springer, almost didn’t, as she told a UK news program called This Morning.
[00:15:13] This Morning: [unintelligible] where it’s 5 AM. Thanks for getting up, Grace.
[00:15:17] Grace: Of course, guys. How are you?
[00:15:19] This Morning: Nice to see you. So Grace, just talk us through the moment. You were filming because you wanted to get caught on camera, right?
[00:15:25] Grace: Exactly. I was hoping to see myself on the big screen, and I love to capture moments, so that’s why my phone was out in the first place.
[00:15:33] This Morning: All right, so you were filming the camera, hoping it was going to come on you.
[00:15:38] Grace: I was being hopeful.
[00:15:40] This Morning: And then obviously you caught that moment. You saw it, you saw this lovely couple, and if you think about it, if they had just relaxed and chilled out, would you have ever even uploaded it? I don’t think you would have, but it was that reaction. And at what point did you think, “You know what, I’m going to post that on social media”?
[00:16:01] Grace: So, in the moment when I filmed it, I didn’t think much of it. But of course, everyone was kind of chattering. There was over 50,000 people at the concert, so it was a hot topic. But it wasn’t until after the concert where I was debriefing the moment with my friends and I said, “Let’s review the footage. Let’s see if it really looks that bad, and I think it does.”
[00:16:26] CLIP: Either they’re having an affair or they’re just very shy.
[00:16:29] Beau: Searching “CEO of company” on TikTok, a term Google was officially suggesting (I’m not naming the company because I’m not participating), pulls up an endless, endless doomscroll to end all scrolls of hot takes. Major news outlets have shared the same clip, of course, and largely untouched, on their own platforms. Again, there are really horrible things happening in the world, and here I am talking about the couple from Coldplay. And here you are listening. That might be the whole problem.
The Path Forward
[00:17:10] Beau: Here’s Kashmir Hill again.
[00:17:16] Kashmir Hill: There are similar services now. While Clearview AI is limited to police use, there is a public-facing search engine called PimEyes. And any of you could go there right now, and you’re supposed to upload your own photo. You have to check a box saying this is me and I’m over 18, but they don’t have any technical measures in place. If you just go to it right now and do it, it will show you a bunch of little thumbnails of your face, but you can’t see the whole photo or where it came from. I have a subscription so that I can see the whole photo, and I can do 25 searches a day, which, I don’t know why I would need to search my own face 25 times a day, and yeah, it’s $30 per month. The database is not as robust as Clearview AI’s. You know, it’s hundreds of millions of photos instead of billions, and they didn’t scrape the social media sites. And I actually was thinking about it a lot this month because of a woman in Virginia named Susanna Gibson who is running for office in a local race. She’s a Democrat, she’s a nurse, and the Washington Post reported that a Republican operative told them that she had performed sex acts on the internet with her husband for tips. And these videos were not linked to her name, but were linked to her face, and I wondered how the operative found them. You know, was he watching pornography and he stumbled on it, or did he run her face through something like PimEyes and that’s how he turned them up? I just don’t know.
[00:18:51] Beau: These surveillance tools are frequently used by law enforcement and federal agencies, as we’ve discussed, against everyday people: undocumented immigrants, tourists, librarians, doctors, carpenters, people who own banjos. I don’t know. Anybody. People who never consented to being tracked in the first place. And while Clearview AI insists its technology is strictly for law enforcement use, the reality is messier. Reports show it’s already been made available to retailers for loss prevention, including Kohl’s, Walmart, and Macy’s. And yeah, I mean, I guess law enforcement. A lot of these places have holding cells in the basement or somewhere in the establishment. The company says it wouldn’t sell to adversarial nations like China, Russia, or North Korea. But when asked about other countries, Clearview says it’s “focusing on the U.S. and Canada.” And taking it one day at a time. But when you’re handling something as sensitive and powerful as this sort of biometric surveillance, “We’ll figure it out later” is not a good enough answer. A tool this invasive demands clear limits, strict oversight, and meaningful accountability, not vague promises, corporate discretion, and a shrug. Because once the infrastructure’s built, it’s very hard to roll back. Clearview does allow people to opt out of its database if you happen to live in California, Colorado, Connecticut, Illinois, Utah, or Virginia. But even if you do opt out, you may not stay opted out, because in the same way that our data pops up over and over again online, you remove it and it gets sold again. Boop, it’s right where you thought it wasn’t. Whack-a-mole. I don’t care how fast you are. You’re not going to get it all, and Clearview AI is scraping for people’s images all the time. If they come across your photo again, there you are. Boop. Again, you’re there. You can’t really sue. I mean, unless something really horrible happens.
I suppose you can, but who are you going to sue? Clearview’s defense is just going to say that if it’s public, it’s fair game. So maybe someone should have sent the memo that Clearview was going to be created 10, 15 years ago when we started posting our images online before we knew this was going to be an issue. And maybe that’s the real problem here. It’s not just that the system has flaws. It’s that this is how it was designed to work. Scrape first, deal with the consequences later. So what now? Well, this is where regulation comes in. We’re talking biometric privacy laws, transparency, and hard rules about who can use this stuff and what they’re allowed to do with it. Because hoping companies will police themselves? That’s worked out zero times so far. Here’s Kashmir Hill.
[00:22:03] Kashmir Hill: This is something that I worry about, the compounding of surveillance technologies, one on top of another, and it is what we are seeing happen already in places like China where they’re tracking the phone, they’re tracking the face, they’re making inferences between the phone and the face, and then putting that on top of a social credit score where you are judged on all kinds of behavior, including paying your bills on time. And then they decide what access to services you have. You can’t buy a train ticket if you have a low score, and you can’t really subvert the system because they’re tracking your face and tracking your phone. Yeah, I mean, I do find the idea of this just continuing unfettered with no rules, no control over how our information is used… I think we will have a very chilling world. Very hard to be free in that world. But privacy laws work. And if you don’t like the idea of being in one of these databases, if you live in Europe, you can get out of them. Here in the United States, if you live in California, or Connecticut, or Virginia, or Colorado, you have an access and deletion law. You can go to Clearview AI and say, “I want to see my report, I want to see what you have on me, and I want you to delete my face.” And in Illinois it looks like Clearview probably violated the law there. Sometimes people are very resigned about technology and they say, “There’s nothing we can do. It’s taking our privacy.” And it is not true. There’s many ways in which we have constrained technologies. We passed wiretapping laws, and that’s why the millions of surveillance cameras that surround us in the United States only record our images and don’t record our conversations. There is something that you can do on the individual level. If you’re so lucky as to live in one of those states, you can delete yourself, but at a more systemic level, we need to pass laws. That is how you decide what happens with technology.
[00:24:24] Beau: The people caught in the Coldplay video weren’t famous. They weren’t criminals. They were just there exercising their perhaps not entirely constitutional right not to be caught having an affair. Welcome to the Wild West period of ambient surveillance, not just by governments, not just by companies, but by each other. All of us. And so the next time you’re at a concert, on a train, or walking into a pharmacy, ask yourself, is my face mine? Or is it already someone else’s content? And while we’re at it, what about your face as a key to make stuff work? Again, it’s no longer yours. Let’s say company X, not X the actual company, just some company, swears it will do the right thing with your face. What happens when it’s acquired or goes out of business? Will the new company honor the original privacy policy? Maybe. Promises are broken all the time. Terms of service are rewritten. Your click, someone else’s payday. Now, what about adding a threat actor to the mix? It could be a rogue employee or a criminal, a stolen hard drive, a breached server, all these things. One breach and your biometric data, the literal map of your face that gets you into your app to listen to Coldplay or withdraw money from your bank, is up for grabs. You can’t change your face the way you can a password. There’s not much you can do about any of this, but you can stay aware. And the other thing that you can do is ignore the next Coldplay couple moment. We make the world by looking at it these days. If you change the way you look at things, maybe the things you look at will change. Your Face Belongs to Us: A Secretive Startup’s Quest to End Privacy as We Know It by Kashmir Hill is available wherever you get books, even in paperback and audio. The paperback came out last year. It’s an amazing read. It will scare you, but in a way that I hope will make you think more about these issues and maybe get you to write to your local legislator asking them to do something about it.
And now it’s time for our tinfoil swan, our paranoid takeaway to keep you safe on and offline. Ambient surveillance. Fun stuff, right? Your image is being used for stuff that you did not say yes to. South Park’s recent roasting of President Donald Trump pushed the envelope on that score. The episode got a lot of publicity for some lewd depictions, but the end of the episode matters for a totally different reason. It uses AI technology to depict President Trump having a Christlike moment in the desert. It’s extreme parody. It’s a slog. He’s hot, he’s removing his clothes until the end of the schtick, when he’s completely naked. It looks just like Donald Trump. It’s not Donald Trump, obviously. Why should we all be paying attention, whether or not you like the way they depict Donald Trump? It’s simple: the same thing could be done to someone you care about. We can all be targeted by cybercriminals who know who we know, and they know that by going online and seeing where we live and who else lives at that address. From there, you can find images of people connected to other people and make a video. The threat is real. You’re not gonna be able to tell the difference. You won’t. You might think you will, but it’s not true. So if you are confronted with something that seems outlandish, I want you to pause, get online, and start Googling whether or not it’s real. Because in the same way that the internet was able to identify the Coldplay couple, the internet is also able to identify a fraud, in this case misinformation using AI. Just look at the numbers. More people will always recognize that something is fraudulent than not. When you go online, there’ll be a consensus, and that is what you need to be looking for. We will find a consensus about our privacy at some point. I have no idea when. Right now, I think everyone is just enjoying being their own government agency with the ability to surveil anything they like.
Now, I hope that ends, because what I do, so long as it’s legal, is nobody’s business, and I hope you feel the same way. So stay safe out there. And that is our tinfoil swan.
Learn More:
- Read more about 2025’s most viral data privacy moments
- Learn how to keep your information safe so you can be a tougher target for scammers
- Read more about the dangers of facial recognition technology
Our privacy advisors:
- Continuously find and remove your sensitive data online
- Stop companies from selling your data – all year long
- Have removed 35M+ records of personal data from the web
Exclusive Listener Offer
What The Hack brings you the stories and insights about digital privacy. DeleteMe is our premium privacy service that removes you from more than 750 data brokers like Whitepages, Spokeo, BeenVerified, plus many more.
As a WTH listener, get an exclusive 20% off any plan with code: WTH.



