This Week on What the Hack: Flock Safety Privacy Concerns

Flock Safety says their cameras have never been hacked. Benn Jordan and Jason Koebler watched themselves on one from their homes. This week: a deep dive into the vulnerabilities, the no-opt-out tracking, and a simple question: does any of it make us safer?

Episode 242

Ep. 242: “Surveillance in America Part 2: Flock You Very Much”

“What the Hack?” is DeleteMe’s true cybercrime podcast, hosted by Beau Friedlander.

Beau: Last week we talked about the camera at your front door, the one you bought, installed, or had installed, the one you wanted; the fine print you’d probably hate if you actually read it; and the way being watched changes how you live even when you have nothing to hide. That was the opt-in version. This episode is about the other kind. You don’t get a say. You can’t opt out.

NBC Bay Clip: Frustration is growing in Richmond as the fate of Flock license plate reader cameras remains uncertain. Council member Cesar Zepeda argues the cameras could have helped police track down a missing minor believed to be a victim of human trafficking.

Beau: Flock license plate readers are everywhere. You’ve probably driven past one without even knowing it. A little box on a pole outside a school. At a trouble spot. Blink, and you miss it. The pitch for these things was super simple: catch stolen cars, solve crimes, keep neighborhoods safe. Scan a plate, run it against the database. Yada, yada, yada. Then they introduced AI cameras that can zoom in on your phone and literally read it. Others can listen for gunshots, screams.

ABC Tampa Bay: Picture this: cameras popping up all over your neighborhood, quietly snapping license plates 24/7.

Beau: They come preloaded with constitutional concerns. I’d call them violations, but I’m not a lawyer.

NBC Bay Clip: Police say federal agencies gained access to their cameras without their knowledge. 

ABC Tampa Bay: They’re called Flock cameras and they’re used by cops across the country to hunt down bad guys and to find missing people.

Beau: And of course they’re being used to commit crimes by the all-too-human people sworn in to enforce the law.

ABC 12 Clip: New tonight, a Milwaukee police officer charged with using police equipment to track a romantic partner appears in court for the first time.

ABC Tampa Bay: It works until you start asking questions, wondering, who’s watching me?

CBS LA Clip: Last week you told us this data was accessed by law enforcement outside of California, which is against state law.

ABC Tampa Bay: The Institute for Justice called the use of Flock cameras, warrantless mass surveillance, and has concerns that police are abusing the ease of access the system provides.

Beau: Privacy and surveillance are opposites, and for most of history, surveillance was reserved for a specific purpose: keeping tabs on criminals, and for law enforcement. Well, and they were connected. J. Edgar Hoover wasn’t shy about bending those use cases for political ends, and I’d argue that’s still happening, but for most of us, surveillance wasn’t personal. It wasn’t aimed at us. Now, that’s changed. Flock and the companies behind it have built a network of around 100,000 devices right now. Maybe more. It’s not just spies and bad guys anymore. It’s everyone. Flock itself markets this explicitly, calling it, and I’m quoting them, “The only network powered by cities, neighborhoods, and businesses all working as one.” But here’s the thing: you can’t opt out of that. Today we’re gonna be talking about what it actually means to live inside that surveillance system, the one you already live in but never agreed to. Now that you know, it’s gonna feel different. Who built it, we’re going to talk about that, and who profits from it, same people, whether it works, and if anyone’s actually making sure it’s safe. I’m Beau Friedlander, and this is What the Hack, the podcast that asks, in a world where your data is everywhere, how do you stay safe online?

Beau: We’re gonna jump right in. I reached out to you because of your coverage on Flock.

Benn: Yeah, I mean they…

Beau: This is Benn Jordan, who makes music and breaks things on YouTube. He became interested in Flock Safety the way a lot of people have recently: the cameras showed up in his area and he got curious. That curiosity turned into one of the most watched investigations into a company anywhere on the internet. I don’t really know where to start, but I’m gonna start with license plate readers, just because I… I don’t know. That’s where I’m gonna start. So according to Flock’s own site, right? They’re built to capture vehicle details, not just footage, with the caveat that they’re able to do a lot of stuff that’s really, really beyond what an LPR does.

Benn: Louis Rossmann did a really good job of this. A lot of times I hear people sort of mimicking what Louis Rossmann said about it when he first did his video, which was: stop calling them LPRs. Let’s call them AI cameras. ‘Cause that’s what they are. And if it makes people see them and actually ask a question like, what does this capture, and what is this saving, and where’s this data going? Which is the question that, you know, I initially started asking, which began my research. So, yeah, I mean, they could get license plates, obviously, but also, like, vehicle damage, bumper stickers, color. They have patents for, I mean, everything from detecting people’s race and, you know, things like that. But you know, a patent doesn’t mean that it’s being put in use. Then their Condor cameras can detect dogs, people, some different characteristics about people. And then the cameras themselves track you. They literally move, and they make noise when they move. Like, they’re actually incredibly dystopian. Like, they’re way worse than you would imagine. From a distance, they look like a normal security camera. But yeah, I mean, you walk past ’em and they go like, and follow you around, and it’s actually like, what is happening? This is crazy.

Beau: Flock Safety was founded in 2017. They sell themselves as a public safety company, but here’s what they actually are: data brokers with cameras. Drive past one of their devices and your license plate gets photographed, location and time, and fed into a database. That’s the functional equivalent of a GPS tracker, if you pass enough of these things, right? But it doesn’t stop with your car. Flock’s pan-tilt-zoom cameras are showing up in public parks, pointed at running trails, pointed at playgrounds. Flock says they don’t use facial recognition. That’s technically true today, but they’ve already built a product called Freeform that lets a police officer, or you know, a law enforcement officer, type in “man in blue shirt and cowboy hat” and get visual matches instantly across thousands of cameras. That’s not facial recognition, that’s worse: appearance-based tracking with no warrant, no limit on who gets searched. A vague parameter like “cowboy” all but ensures mistakes will be made. It ensures it.

King5 News Clip: You’re looking at a Redmond Police drone video of an arrest that happened in August. They’re handcuffing Thor Andrews Sr., whose Ford Fusion was pinged by Flock cameras in the area. Thor Sr. says police didn’t verify his identity until after he was in handcuffs, despite him telling them that they had the wrong guy multiple times. Police say they followed Flock’s alert and followed protocol, and said Thor Sr. became hostile during the arrest, but the police narrative shows PD knew the Ford Fusion was registered to Thor Sr. and not Junior. Thor Sr. says he wanted to share this video and his story to show that Flock cameras can make mistakes.

Beau: Now, Flock holds a patent for a way to track humans by race, gender, height, weight, and clothing across their entire network. Adding capabilities like this doesn’t require new hardware. You know, if they want to add facial recognition later, it’s just a software update, and again, you never agreed to any of it. There’s no terms of service. No opt-out, no DeleteMe.

Jason Koebler: So basically Flock has this like huge network of cameras. Something like 8,000 towns have Flock cameras.

Beau: This is Jason Koebler, co-founder of 404 Media. Before that, he spent a decade at Vice’s Motherboard. Jason’s one of the most cited technology reporters in the country, and he’s been covering Flock for a while.

Jason Koebler: In order for these cameras to be effective at finding where license plates have gone, and therefore where people have gone, where their cars have gone, you know, you need to network them together in some way so that when you search it, there could be a hit. And the way that it works right now is, if you want to search Flock’s network, you have to reciprocally give access to the network. So Flock is not gonna let you search a national network of all their cameras unless you are giving access to the cameras in your town.

Beau: Law enforcement agencies pay to access these databases. So do businesses, so do homeowners associations. The more people who do that, the better the service. In theory.

Jason Koebler: And so they’ve created this incentive for police to basically opt themselves into this network.

Beau: It doesn’t matter where they are. In theory, a Connecticut cop can flag your license plate, and until it’s unflagged, every time a Flock camera sees your car anywhere in the network, it will generate an alert. No warrant, no suspicion required.

Jay: The data that Flock is collecting should not be collected.

Beau: This is Jay Stanley. He’s a senior policy analyst at the ACLU where he spent more than two decades writing about surveillance, AI, and what happens to civil liberties when technology moves faster than the law. If you’ve read a serious policy argument against this kind of technology, there’s a very good chance that Jay wrote it.

Jay: We think that if a license plate reader scans your license plate, like, any officer sitting there can call it in, and if your car is not wanted, it’s not on any legitimate watch list, and there are big questions about how the watch lists are created too, then that data should be flushed right away. Why are they keeping track of where you’re going if you’re not suspected of involvement in wrongdoing? And so Flock’s database should not exist at all. But it is true that they have become a very powerful entity because of that. They’re sitting on enormous amounts of information that reveal things about lots of people’s lives.

Beau: Now you might be thinking, okay, but at least it works. Right?

NBC News Clip: The viral video recorded by a witness shows four children crying face down on the pavement, two of them in handcuffs being detained by police. The youngest is six years old.

NBC News Clip: I don’t give a damn what nobody say. That’s police brutality.

NBC News Clip: Brittany Gilliam says she took her daughter, sister, and nieces out to a nail salon Sunday morning when officers with the Aurora, Colorado Police Department arrived.

NBC News Clip: And next thing I know, the police pull up silently behind them and had guns drawn on the children.

NBC News Clip: Police said they were responding to reports of a stolen vehicle, but soon discovered it was not the family’s blue SUV that was reported missing. Rather, it was a motorcycle with the same license plate number from a different state.

Beau: Okay. Well, you make an omelet, you’re gonna break some eggs. Like, it’s known that surveillance stops crime. At least that’s the idea. Right?

Benn: I guess, like, I’m very agnostic about it, like, because of the research that has been done. Like, I can’t say they do work. I can’t say they don’t. I can find examples where crime went up after Flock was installed, or where crime went up after LPRs were installed. Or I could find examples where it went down.

Beau: Benn Jordan again. 

Benn: In most of the examples, the crime went down, but that’s because crime went down nationally between, you know, 2020 and 2025. And that’s when Flock started getting installed everywhere. So that phenomenon is what the paper they wrote, the one they continuously cite, points to: the national drop in crime rate. But they just, you know, put their logo in front of it and say, we did this.

Beau: Flock says their cameras reduce crime. What they don’t mention is that some of that data predates the cameras that are supposedly stopping the crimes. But there’s a deeper problem. Crime isn’t a natural category. It’s a decision. It’s a cultural thing, right? It’s a decision about whose behavior gets counted. White collar crime costs Americans more every year than every street crime combined. And I bet you the real comparison is something like every street crime combined over a period of 10 years against a single year of white collar crime. You don’t see cameras going up on Wall Street or all over private islands in the Caribbean.

Benn: Yeah, I mean, oh yeah, that too. They’re a little leaky with the with the actual dates of when they start and, yeah. And they’ve claimed in Oakland, for example, yeah. They claim that they reduced crime in a period before they even installed the cameras. But you know, the reality is that like lowering crime, like technology that lowers crime, it’s almost impossible to find data that actually suggests one way or another. Because like, we haven’t even figured out if like increasing or decreasing police helps crime that much. It’s so… like when you think of like crime down to like the macro level and then down to the micro level, it’s like, it’s so different everywhere. And like I even say this, like, I grew up in South Chicago and then I moved down here to Georgia and like, crime is almost like, it might as well be two different words. ‘Cause like, the types of crime are so different. Like, I don’t have any gangs here. That’s not something I worry about. But like, there still is crime, but it’s like people dumping their garbage on the ground. So it’s, it’s so many different- it’s like completely different worlds. But I think like, one of the common things that you’ll hear either a police officer or even just like somebody who’s just, you know, wants a safer world, they’ll say, okay, well, if you install a security camera on any given street, then car break-ins, it’ll lower the amount of car break-ins. It’ll lower the amount of car theft or, you know, assaults on that street. And it’s like, it won’t lower them in the city though, like it won’t lower them generally. Like it gets down to a point where it’s like, yes, if you put a bunch of surveillance in one area, then people will just commit more crimes, you know, the next block over where there is no surveillance. It’s not like they’re just going to say, you know what, I’m gonna become a doctor instead of dealing drugs; I think I’m just gonna, you know, become a neuroscientist. 
That’s just generally how it works. And so when you fill an entire city full of surveillance, like, one would just assume that what would happen is criminals would then just modify the way that their enterprise is running by obfuscating the way that they’re committing the crimes. So it’s actually harder for things like community policing to take place. So again, it’s like, I can’t say it makes it worse, but I can’t say it makes it better. Like, it very, very complicated and I think if somebody claims one way or another, it’s like they’re just kind of…. that’s their own opinion. And that’s not based on any sort of data.

Beau: Right. I think Mark Twain said there’s, you know, three kinds of lies: lies, damn lies and statistics. And it’s just however you wanna twist it. You know, but we’re also not looking at a situation like Xinjiang, China, where the Uyghur population lives, and there’s a camera literally every hundred feet or 50 feet-

Benn: Yeah. Mm-hmm.

Beau: Everywhere.

Beau: So Flock doesn’t really work the way it says it does. The data’s shaky, the errors are real, and the people paying for it are you and me. And that’s not even the part that should keep you up at night, because everything we just talked about assumes the system is working as intended. It assumes the data stays where it’s supposed to stay, that the cameras are doing what Flock Safety says they’re doing, and that the only people watching are the ones with badges, on the job, not checking out somebody they saw at a pharmacy or a supermarket or anywhere. It assumes that there’s some rhyme or reason to this, that it’s been checked out and vetted by lawyers, that it’s in keeping with the Constitution, and yeah, that it’s all on the up and up. That’s what it assumes. So surely a private company selling this kind of infrastructure to law enforcement has security that matches the sensitivity of what it’s handling. Right? Flock is known for having machines that are running on outdated software.

Benn: Most of the cameras are on Android 8 Things. Yeah. So it’s not even Android 8. It’s Android 8 Things, which is, like, a separate version of the OS that was made for headless devices when the Internet of Things started becoming popular. So it’s even worse: they never made another version. It was like a one-off, one of Google’s many one-offs that they just abandoned immediately.

GainSec: So Condor actually are… I’ll be annoying about it.

Beau: This is John Gaines, who goes by GainSec. He’s an independent cybersecurity researcher, a red team security guy. That means he finds vulnerabilities for a living, and what he found in Flock systems was so extensive, so damaging, that he published a formal white paper documenting the most serious of the vulnerabilities.

GainSec: Condors are actually pretty dumb cameras. They actually keep compute boxes, which is the black box above my head. That’s like the latest thing. Falcon, Sparrow, and Flex cameras are the all-in-one license plate readers. Both of them run Android. I think the chipset’s like 12, 13 years old now. And it’s running an operating system that doesn’t exist. Android Things, so, yeah.

Beau: Here’s the thing about a hundred thousand cameras running on software no one patches anymore. Somebody’s already in. Statistically, mathematically, however you want to figure it, it would be stranger if nobody was. Somebody’s in there. If a company like Flock wants to do business in the United States with a certain piece of technology, they should have to pay for all the research to make sure it’s safe.

Benn: Like, Arby’s has to do this, right? Like, they have to pay a fee, they pay a license fee, and then an inspector comes and makes sure that they’re not serving rancid meat to people. Your hairstylist has to do this. They have to pay a license fee and make sure that the place is clean and that they’re not cutting off people’s ears or, you know, whatever they check for in that place. Your home inspector, like, literally everything has this. Why would we not apply the same system to Flock Safety? A great, absurd example is that if I got into my car right now, at least my van, and I drove past a Flock Safety camera: I have to get an emissions test on it. I have to get a driver’s license, I have to take a test when I get the driver’s license to make sure that I know how to drive the car. I have to take a written exam to make sure that I know all the ins and outs of driving on the road. So that’s regulated, and I have to pay for that either through my taxes or my license fee. And then I have to have a plate that I renew every single year, so you can keep tabs on me, so I’m not doing anything crazy. And yet this camera that is collecting all of my information every single time I pass it doesn’t have to do any of that. Like, they have absolutely no inspection process whatsoever and no test to pass to get there. So I feel like that is the easiest argument to sell in this entire debate over mass surveillance. Like, why would you not regulate them to some degree, just a basic degree, the same way you would regulate any other business? And I think it’s just laziness.

Beau: I’m not sure it’s laziness. Because I think we’re talking about an ecosystem, a business ecosystem, where it’s business first and all the other questions later, and it’s, you know, move fast and break things. And so it comes down to a company that is claiming to provide a resource to law enforcement that is so poorly put together that it actually invites people to break the law.

Benn: Yeah. When I say laziness, I mean, I think move fast and break things is kind of laziness. Like, really it’s just a race to get to this business model where everyone is supplying you with the labor, or the thing that you’re selling, and the money for it. And so it’s very similar to, you know, Uber Eats or DoorDash or something. This is the way that I’ve been explaining it to police departments who wanna know more about the business model. It’s like, okay, well, let’s just look at any tech business model. Let’s say DoorDash: you are paying a third party to pay a driver much less, and then that person’s picking up food from a restaurant that is also paying the same third party, and they’re just collecting money from everybody while having very few actual employees to manage these autonomous people or AI or whatever it is. And so with Flock Safety, you have one city, they’re renting a bunch of cameras, they’re supplying data to the system that then shares it with the next city over, and they’re renting the cameras. And the reason those cameras are so valuable is because the data from City A is going to them. And then you just expand that in a big old mesh across the United States. And so you end up with this fast-growing system that is incredibly recklessly scaled. And the reason I can say that is because of the security posture. Because, like, we were able to find things that we feel like we shouldn’t have been able to find. I mean, yeah. ‘Cause it’s running Android, right? Like, that’s a great example right off the bat. Why is this not running its own OS? Why is this not just running Unix? There’s so many… like, this isn’t hard. It’s not that difficult to set something like that up. It’s just that they did not pause.

Beau: You know, when they’re developing the product: hey, CEO, you should have built the privacy and the security in when you were thinking about the thing you wanted to build. It should have been on the drawing board, and it wasn’t. And so you screwed up, and now here’s the deal: there are five more just like you behind you. So if Flock goes away tomorrow, there are other people who want to have what you’ve got.

Benn: Yeah. Really. Yeah. And I mean, coming from a touring musician: at some point I stopped playing any shows that weren’t at legal venues. And yeah, I mean, a lot of these shows were, like, electronic music shows and stuff, so illegal venues are very popular in that world. And, I mean, without being hyperbolic, I witnessed situations at my shows where people died because of these. Like, one of them was an illegal party up on a mountain in Washington, and somebody OD’d and an ambulance couldn’t get there. And it’s like, yeah, that’s why you don’t have parties on top of mountains in Washington. That’s why it’s not-

Beau: [inaudible] a helicopter.

Benn Jordan: Right, exactly. And so, I mean, it’s like, okay, well, yeah, this is risky. I’m putting my audience at risk by doing this. In even simpler ways, too. Like, there were a lot of warehouse parties that had fires where a bunch of people died. One in Oakland that was horrific. Anyway, but point being, I guess I learned over time why, as much as I wanna say no rules, no rules when it comes to having a good time, like, well, yeah, burning to death is not a good time.

Beau: So that’s the business. Move fast. Don’t ask for permission. You know, I guess apologize, but no one’s asked for an apology yet. Worry about the consequences later. Unless, of course, there are consequences pretty much right away. Now, you watched a Condor camera. You were, what? You… Did you hack into the Condor camera? No, you didn’t. It was publicly accessible. It wasn’t protected by a password or anything. And you watched a person through that camera who was looking at a video on his phone.

Benn: Yes. Yeah, yeah. I was able to actually watch his- I mean, it zoomed in enough to where I could read the text on the screen. It zoomed right in. I mean, they have pretty remarkable zoom lenses, which, you know, props, I guess. Yeah. It’s kind of funny to see the new tech be, like, kind of good camera quality and stuff like that, because the old ones are actually pretty abysmal. Like the ones that you usually see, the Falcon cameras, the tech in them is quite terrible for an LPR. And so that’s sort of been the running joke for the people that I’ve been working alongside in the research: like, yeah, at least they’re garbage. It would be worse if they were really good. But yeah, it seems like the new ones have a lot more power and a lot more capability.

Benn Clips: A few weeks ago, using a commercial search engine, I very easily found the administration interfaces for dozens of Flock Safety cameras.

Beau: This is from Benn’s YouTube video titled “This Flock Camera Leak is like Netflix for Stalkers.”

Jason Koebler: I’m not exactly sure how he got into this other than there were Flock cameras by his house and he was annoyed by it and started looking into it.

Beau: Jason Koebler again. 

Benn Clips: None of the data or video footage was encrypted. There was no username or password required.

Jason Koebler: But he did a few really great videos about Flock and he reached out to me after some of our reporting and said, hey, I have something else I wanna work on. Let’s work on it together.

Benn Clips: Whether you wanted to watch this footage live in real time or look at footage from a month ago, you could just point and click your way to it like you were watching Netflix. These were all completely public-facing for the world to see, and some of them still are.

Jason Koebler: And what he had found was using Shodan, which is an Internet of Things search engine, he found that there were maybe like 60-ish Flock cameras that were left exposed to the internet. 

Benn Clips: And with John Gaines’ help, that number quickly grew to nearly 70.

Jason Koebler: So these were Condor cameras, and these were just streaming directly to the internet with no login whatsoever.

Benn Clips: You don’t have to be an expert to find and gain access to this. You don’t even have to type anything in to see every single person, vehicle and activity that took place in these locations in the last 31 days.

Jason Koebler: And he’s like, can you, like, help me figure out what’s going on here and do some of the reporting on your end? What happened was, I was able to geolocate some of the exposed cameras by doing a lot of Googling.

Jordan Clips: Okay, so I’ve driven up to Bakersfield, California from Los Angeles to check out this Flock camera that Benn Jordan, GainSec, and myself found streaming directly to the internet.

Jason Koebler: The video and the reporting got a lot of attention, but I think it really was a moment where it’s like, this company is not just a license plate reader company. It’s kind of a holistic surveillance operating system for police, and they are deploying a lot of this technology in a relatively haphazard way.

Jordan Clips: And basically what we’ve found is that this is streaming unencrypted, totally insecure, no password required directly to the internet.

Jason Koebler: This is not what you would want from a surveillance tech company. You don’t want this stuff just like streaming where anyone can watch it.

Jordan Clips: And so I’m gonna go out there and walk my dog. I’m going to record this camera recording me, and then I’m gonna watch myself on the internet because this is streaming directly to the internet. Insecure.
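The method the clips describe, querying an Internet-of-Things search engine and then checking which hits answer with no login at all, can be sketched roughly as follows. This is a hypothetical illustration using the real `shodan` Python library; the query string and the `looks_unauthenticated` heuristic are illustrative assumptions, not the actual search the researchers ran.

```python
# Hypothetical sketch of the discovery workflow described in the episode:
# search an IoT search engine for exposed web interfaces, then flag
# service banners that show no sign of requiring a login. The query and
# the heuristic are assumptions for illustration, not the real search.
import os


def looks_unauthenticated(banner: str) -> bool:
    """Crude heuristic: an HTTP banner that answers 200 OK and never
    mentions an auth challenge is probably browsable with no login."""
    has_ok = "200 OK" in banner
    asks_for_auth = "WWW-Authenticate" in banner or "401" in banner
    return has_ok and not asks_for_auth


def find_exposed(api_key: str, query: str = 'http.title:"camera"'):
    """Run a Shodan search and return matches whose banners look
    unauthenticated. Requires network access and a valid API key."""
    import shodan  # pip install shodan

    api = shodan.Shodan(api_key)
    results = api.search(query)
    return [m for m in results["matches"]
            if looks_unauthenticated(m.get("data", ""))]


if __name__ == "__main__":
    # Only hit the network when a key is actually supplied.
    key = os.environ.get("SHODAN_API_KEY")
    if key:
        for match in find_exposed(key):
            print(match["ip_str"], match.get("port"))
```

The point of the sketch is how low the bar is: no exploit, no credentials, just a public search index and a check for endpoints that never ask who you are.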

Benn: I can literally walk up to a camera, read Flock’s statement about how they’ve never been hacked, go home to my personal computer, download it off of Flock’s ecosystem, show it to the world, get media attention worldwide for this, and they will still stand up in a hearing and say, Flock has never been hacked.

Benn Clips: Flock is committed to continuously improving security. If a person were able to gain physical access to the device leased to the city, which is illegal, they still would not be able to gain access to the footage as the data is only stored for a very limited time duration on the device following its transmission to the cloud. None of these vulnerabilities affect our cloud platform where the vast majority of all evidence and metadata is stored.

GainSec: They were just exposed to the internet. So if you knew, or you found, the data connection’s IP, right, you could just directly browse to it. And then it was a nice little web application.

Beau: Here’s GainSec again.

GainSec: The compute boxes of the license plate readers, or the computing devices that were connected to them in the newer instances, all have this hardcoded debug sequence where, if you push the button fewer than a handful of times, it would activate its hotspot. And it happened that the hotspot had the same password across all the devices as well. A hardcoded password, which was not secure.

Beau: There’s a button on the back of these cameras. Get the push sequence right, and the cameras broadcast their own Wi-Fi signal. I’m not gonna say more, but there’s more. A lot more. When I listen to Flock’s CEO, a 38-year-old guy named Garrett Langley, well, he actually said, in a Forbes interview, “We’re not forcing Flock on anyone.”

Benn: Might be my favorite quote of his.

Beau: But can we opt out?

Benn: Yeah. My favorite quote of his, like, I, I can’t…

Beau: Benn Jordan again.

Benn: I mean, I literally can’t go to the store without passing a Flock camera. If I touch it, I get arrested. If I so much as like, lay a finger on it, right? I can’t know what data of mine is being sent off or is being kept. I don’t know who’s accessing it. It’s likely that whoever’s operating it doesn’t completely understand who’s accessing it. And then, and I mean, it’s just a web of bullshit, like, and it just never stops.

Beau: So, like, you know, you can hack right into this thing. So you can’t go to the store without passing a Flock camera, and you don’t know if I’m in there. ‘Cause I might be, and I might be like, well, when does Benn go to the store? ‘Cause I’m a huge fan and I want to meet him. And it’s the beginning of a new adaptation of Misery. You know, you’re screwed.

Benn: My initial Flock Safety video idea, like the initial rendition of it, was to work with Jon Ossoff, who’s a Georgia senator, and the one and only Marjorie Taylor Greene, who, you know, I don’t wanna spend time with Marjorie Taylor Greene for many reasons, but I did want to be on both sides of the platform, and she is somebody who seems to be pretty skeptical about tech and surveillance. And she’s also, you know, 30 or 45 minutes away from here; that’s her district. But the idea was: can I track you both for a week? Just track all of your locations using the Flock ecosystem. And, you know, could we make that possible? And could I get your consent to do that? And I spoke to one of their teams, and one of their teams was like, I think we can make this happen. And, you know, I was working on it, and then my attorneys were like, there is no way to do this without going to jail. Because you’re gonna be touching Flock’s ecosystem on their cloud. So even if it’s senators, even if it’s, you know, Congress people, no matter who it is, it could be the president himself, unless you think you’re going to get complete immunity or something, there’s no way you can avoid going to jail, and you will be guilty. And so from there it kind of boiled down to, all right, well, let’s just talk about all of the security vulnerabilities in this camera and get John Gaines here to talk about it.

Beau: These cameras aren’t off-gassing something that you can just free-range grab?

Benn: I mean, we found the credentials to get into the actual Flock system.

Beau: But it’s still touching their system.

Benn: Yeah. Oh, yeah. So we can’t go there, but we have them, and I mean, they weren’t hard to find. It turns into almost a Fermi paradox type of thing, where to imagine that nobody’s doing it is kind of impossible. It’s very statistically unlikely that nobody is in the ecosystem using it nefariously. And I would actually be sheepish about saying that for legal reasons if Senator Wyden and Representative Krishnamoorthi hadn’t sent a letter to the FTC saying the exact same thing: this is a national security risk, a massive one, and we need to deal with it right now. Which of course fell on deaf ears, as far as I’m aware. A lot of people have asked me since these videos came out, so could you just track anyone? And it’s like, yes. I haven’t, though, so I can’t give you a hundred percent yes. But I can tell you that every single piece of information that I have suggests that I could, if I wanted to. I just don’t want to, because I don’t want to go to jail.

Beau: Here’s what we know. Flock cameras have had serious security vulnerabilities and for a stretch of time, live feeds from at least 60 of their devices were streaming openly to the internet. No password, no login required. Anyone who knew where to look could watch. And some of what they could watch was kids on a playground. That’s the accidental part, the part that they can and presumably will fix. What we haven’t talked about yet is the part that’s working exactly as intended.

Jay: And a lot of this is because of the move to cloud services, which is a big part of the story here.

Beau: This is Jay Stanley from the ACLU again.

Jay: Cloud services are centralizing. That means more central power. It gives them the power, for example, to ban reviews of their products. Flock won’t let this company IPVM, which is sort of like a Consumer Reports for security products, review its cameras. And they can do that because they have this centralized control. And it makes them a monopolistic company, because they have this giant network. It makes it very hard for competitors to arise and to fight them.

Beau: When you build something this big, this fast, the tools stop being about stolen cars. Law enforcement used to need a warrant to track your movements. That check exists for a reason: a judge, someone independent, saying, yeah, there’s enough cause here to follow this person. Flock found a workaround. The data comes from a private company, collected on public streets, sold to police as a service. No warrant needed, just a subscription. The Supreme Court hasn’t ruled on whether any of this is constitutional yet.

Jason Koebler: And what has happened is there is a program that allows local police to work with federal law enforcement on immigration enforcement.

Beau: Here’s Jason Koebler again.

Jason Koebler: And so what’s happening is local police are searching for license plates, basically at the behest of ICE, and then sort of giving them information about where a specific car has been, where a person has been, the history of their movements. And when we showed this and we started talking to the police departments that were doing it, or whose cameras were being searched, they were saying, we had no idea this was happening. They had no clue that any of this was happening. And that’s just an example of how you can kind of accidentally end up doing surveillance for the feds in a situation like this.

Beau: Edward Snowden hasn’t had much to say publicly in a while, but he was the one who revealed, more than a decade ago, that apps like Angry Birds were leaking personal data: location, user IDs, political preferences, straight to intelligence agencies, because I guess even people in that line of work play games. At the time it felt like the quiet part staying quiet. Now the quiet part is being said out loud. This is the era of that, and ICE is using stingrays. Palantir’s involved. Add Penlink, Paragon, Cellebrite, SmartLINK, Mobile Fortify, facial recognition tools that can identify you from a phone pointed at your face in the street. These aren’t separate programs. They are, but what I’m trying to get at is they’re an ecosystem, and when you start connecting that ecosystem to a network of 100,000 Flock cameras, you’re not looking at crime prevention anymore. You’re looking at something that doesn’t have a good name yet in a democracy. I mean, it’s never going to have a good name, but it has plenty of names in places that aren’t one.

Jason Koebler: Yeah, so what I would say is that if you can imagine a vector for surveillance, ICE has also imagined that vector, and they’ve found a contractor that will provide that sort of software, that sort of capability to them. The list is endless. I mean, we don’t need to go into all of them, but it’s things like, location data that they’re buying from third parties, that the location data comes from your phone. You know, it comes from apps that you may have given your location data to, sometimes through the advertising industry. I think that because they’re buying so many different capabilities from disparate companies, and honestly a lot of these companies have just kind of like popped up in the last few months. I don’t think that they’ve integrated this all into some sort of master database, although there is a lot of fear that that may be happening.

Benn: I mean, everything gets to a breaking point, right?

Beau: Benn Jordan again.

Benn: Like, everything eventually gets there. And I mean, this is even under the microscope of just Flock Safety. It’s like, you know what, at some point if you piss people off enough and if you make ’em uncomfortable enough, they’re just gonna shoot the things, right? I’m surprised that’s not happened already, and I’m glad that’s not happened already. Like, as much as I wanna be like, yeah, get ’em off the poles, I’m really glad that in this situation there’s a good guy and a bad guy. There’s one side that’s been misleading people and saying things that aren’t true and citing facts that aren’t real. And that antagonist side has been making lots of money hand over fist. And then there’s another side that’s been paying money out of pocket for legal resources so they could do everything by the book and work with cities and with law enforcement agencies that they don’t necessarily align with at all, and with politicians that they don’t align with at all, to make sure that the public is safer and that they have some expectation of privacy, or at the very least, so they understand what it is that is connected to the pole that’s pointed at their house. And so thank you to all of those who haven’t shot the cameras off the pole. I think that’s actually really important right now. But I do think that at some point, if… yeah, sure, if ICE starts accessing these through the police departments that they’re sort of gobbling up in Georgia, for example, and it leads to massive arrests, and the Trump administration starts arresting people who are already legally here, I mean, I think that’s already happened, but let’s say that continues, even worse, then, yep. You can expect people to start, you know, hitting them with baseball bats or spray painting ’em or whatever it is.
You know, there was sort of an interesting side note on this. I have this older video where the University of Chicago said that they had invented an AI model that could predict crime in particular areas of Chicago with a 90% accuracy rate. And they could do it down to the city block. And so I went to the University of Chicago and met with the sociology professor who was spearheading the project, and then some of his developers, and I asked them randomly, how often have you gone to this neighborhood? This neighborhood was Englewood, and I grew up in West Englewood. So it’s one of the worst neighborhoods in the country. It was then, still is now. And so I asked him, how often are you going over there? Are you familiar with this and this and this? None of them had ever been there. And I was like, wait a minute. Whoa, whoa, whoa. It’s literally 15 minutes away. You’ve been studying the statistics of this neighborhood for years and you never drove 15 minutes away? I can’t believe you’ve never been there in your life. And oddly enough, the same day I had a meeting with a nonprofit there called Purpose Over Pain, and Pam Mosley, I believe, is the name of the woman who runs it, but it’s run by a group of mothers who lost children to gun violence. And they started this nonprofit that more or less has a hotline that you can call if you feel like something’s going to happen: a murder’s going to happen, somebody’s gonna get hurt, somebody just was hurt. And it was so interesting, ’cause visiting Pam, she was like, oh, I could do better than their 90%. And I was like, how? And she’s like, I talk to people. It’s very easy, actually. It’s not that profound.

Beau: The solution to a lot of the problems that we face is community: sticking together as a cohesive group of concerned individuals and acting as a group.

Benn: And I think one of the, one of the reasons why things like Flock bother me is because I believe that in a perfect world, the police are… very few cops would be armed. You would have your like, oh shit, force for like, really, really bad situations. But like, I want a cop to be welcomed, like… I want a cop to be able to solve problems. I want a cop to be trained in dealing with somebody overdosing, and I want other cops to be-

Beau: Deescalation too, right? 

Benn: Yeah, yeah. And not to bust out another side story, but I’ll tell this very fast. My father used to be a cop, and I didn’t meet my dad until later in life. He didn’t raise me, but he used to be a police officer, and when the whole George Floyd thing was going on, he told me a quick story. He was like, I was a cop in the middle of nowhere in rural Georgia. My next backup was 30 to 45 minutes away by radio. I got a call about a guy who was going to kill his wife; he was threatening to kill her. He had a gun. I had to go up there by myself, and I knew him by name. I knew him [inaudible]. He was somebody who lived in the area, and I talked to him, talked to him about his life, talked to him about the consequences, what would happen if he shot her, that he didn’t need to end his life over it, that everything is resolvable. And I had a good 30 minutes out there in the hot Georgia evening, being bitten by mosquitoes, talking this guy out of his tree and getting him to put the gun down. And then, you know, we had him arrested, and whatever went on from that. That, as opposed to that happening in a place like Los Angeles or Houston or someplace where you can immediately get a hundred cops if you need them. They can just put on the riot gear and arm themselves and come in. That’s the difference: a bunch of people pointing guns and saying, get on the ground, versus one guy coming out who’s familiar to that person and saying, yo, you’re having a crisis, let’s talk about it right now. Unless it was a car accident where I needed a report, I’ve literally never called the police in my life. I’ve never done it, because anytime that I’ve seen the police called, it makes things worse.
From looking through criminology research forever, the only research I’ve seen that really does suggest that a certain thing reduces crime, or improves catching crime or solving crime, whatever it is that you wanna call it, the only thing that actually reduces crime is community policing.

Beau: That’s dudes with bicycles being like, what’s up?

Benn: Yep. Yep. Yeah, and I mean, you could go through tons of research papers finding stuff that heavily suggests that community policing reduces crime and increases the solving of crime. And so if police are so concerned about these statistics that they’re looking for from Flock, which are so dodgy, there it is. There’s your statistic.

Beau: Now it’s time for the tinfoil swan, your paranoid takeaway to keep you safe on and offline. If you’re feeling helpless right now, that’s probably the right instinct, and realistic. You can’t opt out of this. You drive past one of these things, you’re in the system. But you’re not powerless either, at least not yet. Write to your local and state lawmakers. Tell them you want standards. Tell them you want them to pass laws, and you want these systems tested, and you want the companies to pay for that testing, and you want accountability. And keep listening. We’ll try to get into some of the smaller, more practical things you can actually do, but right now, the only thing you can do is get in touch with your lawmaker, and then if you hear us talking about this topic again on this podcast, which I promise you, you will, write to them again. Okay, that’s it. We’ll see you soon, and thanks so much for listening. This episode of What the Hack was produced by me and Andrew Steven, who also did the editing. Our theme music is by Andrew Steven. If you think you heard Benn Jordan’s music in the mix, you’re right. There’s some other stuff, but there’s some Benn Jordan too. Check him out on Bandcamp or wherever you get your stuff. What the Hack is a production of DeleteMe, which was picked by the New York Times’ Wirecutter as the number one personal information removal service. You should be using it already. If you’re not and you want to, well, you can. Here’s what to do. Go to joindeleteme.com/wth. That’s joindeleteme.com/wth and get 20% off. I kid you not. 20%, 20% off. That’s joindeleteme.com/wth. Now, stay safe out there.

This Week on What the Hack: Flock Safety Privacy Concerns

Episode 242
March 11, 2026
50:48 min