
This Week on What the Hack: Digital Surveillance and Privacy


You’re not paranoid. From Ring doorbells to AI-powered camera networks, digital surveillance has gone ambient. This week we trace how Big Tech turned data into infrastructure, why police are buying access, and what the ACLU says about privacy, protest, and the quiet ways democracy changes when everyone’s being observed. Also discussed: home assistant privacy settings, 404 Media, and Benn Jordan’s YouTube channel.


Ep. 241: “Somebody’s Watching You”

“What the Hack?” is DeleteMe’s true cybercrime podcast, hosted by Beau Friedlander.

Beau: Do you feel observed, watched? There’s nothing new about surveillance. 

Get Smart Clip: What’s up, Chief? Why the Magenta Alert?

Get Smart Clip: 86, I’m deeply concerned about the conference this afternoon. We have reason to believe that CONTROL may have been infiltrated.

Get Smart Clip: I’ll need some special equipment for that, Chief. We can talk under the cone of silence.

Beau: But it has been in the news a lot lately.

TODAY Show: Since the release of these chilling images from the night of Nancy Guthrie’s disappearance, authorities say tips have been pouring in. Video surveillance has become a critical element for law enforcement in helping solve crimes.

CBS Clip: Secretary of Defense, Pete Hegseth gave Anthropic a Friday deadline to grant the U.S. military unrestricted access to Claude. Now, the key friction point: Anthropic does not want its technology used for autonomous weapons or the mass surveillance of Americans.

Beau: Surveillance used to be about spy craft, the James Bond version of surveillance. Like, there’s a microphone somewhere in the house. There’s a camera hidden in a wall, and it’s maybe behind the pupil of a painting, you know, like it’s all clichés. Some of it’s true. Some clichés are true because they happen a lot. But… now we’re all just being spied on. Everyone. Maybe you keep talking about how you need a new power tool, and the next thing you know you’re getting served ads for power tools, and you haven’t Googled it. Nothing. They just start popping up. Digital surveillance as it exists today started before anyone even called it the surveillance economy. It started with cookies and trackers. There’s no such thing as total anonymity. Philip K. Dick once said there will come a time when it isn’t “they’re spying on me through my phone” anymore. Eventually it’ll just be “my phone is spying on me.” That time is now. Researcher Yves-Alexandre de Montjoye works with huge data sets to demonstrate how easy it is to re-identify anonymized information, like where you are, based on pings sent to your mobile phone.

Yves de Montjoye Clip: On average, knowing four places and times where someone was is enough to uniquely identify and potentially re-identify someone in one of these data sets of, like, you know, 1.5 million people. Four points is sufficient to uniquely identify someone 95% of the time.

Beau: They only needed four points, four pings. You were here, here, here, and here to identify you. Not maybe you. You. You’ve seen your data on the people search sites, right? It’s the tip of the iceberg.
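[Editor’s note: de Montjoye’s point reduces to simple set membership — an “anonymized” location trace becomes unique once you know a handful of its points. A minimal Python sketch with invented toy data (the tower names and user IDs are made up for illustration, not from the actual research):]

```python
# Toy illustration of "unicity": re-identifying someone in an
# anonymized location dataset is just set membership.
# Hypothetical data: user -> set of (cell_tower, hour) pings.
pings = {
    "user_001": {("tower_A", 9), ("tower_B", 12), ("tower_C", 18), ("tower_A", 22)},
    "user_002": {("tower_A", 9), ("tower_B", 12), ("tower_D", 18), ("tower_E", 22)},
    "user_003": {("tower_F", 9), ("tower_B", 12), ("tower_C", 18), ("tower_A", 22)},
}

def matching_users(points, dataset):
    """Return the users whose trace contains every observed (place, hour) point."""
    return [u for u, trace in dataset.items() if points <= trace]

# Two observed points still leave ambiguity...
print(matching_users({("tower_A", 9), ("tower_B", 12)}, pings))
# → ['user_001', 'user_002']

# ...but a few more points single out exactly one "anonymous" user.
print(matching_users({("tower_A", 9), ("tower_B", 12), ("tower_C", 18)}, pings))
# → ['user_001']
```

[With real data sets of millions of traces, the research found four such points were enough to single out a person 95% of the time.]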

Benn: Honestly speaking, the amount of credentials, it’s easy to find. Or if you don’t feel like finding them, if you’ve been on the dark web long enough, you could buy them. 

Beau: Now we say make yourself as hard to hit, hard to spy on, hard to collect data on as possible. But what does that look like now that there are surveillance cameras everywhere? I mean, in 2009, Google’s CEO, Eric Schmidt, said…

Eric Schmidt Clip: If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.

Beau: Wait, like, like British street artist Banksy, for example? ’Cause, like, Banksy wouldn’t be Banksy if you could see Banksy doing what Banksy does. Now, in 2010, Banksy actually did make a new piece featuring a pink TV, reversing Andy Warhol’s quip on 15 minutes of fame. It said in stencil letters, “In the future, everyone will be anonymous for 15 minutes.” Yeah, that’s not happening anymore. Welcome to the first installment of a two-parter on surveillance. Today we’re gonna talk to Jason Koebler of 404 Media; musician, composer, and YouTuber Benn Jordan; and Jay Stanley from the ACLU about the new surveillance. I’m Beau Friedlander, and this is What the Hack, the show that asks, in a world where your data is everywhere, how do you stay safe online? Now, if you don’t know 404 Media, fix that right now. It’s some of the best independent tech journalism out there, covering surveillance, AI, and the stories legacy outlets won’t touch. You may have missed their Super Bowl ad.

Jason Koebler: Yeah, we bought a regional local ad. Yeah.

Beau: Jason Koebler, co-founder of 404 Media and former editor-in-chief of Vice’s Motherboard. To be honest, I don’t think that I would wear a hoodie from any other media outlet, but I’d totally like one of those 404 Media hoodies, just saying.

Jason Koebler: Thank you. That means a lot, trying to become a fashion brand on the side.

Beau: So about that Super Bowl ad. 

404 Super Bowl Ad: In a world increasingly controlled by big tech, AI slop, and social media algorithms controlling what you see, 404 Media focuses on something else: real information you can actually use.

Jason Koebler: We did have a Super Bowl ad. It was very much a stunt. I was inspired by the Verge doing something similar like 10 years ago. And basically like local TV stations that air the Super Bowl are allowed to sell their own ads that run only on their channel. And so we found the smallest media market in the United States, which was in this town called Ottumwa, Iowa. And we put the ad on YouTube and wrote an article about it and the process and all that. So, I mean, a lot of people did end up seeing it and seemed to like the idea behind it, but this was not like a major super, it wasn’t like, it was a stunt more than anything, I guess I’d say.

Beau: The reason I asked you about your Super Bowl ad was that there’s another Super Bowl ad, for the Ring cameras, that freaked a lot of people out. It was about their Search Party feature.

Ring Super Bowl Ad: Pets are family, but every year, 10 million go missing and the way we look for them hasn’t changed in years.

Jason Koebler: Yeah, so Search Party is a feature that Ring launched back at the end of September, I believe. It is a feature that uses AI to link Ring cameras in a neighborhood or a town all together to look for lost dogs.

Ring Super Bowl Ad: One post of a dog’s photo in the Ring app starts outdoor cameras looking for a match. Search party from Ring uses AI to help families find lost dogs.

Jason Koebler: And so basically, if you lose your dog, you can take a picture of the dog, you can upload it to Ring’s website, and then all of these cameras will be, like, automatically networked together and use AI to look for the dog. And when this was launched in September, it was somewhat controversial but didn’t get that much attention. But then they did this huge Super Bowl ad. It had CEO Jamie Siminoff sort of talking about it. And, you know, they used it to find this dog in the ad. And I think one of the images that they used was a visual of, like, all the cameras kind of searching together, networking together, with these blue overlays, in a kind of scary-looking way. And I don’t know, I feel like there was just a massive backlash to this.

Beau: When I saw those cameras, the first thought I had was, oh God, what if that was Ahmaud Arbery? And, and I mean honestly, the very first thought I had was, that’s not what it’s gonna be used for. So there was this backlash. And I guess the reason I thought that is because I know Ring has a relationship with law enforcement.

Jason Koebler: Yeah, it’s really interesting, and I mean, it’s a little bit of a complicated backstory, but this guy, Jamie Siminoff, the founder of Ring, he launched it on Shark Tank.

Shark Tank: First into the tank is a product to ensure you always know who’s at your front door.

Jason Koebler: He pitched it on Shark Tank; that was the origin story of Ring.

Shark Tank: My name is Jamie Siminoff. I’m from Los Angeles, California. My product is the DoorBot. I’m seeking $700,000 for a 10% stake in the company.

Jason Koebler: And famously it didn’t get funded.

Shark Tank: Okay, Jamie. It’s that moment when I say you’re dead to me because you don’t want to take my offer. I made you a very valid offer I think under the circumstances. Respectfully, Mr. Wonderful, we’re gonna decline.

Jason Koebler: But basically after that, the way that Ring became popular was they entered into these partnerships with police all over the country and they went basically town by town and they incentivized police to pitch these doorbell cameras to people who lived in their towns. So they would give police free cameras and say like, give these away in sweepstakes or at public meetings. They gave them like discount codes. And in return, as part of this partnership, Ring basically offered police the ability to request footage from anyone in that town. And this was like 2017, 2018, and we tracked these partnerships at the time, back in my old job, and it went from, oh, there’s like a few dozen places doing this to like thousands, within a year or so.

Beau: Thousands of police-

Jason Koebler: Thousands of police departments. So just basically like thousands of towns like signed up for this and were kind of doing like Ring’s marketing for them. And Ring became like very popular during this time. This did spark like some sort of backlash, like the Electronic Frontier Foundation, the ACLU. There was like tons of negative reporting about Ring and about how they were kind of building this like consumerist surveillance state more or less. And there were kind of a lot of high profile cases where, like, paired with Ring at the time, and they still have this, it’s an app called Neighbors and it is basically like a place that Ring camera owners can upload footage and then other people can comment on it. What ended up happening was like, anytime a quote unquote “suspicious person” would walk by a Ring camera, they would be recorded and this footage would be uploaded and then people in the town would be like, what’s this person doing here? And very often that would be a black person or a Latino person or like a delivery driver or someone doing their job. It became this kind of like mini police state I would say. There was a big backlash to this eventually. And Ring sort of like took a step back from this at some point, like 2022, 2023, and they canceled the program where they will allow people to give footage to police without a warrant. I don’t know how impactful it was that these partnerships were canceled. But at that same time, Jamie Siminoff, the founder, left Ring. He, he left the company entirely and then he came back last year-

Beau: Yeah, 2024. He leaves, comes back in April of ’25. It’s now owned by Amazon.

Jason Koebler: Yeah, he was basically like, “Daddy’s home. We’re going back to our mission. We’re pro-cop, we’re anti-crime. We’re doing the partnerships again.” And critically, he was like, we’re actually gonna add AI to all of this. Like, we’re gonna try to automate the entire process. And that is sort of where things stand now. This is explicitly pro-police surveillance technology, and they are trying to figure out how to implement AI and network the cameras together.

Beau: That’s the shift. We’re not talking about a doorbell camera anymore. We’re talking about a privately owned, AI-assisted surveillance network with the potential to scale nationally. And it’s run by a founder who just came back with a very particular political vision and the resources to do for law enforcement what, at least in his opinion and the opinion of people like him, law enforcement has been unable to do for itself. That sounds like hubris to me. But anyway, the real danger here is that it isn’t neutral technology. It’s a powerful tool in the hands of people with a very specific idea about how it should be used.

Beau: Here’s Jay Stanley, senior policy analyst with the ACLU’s Speech, Privacy, and Technology Project. Jay Stanley, thank you so much for joining us. My first question is, why is the ACLU interested in surveillance?

Jay: At the end of the day, surveillance is a question of power and freedom, and when people surveil you, they have power over you.

Beau: So what about Ring’s Super Bowl ad?

Jay: Yeah, I mean, I think that the Super Bowl ad was a wake-up call for a lot of people about just how powerful centralized cloud video services are. This is not the old-fashioned, way-back-when camera from, like, the Obama administration. Now, because of AI and because of cloud centralization, any camera that’s tied into that system is much more powerful than it used to be, including Ring cameras. If you have a pool of 10,000 hours of video, it used to take 10,000, or maybe 5,000 or 2,000, man-hours to search that. But now, just like large text corpuses can be keyword-searched, you know, you can ask an AI, find me somebody in a red sweatshirt who’s carrying a briefcase, and it will find those people for you. And that’s what the dog search technology was based on. But of course, everybody intuitively knows that it’s spooky, because in five minutes, and in fact Ring even said this in internal memos, they’re gonna use it for other things. They said, we’re gonna use this to end crime in America. And you know, the amount of surveillance that would be required to quote unquote “end crime” is frightening to most people, and rightly so.

Beau: What is the danger of a private sector, you know, a business having access to this much information about this many people?

Jay: Yeah, I mean, this is mass surveillance, which is the biggest, most concerning form of surveillance because it collects data on everybody all the time without individualized suspicion of wrongdoing, which is the standard that the government normally needs to meet to invade your privacy.

Beau: Siminoff said, I think it was recently, you know, you can now see a future where we are able to, he’s talking about Search Party, we’re able to zero out crime in neighborhoods. Zero out crime in neighborhoods. This is not law enforcement. This is a company selling cameras and surveillance equipment saying they’re gonna zero out crime. Crime. Who’s saying what the crime is? ’Cause my concern is, you go back to this thing where these viral videos have people just deciding someone’s committed a crime, and the whole neighborhood is participating in prosecuting it.

Jason Koebler: Yeah, I find this language to be actually very important to talk about, so I’m glad you brought it up. Which, yeah, one, it’s like what types of things are being criminalized? It’s like, by and large, like why this technology was created was for package theft, which costs Amazon, I don’t know, probably billions of dollars a year. But also, what does zero out crime mean? Because it’s not gonna zero out tax fraud. It’s not gonna zero out domestic violence. It’s not gonna zero out –

Beau: Insider trading.

Jason Koebler: Yeah. All of these sorts of things it’s not going to zero out, but like the goal is to eliminate crime that homeowners believe is like possibly a nuisance or things that they don’t want to see in their neighborhoods.

Beau: Yeah. That keeps them so that they don’t have to pull out their little dumb guns.

Jason Koebler: Exactly. Exactly. And like how, how that would even work is very unclear because at least where I live, it’s like the police don’t really have time to be dealing with some of these lower-level crimes. It’s like, I don’t know, you call 911 because your window has been smashed. And they’ll be like, we’re not coming. Like, we’re too busy. We’re gonna come some other time. We’re dealing with like murders and shit. Yeah.

Beau: Some of you listening might still be wondering what the problem is. Sure, it’s not perfect, but surely stopping more crime is better than stopping less crime. I mean, we have to do something about the Bernie Madoffs of the world. But this isn’t about Bernie Madoff. This is about zeroing out, in quotation marks, a “certain kind of crime”: street-level, visible, the kind that gets you on the local news. Not fraud, not corruption, not the crimes that never get caught on camera because they happen behind closed doors. Who decides what counts? People with no constitutional guardrails, and the answer shifts with whoever’s in power. Right now the prevailing winds favor installing powerful new surveillance tools, and once they’re in place, that’s it. You better get used to them. Forget porch pirates and lost dogs. Nobody’s gonna stop and ask, “Should we be doing this?” once it’s all in place.

Benn: The common argument is like, I don’t have anything to hide.

Beau: That’s Benn Jordan. He’s a genius. Not to put too fine a point on it. 

Benn: And people have said that in person to me, and I’ve asked them, well, unlock your phone and give it to me. I’m gonna go in the other room for a bit. And, well, no, I’m not gonna. Okay, so you do have something to hide. What are you saying? Do you have, like, child abuse material on your phone? What are you saying? Because this is the ridiculous argument that we’re having when somebody wants privacy.

Beau: You forgot to say, I promise not to take any money.

Benn: Yeah. Right. Yeah. I promise to not take any money away.

Beau: Benn Jordan. He records his music as The Flashbulb, among other aliases, but you may also know him from YouTube, where he does deep dives on technology, privacy, and surveillance, and amazing experiments like converting an image into a sound wave and then training his rescue starling, yes, the rescue bird, to memorize and sing the sound wave back. 176 kilobytes of retrievable data stored in a bird’s brain.

Benn: But the reality is, you have a ton of things to hide, like where your belongings are in your house. You’re hiding those. You generally don’t want the general public to know, you know, where your secret key is under the rock, or your passwords, or your…

Beau: Or that you were against the war in Iraq, or whatever.

Benn: Yeah. Yeah. I mean, generally that “I have nothing to hide” is such a privileged statement, from someone who’s never had their identity stolen, somebody who’s never been stalked, somebody who’s never been sued, who’s never had to deal with discovery. Like, there’s so many different scenarios where something that you thought was private ends up getting into the wrong hands and used against you. ’Cause that’s just the reality we live in now. And so whenever somebody says that, I just want them to think long and hard about all of the things that they very much have to hide and that they might not even realize. Like, you might be talking about having a chronic pain issue or something like that on social media. Guess what? That ends up being put into open-source intelligence, which is accessed by insurance brokers. That ends up costing you down the line.

Beau: So Benn reached out to Jason and 404 Media to tell them what he had learned about some of these surveillance cameras, not Ring, but Flock Safety, which makes an array of different products. Some of them are just license plate readers and some of them go way beyond that into AI-assisted stuff. That is why I reached out to him. But to be honest, I also wanted to know about his farm situation. There seems to be an animal in his lap in every single YouTube video he posts.

Benn: I have like 24 chickens I believe. The numbers go up and down as like, you know, there’s too many roosters and then there’ll be a hawk attack and then, you know, things like that.

Beau: I haven’t gotten chickens for that reason, because there are some coyotes that live just over there, and I feel like I just don’t want to be in a Hanna-Barbera cartoon for the rest of my life.

Benn: It’s funny how often like the two worlds of hacking or surveillance and chicken protection blend for me.

Beau: So where are we in this privacy nightmare? Is it over? Are we post-privacy or are we just in the throes of pre-digital privacy? Just haven’t figured it out yet?

Benn: I mean, everything gets to a breaking point, right? Like, everything eventually gets there. And I mean, what would happen if I used my platform to popularize a browser plugin where, for example, we had 10 people and we all swapped cookies? Like, data-tracking cookies. So let’s say, you know, a woman named Maggie was another user. I don’t know who she is, but some of her details are coming over to my browser, and some of mine are going there, and now she’s going on Instagram and she’s gonna start getting ads for things that are targeted towards me. And then all of a sudden, ads on Instagram aren’t gonna work as well. And now companies are gonna be mad, because their conversion rates are lowering and their return on investment is going way down, because people’s data is all jumbled up. This is just off the top of my head. But I think that in a lot of cases, that’s what generally seems to happen in things like this. There’s always sort of a middle ground. Because otherwise, we would have cameras installed in our house, you know, from whoever, Google, Twitter, you know, X. Like, it would be a lot worse.
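[Editor’s note: Benn’s thought experiment, pooling tracking data across users so ad-targeting profiles get jumbled, can be sketched in a few lines of Python. This is purely hypothetical; the profiles and the swap rule are invented for illustration:]

```python
def swap_half(profile_a, profile_b):
    """Exchange the first half of two users' tracked interest lists in place."""
    k = len(profile_a) // 2
    profile_a[:k], profile_b[:k] = profile_b[:k], profile_a[:k]

# Hypothetical tracking profiles built up by ad networks.
benn = ["synthesizers", "chickens", "soldering", "guitars"]
maggie = ["yoga", "gardening", "cookware", "running"]

swap_half(benn, maggie)

# Each profile now half-describes the other person, so ads targeted
# from either one miss roughly half the time.
print(benn)    # → ['yoga', 'gardening', 'soldering', 'guitars']
print(maggie)  # → ['synthesizers', 'chickens', 'cookware', 'running']
```

[The point of the sketch is the economics, not the mechanism: jumbled profiles lower conversion rates, which is the pressure point Benn is describing.]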

Beau: Well, we do already have cameras from Amazon in our homes, and microphones.

Benn: Yeah, that is true.

Beau: So we actually talked about this with Al Franken.

Al Franken Clip: Our witnesses are Alexa, Amazon. Is that right?

Al Franken Clip: Just Alexa.

Al Franken Clip: Siri, as I understand it.

Al Franken Clip: How can I help you?

Al Franken Clip: Well, for now, be quiet. Okay? Alexa, you too, Siri.

Beau: It’s just another way we were frogs, you know, slowly getting boiled. If you wanna hear that whole episode, which is awesome, and a throwback to when there was a three-host roster here on this show, we’re gonna drop it on Thursday as a bonus episode. You know the old phrase: money doesn’t know where it came from. Well, for a while, data didn’t know what it was good for. Websites were like, do you mind if we gather this or that? You know, stuff about you? And people were like, yeah, yeah, yeah, sure. Whatever. I just wanna use the service. People posted, did their thing, and all the data that they generated didn’t really mean anything yet. Well, to them. Think of a kitchen that looks like there’s nothing to eat in it, right? Maybe there’s some spices or whatever. There’s some oil, a few vegetables, stuff in the pantry, but nothing really good. Now, a competent chef walking into that situation is gonna be like, are you kidding me? I can make all kinds of things for you. What do you want? That’s what big data did with our information, with all the stuff scraped from social media and all of the information that we plugged into sites thinking it didn’t matter. Well, it did matter. Now that situation is in the physical world, and it’s not just on your smartphone; it’s Meta glasses, basically a face computer that can take pictures and tell you what you’re looking at. It’s AI-assisted. So sure, we all had that collective cringe moment with the Super Bowl, with its Search Party ad, because it wasn’t about dogs. If you’re like me, you’re like, whew. You know what else that could be used for? Yeah. All kinds of things.

Benn: I’m so glad that like the general public took that away from that ad.

Beau: Here’s Benn Jordan again.

Benn: I’m really proud of society or something like that. Because I feel like it had you shown me that ad before it ran, I would’ve been like, oh man. Like, people are just gonna be like, no, not my dog going missing. That’s my worst nightmare. And then just sort of, you know, appeal to it. But yeah, it seemed like generally everybody was made uncomfortable or, you know, the vast majority of people were made uncomfortable by that ad and started asking more questions like, what are these capable of?

Beau: Now, a surveillance camera is useless if there’s not a person on the other end looking. A mic, same thing. A tap, a wiretap, it doesn’t work if someone’s not on the other end recording it or listening. We’re now in an age where all of that can be automated. And not just automated, but the understanding of what is being collected can also be automated, and it can be looked at with intelligence that is agentic, that is not human, right? That is powered by artificial intelligence, and therefore it can be done very fast. And that is a game changer. So what we’re really interested in is how surveillance works online, and how it jumps from being just a part of the marketing machine of the internet back to good old-fashioned, or good old-scary, Big Brother-style surveillance. Because it wasn’t that for a while. When Meta was doing it, it wasn’t about Big Brother. It was about big sales. And now it’s gone from, oh, we can use all this data to sell people stuff, we can use all this data to figure out what you want, to, we can use all this data to control people. Which was what Cambridge Analytica was about, right? For example. And now we can use all this information to control people’s behavior. How? By putting cameras everywhere. Benn, you made a series of videos about Flock Safety cameras. Flock sells a range of public safety cameras, right? Their basic license plate readers aren’t AI cameras, while some of their other models include AI-powered analytics. There’s some that can hear somebody, you know, screaming, gunshots, all kinds of stuff. So we’re gonna get more into the specifics next week, but in your research and reporting with 404 Media, in one of your videos, there’s a part that just blew me away. And when you started to say that’s where you were also brought to tears, I was like, oh, come on, shut up.
But yeah, when I see this dude walk into a playground, and there’s a camera trained on a playground, already super problematic, because these cameras are hackable.

Benn: Yeah, like why?

Beau: And an older, a person who is not a child walks onto the playground and gets on a swing and starts swinging, and you ask the question, would he or most adults feel free enough to do that if they knew there was a camera taping them?

Benn: And that kind of opens up this whole thing. ‘Cause I mean that, first of all, that’s something that I do. I mean, everybody likes going on a swing set, but…

Beau: I love a good swing.

Benn: An adult man walking onto a swing set at a playground by himself when other people are around is creepy and unacceptable socially. So you don’t do it when you’re being watched.

Beau: Unless you happen to be there with someone small, or you’re really..

Benn: If you have kids. Yeah. Either I have the option of doing it…to be socially acceptable, I have the option of doing it when nobody’s around and nobody’s looking. Or I could, I guess I could like get a fake child and you know, like a little doll and just bring it.

Beau: Borrow a friend’s kid. yeah.

Benn: Dress up one of my dogs.

Beau: “I’ll be right back. Just come here.”

Benn: Yeah. And so, you know, it’s reasonable to assume that had this person known that that camera was watching him, and that I could watch him swing 30 days later, 31 days later, they would’ve said, hmm, no, I’m not gonna do this. I’m not gonna be seen swinging in a playground. But this is also, like, we do a lot of stupid things when we’re not being watched. We sing, we practice accents, we try cartwheels. But also a lot of not-stupid things: we learn to play the guitar, we solder for the first time, we code for the first time. There’s so many things that we do that we need privacy for. Like, we need privacy to explore ourselves and to really find the weird, quirky parts of our personality and refine them before showing them to other people. And I mean, humans are supposedly a social species, and we need to feel accepted by other people. And so we need to try these things out with privacy, and cameras rob us of that. And I think that that’s something that’s, like, studied in the corporate world, where it’s like, okay, so if you have a Little Caesars, is it better to put five surveillance cameras on the area where they’re putting the pasta sauce on the dough? Or is it better to just leave them alone and let ’em do it? And it turns out, in a lot of research, and I mean, I don’t have direct citations for this, but there’s plenty of it, that if you just let ’em do it themselves, they’ll figure out their own way of doing it, and they’ll actually provide, not to get too Marxist here, organic labor, labor that is something they have worked on themselves to refine, that works for them, that can possibly even be learned from. And then there’s, you know, there’s, like, the concrete labor, where you’re just telling somebody instructions like they’re a computer or something, and they just do it, and they hate their life.
And so, I mean, when you have a surveillance camera on somebody at work, they are generally going to try to appear to work hard all the time rather than trying new things and improving. Honestly, if you were to secretly record me working on a music session, and then watch my stream every Thursday where I check out music software and make music, I swear, I’ve been doing it for years, I’ve got hundreds of streams built up at this point, and I don’t think I’ve made one piece of music on that stream that I’m happy with. Ever. Like, not even starting a project. It’s always been garbage. And it’s because I’m being watched. It’s because, like, every single click that I make, you know, could be scrutinized. And even though I have, like, the kindest community, my streaming channel is almost hidden. Like, it’s hard to find, on purpose.

Beau: Benn, can you just tell me, like, pretend I don’t know anything. What’s the Hawthorne Effect?

Benn: Uh, I mean, the Hawthorne Effect, simply stated, is that people behave differently when they’re being observed, or when they’re being surveilled, I guess. The initial Hawthorne Effect experiments were pretty dodgy, not really great research, but that’s sort of what intrigued people initially. But it is a real thing: if I have you order a piece of furniture from Amazon and assemble it by yourself, you’re going to do it differently if I put a camera over you and you know that I’m watching you assemble it, even if I’m not grading you, even if I’m not going to hire you to build more things for me. Just knowing that you’re being watched takes a certain amount of mental energy. I think the crazier research papers, the more recent ones that I’ve read, like the one that absolutely blew my mind, found that in areas with high levels of surveillance, people were less likely to recognize faces, which is like, there’s gotta be a weird, you know. But when you think about it, it’s like, yeah, because when you have a bunch of surveillance cameras everywhere, you feel like you’re in a hostile environment. And when you’re in a hostile environment, you’re not gonna be as likely to remember faces, because they’re not friends, they’re foes. You’re in a place where you feel like you’re gonna have more foes than friends, which, you know, goes back to our visceral psyche, where we have, like, friend, foe, things like that, and social identity theory and stuff. So it actually makes a lot of sense. Like, yeah, the brain’s not gonna waste energy remembering faces if it feels like it’s not with its kin or something.

Beau: We’re living in a world where almost everything you do can be observed, recorded, uploaded to the cloud, analyzed by AI, and routed wherever someone decides it should go, for reasons good or bad, and usually reasons you don’t know anything about. Are you still comfortable walking into your local library and checking out a book? Or do you read it in the stacks instead because you don’t want a record of what you’re reading sent to who knows where? Are you sure there aren’t cameras in the library trained on the stacks? Like, they can zoom in. Are you even thinking about it?

Jay: We shouldn’t have to live that way. It has the potential to inhibit us in ways that we don’t even know.

Beau: Here’s the ACLU’s Jay Stanley again.

Jay: Showing up to protest, voting perhaps, assembling with other people for various reasons. And we think that people should have the right to exercise their first amendment rights to self-expression, to assembly without feeling chilled, without feeling they’re being watched all the time. You know, you may be the most sterling, free, pure as snow person in the world, but if you’re driving along the highway and a police cruiser pulls in behind you, most people get kind of nervous. They don’t like the feeling of having police right behind them. And I think that same feeling could be created by drones overhead, by license scanners, by constant video surveillance. And we shouldn’t have to live that way. Nobody wants to be watched.

Beau: So how do these things slide into our lives and become permanent? Because we’re gonna opt in. We’re not even gonna notice we’re doing it. We’ll just hit a button because we’re not paying attention. Something pops up, you know, you got a camera, I don’t know, maybe some coyotes slinking around your barn if you have a barn, maybe somebody keeps stealing packages from your front porch. Whatever it is, you’re not gonna be paying attention, because all this stuff just happens at the speed of life, and they’re working at the speed of what’s legal to drive adoption on the other side of that equation. And that’s where I think we’re gonna get into trouble.

Benn: So a lot of people have gotten this. 

Beau: Benn Jordan again.

Benn: I have, because I left Ring installed on one of my tablets, a tablet I used for telescopes. And I opened it up and I had a Ring notification, this was like five days ago, with a picture of a dog that looked strikingly like one of my dogs, saying, we found this lost dog on your property, can you enable this feature to see it. And to see the full picture I would’ve needed to agree to terms and conditions and update and blah, blah, blah. But just looking at the thumbnail, I realized it wasn’t my property and it wasn’t any property around me. It just didn’t look like it was even in the region where I’m at. And it definitely wasn’t my dog. And yeah, nothing illegal is going on, but they’re trying to trick you into agreeing to terms of service.

Beau: Was it a Ring email or was it a hacker?

Benn: No, it was Ring. It was definitely Ring.

Beau: It was.

Benn: Yeah, a lot of people have gotten it too. And then people click it, and then it says, oh, you gotta agree to these new terms. And they say, yeah, yeah, yeah, whatever. Okay, I gotta see if my dog’s missing. You know, I gotta see if this dog’s on my property, what’s going on here? And it’s complete bullshit. It’s just made up. It’s just an ad to open the app and agree to new terms of service and then agree to allow the dog tracking, whatever it is that they’re trying to do. But I mean, that’s a great example of, are they your friend or are they your foe? Because they’re literally trying to trick you into agreeing to something that they know you’re not gonna read so they can advance the platform to get more of your data, which makes them more money.

Beau: So while we slide into becoming used to this technology being everywhere, there is another problem too. We’re sliding away from the rule of law as determined by our peers and elected representatives. Private companies are beginning to be the arbiters of what’s what in the uber weird-scape of for-profit law enforcement.

Jason Koebler: A lot of these companies, Ring included, are hiding behind the idea that we live in a democracy and that we have laws. And so what they’re saying is like, well, we’re just following the law.

Beau: Here’s Jason Koebler again.

Jason Koebler: We’re providing this technology to police, or, you know, in Ring’s case, they’re selling the technology to consumers, but then it’s going to police. And they’re saying, well, it’s up to the police basically to use this stuff lawfully. But we, you know, we respect the police and we want to help them. And that’s all fair and good, kind of. But they have built this incredibly powerful technology that police buy access to. And it’s unclear whether, if the government were to build systems like this, they would be lawful. But because they’re buying access to something that someone else is doing, it sort of sidesteps the law.

Jay: Yeah, I mean, we’re seeing private companies being put in the middle of law enforcement in a way that’s never happened before in history.

Beau: Jay Stanley. 

Jay: A police department in 1970, you know, probably bought their flashlights from a private company, and this and that. But now we have companies like Flock and Axon saying that they wanna provide the quote unquote “operating system” of police departments. We have companies like Palantir working very, very closely with federal agencies, as well as other companies like Axon, and there is a danger that it’s an end run around the Constitution, and not only the Constitution, but other checks and balances, like FOIA and open records laws and the Privacy Act, which don’t apply to private companies. You can’t FOIA Flock or Axon to find out what their internal memos and policies are. And there’s a lot of very personal information about you that law enforcement could never get with a warrant, because they don’t have a suspicion of you. So since they can’t get it with a warrant, they just go buy it from data brokers. And the data brokers are basically doing what the East German Stasi used to do, which is compile as much information in dossiers about as many people as possible. They’re doing it for commercial, not political reasons, but they’re sitting there, and the government goes to them and does a complete end run around the Constitution. There is a proposed law called the Fourth Amendment Is Not for Sale Act that would ban law enforcement agencies from doing that. It actually passed the House of Representatives with bipartisan support. It died in the Senate, but we and others are still pushing for it, and that’s part of a solution here.

Jason Koebler: And then the other thing is that our privacy laws are so outdated. They’re just really not set up to grapple with what is happening. And to be specific about that, you know, you can take pictures of anyone you want if you’re in public. And you’re allowed to put cameras on your property and face them out into public and take pictures. I’m not disputing that. I think it’s a core part of the First Amendment, and journalists use this all the time; otherwise, a lot of what we do would be criminalized. But what’s happened now is that all of these cameras have been networked together and added to a database. So it’s not just that they’re taking a picture or a video of a moment in time. They’re constantly filming, the footage is going into a database, it’s being analyzed, and it’s being networked together. Very complex maps of your movements and your activity are being built, and that was unimaginable when these privacy laws were written. And I feel we haven’t really grappled with whether they’re still adequate these days.

Beau: Well, and whether people are aware of it, because most people aren’t reading the privacy policies or user agreements for the products they’re using, aware that they may be involved in the hunting down of a person. ‘Cause another name for a search party is a posse. And it feels that way sometimes, like people are being unwittingly dragged into these posses of surveillance, and we don’t have privacy laws that would protect us from it. GDPR would protect against this somewhat, wouldn’t it?

Jason Koebler: Yeah, I think that it would. I think GDPR is a start. There have been some state laws that have really limited what types of crimes can be investigated using some of these technologies. Illinois, for example, has a law on the books that says license plate readers can’t be used to enforce immigration law. There’s a bill being considered in Colorado right now that would prevent them from being used for abortion enforcement. And I think these are good starts. But the other thing you mentioned is that consumers don’t read the terms of service, and that is objectively true. There have been studies showing consumers don’t read them. But why would you, first of all? And second of all, you can’t alter these terms of service. You either agree to them or you don’t get to use or own the product. And it’s not just consumers who aren’t reading these things. Our reporting has shown that police and towns are buying this technology and then violating the laws of their own state, in the case of Illinois, the one I just mentioned, because their license plate readers are or were being used to do immigration enforcement. And they don’t even know their cameras are being used for this purpose, because they’re just not sophisticated about how this technology works, what the terms actually say, what the law actually is. We exposed dozens of police departments in Illinois violating the state law. And what happened? A slap on the wrist. Nothing really happened because of it.

Beau: Look, I don’t know what the constellations of this new world of surveillance for profit are going to look like, but we know it’s going to evolve. It’s just starting, and it’s a really important thing to pay attention to and learn about. And if you listened to this episode and learned something, pass it on, ‘cause it’s super important. And come back next week, because part two is gonna focus on Flock Safety in particular. We’re doing a deep dive.

Now it’s time for the Tinfoil Swan, your paranoid takeaway to keep you safe on and offline. Your home assistant is listening right now. That’s not a bug, that’s a feature. It has to listen locally, on the device, to catch its wake word: Siri or Alexa or whatever. That part stays on the device. The problem is false triggers, and they happen all the time. Your TV says something that sounds like Alexa, your kid says something that sounds like “Hey, Google,” or they say “seriously?”, and suddenly audio you never meant to send anywhere is going to the cloud. And once it’s there, by default, you’ve agreed somewhere in the fine print, at least with Google, that a sample of those recordings can be reviewed by contractors to improve the product. Alexa quietly removed the opt-out for that in 2025. Draw your own conclusions.

Here’s what you can do. Stop them from saving your voice recordings. Opt out of product improvement programs where they exist, and that includes LLMs if you’re using them. And audit what third-party apps have access to on your assistant. The full step-by-step Alexa, Siri, and Google guide to opting out of these things is on our website at joindeleteme.com/podcast. Find this episode. It’s all there, thanks to Sarah Huard. Twenty minutes, and your home assistant goes from surveillance device to a dumb speaker that sets timers for you and stuff.

This episode of What the Hack was produced by me and Andrew Steven, who also did the editing. Our theme music is by Andrew Steven. If you think you heard Benn Jordan’s music in the mix, you’re right. There’s some other stuff, but there’s some Benn Jordan too. Check him out on Bandcamp or wherever you get your music. What the Hack is a production of DeleteMe, which was picked by the New York Times’ Wirecutter as the number one personal information removal service. You should be using it already. If you’re not and you want to, well, you can. Here’s what to do. Go to joindeleteme.com/wth and get 20% off. I kid you not: 20% off. That’s joindeleteme.com/wth. Now, stay safe out there. See you around, and come back next week. We’re gonna be talking about Flock Safety.
