Episode 222: Heather Barnhart Hunts Digital Ghosts
This week on “What the Hack?” forensic expert Heather Barnhart (SANS Institute) decoded Osama Bin Laden’s phones and helped convict quadruple murderer Bryan Kohberger. Learn how surviving her own harassment fueled her mission, plus the digital “pattern of life” hack and simple steps to combat AI deepfakes and stay safe online.
Episode 222
Cold Open
Beau: It is Cybersecurity Awareness Month, so we’re gonna be talking to a forensics expert.
Heather: That’s my award right there. It says “the elimination of Osama Bin Laden.”
Beau: From chasing global threats to trying to deal with personal ones. Heather Barnhart from the SANS Institute has, uh, a great story, and we’re really lucky to have her this week.
Heather: People don’t have to do a job or a forensic job to not fall victim to these threats that exist out there in the wild.
Beau: I’m Beau Friedlander, and this is What the Hack, the show that asks: in a world where your data is everywhere, how do you stay safe online?
Start
Beau: Heather is the Digital Forensics and Incident Response Curriculum Lead and Head of Faculty at the SANS Institute. And if you don’t know what the SANS Institute is, or if you don’t know what it stands for, you’re not alone.
Beau: It stands for SysAdmin, Audit, Network, and Security. Uh, what does that mean? It means that they’re in the business of keeping people safe online, basically. She’s recognized for smartphone forensics for the most part, but also Windows forensics, and is just overall a person who can find things out in ways that would surprise you, which is what we’re gonna talk about today.
Beau: Heather, welcome to the show.
Heather: Thank you for having me.
Beau: Oh, I am beside myself, giddy, ready to get into this, because not everybody gets as geeky as I do about this stuff, but I really do care and I’m fanning out a little bit right now. So thanks for joining. To start, I would love if you could share the story of the anonymous online harassment you experienced early in your career and how that shifted your perception of personal safety and maybe also digital privacy.
Heather: When Unknown Dialer came out, I believe that’s the name of that Netflix documentary, correct? Where the mom was catfishing her own child and stalking her. It reminded me completely of that situation. I had, I think I was 27 years old, and I was new to teaching with SANS. And when you’re teaching, you provide a lot of test data.
Heather: A lot of it. So right now on my desk, I have Androids, iPhones, you name it, and I’m constantly creating data sets with them. You don’t think about your own personal information ending up in these data sets. So what happened: I was speaking at conferences, I was doing that circuit heavily, I was doing my SANS course, so my phone number was out there everywhere.
Heather: So how this person got my number, I don’t know. But it went through the Nimbus app, which is one of those apps where you can, in 30 seconds, become someone else with some fake phone number. There’s no verification or validation.
Beau: So Nimbus, uh, that’s some sort of Wayback Machine stuff. It’s a relic of the messaging app world. Telegram, Signal, WhatsApp are probably the ones you’ll recognize now, and those are the ones that work best now. But back in the day, Nimbus, that was the whizzbang that got you there, uh, quietly and hard to track.
Heather: The text messages started with weird comments about my feet. Now, here is a special fact for you.
Heather: I am BEEP and I wear like a size BEEP shoe, so it’s very weird. It looks like I walk on little pegs. I have really tiny feet.
Beau: I don’t know. I have, I’m the...
Heather: Small. My...
Beau: I’m the opposite, Heather. I’m BEEP and I wear a size BEEP shoe. I should be really tall. Yeah.
Heather: My eight-year-old’s foot is almost the size of mine, so it’s just really odd. It looks like I should tip over.
Beau: If you heard those bleeps, that’s because that’s personal data. You don’t need to know our shoe sizes. Wherever we can, What the Hack doesn’t share that kind of information.
Heather: So I thought it was a joke. My male coworkers always joked with me: your feet are so small, you’re weird.
Beau: So you started to get texts about your feet, and at first you thought it was possibly someone who knew you, because you have, you think, in your opinion, unusual feet, and maybe in the opinion of your colleagues, which maybe is something you should talk to HR about.
Beau: But at any rate, we have this situation that, you know, I’m making a joke about, but it’s very serious. Somebody has started to reach out in a way that doesn’t feel good. Did the texts stay just weird, or did they go dark?
Heather: They went dark fairly quickly. I remember at one point they asked for a picture of my feet, and I took a picture of my male coworker’s feet, and the individual lost their mind in a text.
Beau: How did they lose their mind? Like, several texts, or like...
Heather: Several texts in a row with profanities, threatening me if I did that again. That they knew where I was. They even mentioned the weather where I lived once.
Beau: And so, and was it correct?
Heather: Yes,
Beau: Okay. And was the weather, not just regular weather? I mean, the
Heather: No, it was... I remember it specifically said something about, it’s chilly today in Washington, D.C., hope you’re keeping those feet warm. It was weird. It was always about the feet. But it was...
Beau: so, so you’re, you’re, you’re now dealing with someone, like at that point you were aware of the fact that you were dealing with a person who was unwell.
Heather: Exactly, and knew where I was.
Beau: Yep. And so what was the next thing that happened?
Heather: I reached out to Nimbus, um, once I got to the point where the messaging went even darker and said that they were going to wash and worship my feet, whether they were attached to my body or not.
Beau: Okay, I’ll stop right there. I’m now in The Silence of the Lambs and I am completely finished.
Heather: And that’s where it’s not a joke. Like, it’s not funny anymore. It’s not male coworkers... we used to prank each other all the time. It’s not a prank now, it’s a really creepy thing. And I reached out, and they wouldn’t help me. They said, you just have to block that number. They would not tell me where this person registered from, their name, their real phone number.
Heather: None of that was provided to me. It was just block them and good luck.
Beau: All right. Now, from where I’m standing, that is not a great answer. Um, what do you know about Nimbus? That doesn’t seem like a great practice for a company to have, because there are different kinds of exposure, and when you’re talking about somebody who works at a government level, it actually should carry some more weight, and it didn’t.
Beau: Did you dig in to find out more about Nimbus, or did you just leave it there?
Heather: I worked Nimbus as a forensic examiner just to see what could possibly be there, but without someone else’s device, you can’t really prove anything from my device. You could see the messages, great, but there was no way to trace that back without server data and cloud logs. But also, go back in time: this was 2007 or 2008 when this was happening.
Heather: So if this happened today, very similar to that Netflix documentary with Pinger, you can trace it, and it is doable with a search warrant. But I didn’t have a search warrant. It was...
Beau: No, and Nimbus was like, no way.
Heather: exactly.
Beau: ...doing it. Which, in their defense, is probably the right answer on some level, because no crime has occurred. But there is a crime that’s occurred: it’s harassment. Harassment is a crime. Where did you go from there? Did you have to just drop it?
Beau: I mean, you blocked the phone number. What happened?
Heather: I blocked the phone number and just hoped it wouldn’t occur again. And in Nimbus’s defense, like you just said, I feel like digital or electronic terrorism, or whatever you wanna call it, didn’t really exist then. It’s not like it is now, with online bullying of kids and all the sextortion and extortion things.
Heather: It was different then. I feel like it was new, and people weren’t aware of what to do, and they had no idea who I was. I could be pretending to be you and say, I’m being victimized, who is this person’s number?
Beau: but did that personal experience influence your work in any way?
Heather: It absolutely did, in me switching my messaging a little bit, from “this is just what I can recover when I do these cool jobs” to “this is how to protect yourself,” so people don’t have to do a forensic job on you. How to not fall victim to these threats that exist out there in the wild.
Beau: So it gave you, uh, the victim-side view of what it feels like to be targeted.
Heather: Absolutely.
Beau: Yeah. And I hate to use the word victim; target is almost better, because, um, as a victim it feels like you lack agency, and as a target you can move, bob and weave, and do things to evade the people who are targeting you.
Beau: Um, how’d you get started? What was your start in digital forensics? What initially got you interested? It’s a really technical field, right? I mean, when you got started, I’m sure it was, like, majority female. No?
Heather: Very funny. I was in the US Air Force, and I was on the back of a C-130 flying to my drill weekend. You know, when they promised one weekend a month, two weeks in the summer, that’s it. That was pre-9/11; that did not happen. Uh, some guy came up to me and said, are you the forensic girl? And I said, I am the forensic girl, whatever that meant.
Heather: I just needed a job. And he said, can you do computer forensics? I did not even know what a hard drive was on the first day of work, but I knew the Federal Rules of Evidence. I knew how to write SOPs. I knew I was smart, that I could learn technical things, and I truly thought this would be a stepping stone to get me to the Secret Service, to be a latent fingerprint examiner.
Heather: And that was when I was 22 years old, and here I am 23 years later still doing this.
Beau: So you thought of this as, like, a way into the Secret Service, and in a very specialized area of the Secret Service. Latent fingerprints, come on. Um, and now we’re talking about the stuff of movies, where they’re like, well, actually, there’s a fingerprint on the eyeball.
Heather: This is true and fingerprints are boring and I didn’t realize that. I think so many people make that mistake in college. You’re like, oh yeah, forensic science sounds awesome. And I was one of the first four people in the United States to graduate with an undergrad in true forensic science. But it was latent fingerprints and bloodstain pattern analysis.
Heather: I’ve never done that a day in my life other than college.
Beau: No, but I wonder: the underlying principles of studying a crime scene and finding evidence are probably the same in the digital world as in the world of splattered blood and latent fingerprints on things you wouldn’t think you’d be able to find a fingerprint on.
Heather: Absolutely,
Heather: And that’s how I got the job. I explained to them, I know the rules of evidence, and I tried to spin myself as: I’m young, you can train me to do anything, but I can train your department how to properly handle evidence at crime scenes or search warrants. And that was the initial thing they brought me in for.
Heather: Write our standard operating procedures, Heather. Show us what we should be doing.
C Break 1
Beau: So, people may not know that Heather Barnhart’s name should be a household name, because, uh, she’s worked on some really high-profile cases, including the analysis of Osama Bin Laden’s digital media and, uh, the criminal investigation into Bryan Kohberger, who was recently convicted for the murders of four University of Idaho students.
Beau: Um, and, you know, the backstory for the Kohberger investigation is really interesting to me. I wanna get into that in a second. But overall, including the Bin Laden work and Kohberger and in general, what led you to that work? How has the role of digital forensics fundamentally changed in law enforcement over, say, the last decade?
Beau: ’cause it has.
Heather: yes, it changes. Digital forensics changes every single day. We have to adapt and adjust and pivot, and that’s one of the reasons I love it. It is like doing an ultimate jigsaw puzzle every single day. You learn something new, you can’t find a missing piece. That’s just how my brain works.
Heather: Encryption always a hurdle. So obviously as people who want privacy, you want data to be encrypted, so getting access to that when something bad happens is a hurdle. Um, how data is stored. The amount of devices, like, look at me right here. This is my primary device. This is another device. We are so tied to these devices.
Heather: They are the digital witness to everything we do in our life. Every single thing you think about it, it’s the most personal piece of evidence that can help solve crimes. So we need to understand the underlying matters. We need to understand where tools can help you, but also where tools could paint you into a corner.
Heather: And that’s where it can be tricky. Some people are like, I press buttons, the tool said this. Guess what? The tool doesn’t know if it was syncing from somewhere else, if you created it, if it was randomly created by Apple or Google. A smart examiner who is trained has to decide that, and that’s where I think a lot of the Karen Read trial stuff got spun out of control with a timestamp, because people don’t understand the art of validation.
Beau: Tell us about it. I wanna understand. No, but I wanna hear, ’cause like, well this is a, this is fascinating stuff and I think, you know, for all of our aspiring criminals, they’re gonna want to understand how to not get caught. I’m kidding. And if you’re out there listening, because I know you’re out there listening so that you can be better at getting away with whatever it is you think you’re getting away with.
Beau: Just stop listening. This is not for you. Um, but, but that’s what I used to say when I was eating too much chocolate. That’s not for you. That’s someone else’s chocolate. So what,
Heather: You are 100% right, though. Like, how did Bryan Kohberger learn to delete all his stuff? Seriously, if you think about it, criminals are getting smarter at cleaning stuff up.
Beau: Well, let’s get into that, ’cause Bryan Kohberger wasn’t perfect, and that was a really interesting thing. Now, Bryan Kohberger, um, if you don’t know, was a graduate student, also an aspiring murderer who now is a real murderer. What he wanted to do, as I understand it, was commit the perfect crime.
Beau: He wanted to get away with this murder, the quadruple murder that he set out to do. And honestly, Heather, it looks to me like he just didn’t know what he didn’t know.
Heather: Exactly. And you know what? Alivea Goncalves, during the victim statement, said it perfectly when she looked him in the face and said, you are not perfect, you think you are. His imperfections... he actually left a huge gap in his data by trying to be perfect and clean it up. That gap didn’t match his normal pattern of life.
Beau: He turned his phone off, right? He turned
Heather: yeah, he turned his phone off, but he also, on his pc, he deleted data all the way back to October 12th, which makes me think that’s where he did his research on the victims.
Beau: Now, I regularly delete my data. I know that my ISP knows what I’m doing, but the regular person going through my devices doesn’t. And, um, as a forensics person, do you have access to information that has been deleted, say, on a local device that has, uh, downloaded information from nodes elsewhere?
Heather: Potentially. So it depends on time to evidence, and this is huge here. With Bryan Kohberger, I think it took 47 days to identify who he was and get his devices.
Beau: they identified him not by digital, but by DNA, correct?
Heather: DNA. And then the car, I believe would’ve been second.
Beau: Yeah.
Heather: His car would’ve been the second thing. The digital, I think is what put the nail in the coffin for him.
Beau: He left the sheath of his knife at the scene somewhere, which I don’t understand to this day.
Heather: It was under Maddie’s body. That may be a little graphic for here. I think Xana interrupted him, and he had to rush to go deal with that situation and left the sheath under Maddie’s body.
Beau: So that was a mistake, and that included some DNA evidence. Now, he was in the process of trying to possibly replace his car when he got caught, as I understand it. Now, the stain that he couldn’t get out actually was not from the crime scene; it came from his devices. Can you talk about that?
Heather: Absolutely. So your pattern of life: you delete things all the time. It would be more difficult for me to say, oh, this murder of four kids was committed and suddenly there’s deletion, or your phone was turned off. Now, if you turned your phone off and did that, that would be interesting. So when we do digital forensics, we obviously have periods of time that are of interest.
Heather: So right before the murder, right after, one of the first things we noticed was he turned his phone off about two hours before the murder, but he turned wifi off two days in advance. So he really wanted to remain undetected. Then turned the phone on right after. The only other time he turned his phone off would be the battery dying, and we only had a glimpse from June through the end of December when he was arrested.
Heather: But in that period of time, four times total was his phone turned off and I think at least two his battery died. When he turned his phone off the night of the murders, his phone was at 100% charge.
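For anyone curious what that kind of pattern-of-life check looks like in practice, here is a minimal sketch in Python. Everything in it is hypothetical: the event list, the dates, the battery threshold, and the field names are made up for illustration, not taken from the case data or any specific forensic tool. The idea is simply to flag power-off events that don’t match the usual “battery died” pattern Heather describes.

```python
from datetime import datetime

# Hypothetical power-state events recovered from a device extraction:
# (timestamp, event, battery percentage at the time). Illustrative data only.
events = [
    (datetime(2024, 3, 14, 23, 50), "power_off", 3),    # looks like a dead battery
    (datetime(2024, 5, 2, 1, 10), "power_off", 2),      # looks like a dead battery
    (datetime(2024, 8, 9, 2, 47), "power_off", 100),    # fully charged shutdown
    (datetime(2024, 9, 26, 0, 30), "power_off", 5),     # looks like a dead battery
]

LOW_BATTERY = 10  # below this, a shutdown is plausibly just the battery dying

for timestamp, event, battery in events:
    if event == "power_off" and battery > LOW_BATTERY:
        # A deliberate shutdown on a healthy battery breaks the normal pattern.
        print(f"ANOMALY: powered off at {timestamp} with {battery}% battery")
    else:
        print(f"expected: {event} at {timestamp} ({battery}%)")
```

Against a real extraction, a script like this would only be a starting point; an examiner would still validate each flagged event against other artifacts before drawing any conclusion.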
Beau: Well, a little sidebar here. Um, and maybe you don’t know anything about it, but tell me about the kinds of temporary malware and adware that can exist on a smartphone that are not able to survive a hard reboot, restarting the phone, but will actually sit on the phone as long as you don’t reboot it.
Heather: Ransomware is a huge one. So most things that are used to extort you are just sitting right there in memory. As soon as you reboot, it goes away, but then you have to be concerned about how it got there to begin with, so there’s some kind of vulnerability that allowed it in. Now, restarting your phone is also different than powering it off for a long period of time.
Heather: So if you’re, if you’re trying to, let’s say I get a weird popup. I’m like, oh, I’m uncomfortable. I reboot my phone, I immediately turn it right back on. So you would see and you could timeline and say, alright, this was running in the background. The user clearly didn’t like it. They rebooted their device, it powered back on.
Heather: Heather went on her way. But if I turn my phone off for four hours and four kids are killed during that time and there’s data that the FBI analyzed that puts you pinging cell towers and doing other things, it’s not looking so good.
Beau: No it’s not.
Heather: And then you have DNA, and then you have your vehicle caught on the camera too.
Heather: So there’s a lot of things that really pinpoint it all. I think the digital evidence proves the intent.
Beau: Yeah, so we had the phone blackout, and then you had the post-murder behavior, which involved him doing all kinds of searches that seemed to point to him, talking to his mom for a long time on the phone multiple times. But then you also were able to see that he was fascinated with serial killers who did something similar, right?
Heather: He talked to his mom for hours every single day. That is actually a normal behavior, which is weird. Yes, every morning, every night, hours and hours and hours,
Beau: And that struck you as strange. I can see how that would be the case. Um, but was that part of the post-murder behavior that you guys were looking at, or not?
Heather: Initially, it drew our attention that he reached out to, as he called them, his mom, “Mother,” and his father, “Father.” So he didn’t call them Mom and Dad. It was Mother and Father...
Beau: Always.
Heather: within hours of the murder. He talked to his mom for about an hour and a half, and we found out from other investigators that were working it, I believe the FBI, that he was back at the scene of the crime while on the phone with his mom.
Heather: Um, there was another thing that we saw where I initially thought he was deleting text messages, because it just looked like a gap in a conversation. Imagine you and I talking right now and I send you a text with a link. You would think that it’s all through text, but we realized very quickly that he was on the phone with his mom, and she sent him a text about Xana’s dad making a comment that she fought the murderer.
Heather: So there were conversations that he had with his mom about the murderer. I know on 20/20 and other interviews, they wanted to know if I thought that his mom knew. I have no idea. I can make every assumption on the planet, but there are so many people out there that already do that on social media that I don’t think they need my opinion, too.
Beau: So tell me about the serial killers, Heather.
Heather: After... actually, one time before the murders, he was researching, um, serial killers in general. The Gainesville coed killer was one that was of particular interest to him. Danny Rolling, I don’t know if it’s pronounced Rolling or Rowling, I guess it just depends on how you pronounce his last name. But that was one of particular interest, who also murdered with a Ka-Bar knife.
Heather: It was very similar, in Gainesville, Florida. Um, after the murders, he became particularly obsessed, per his online history. So, Christmas night: imagine it’s Christmas, you’re going to bed, relaxing after the holiday. He laid in bed and looked at serial killers, one after the other. So there is some website that you can go out to and learn where serial killers grew up, um, who their first victims were.
Heather: Did they ever harm their pets? Why did they do it? Were they abused as children? So it’s bio after bio after bio, and where he made a...
Beau: himself up or something.
Heather: I don’t know. It’s like bedtime reading. I don’t know if it relaxed him or how his brain worked, but he would click instead of just viewing it. Every time he viewed it on his Android, it downloaded.
Heather: And I think when he cleared all his history, he assumed it would go away, and it didn’t, because it was in a downloads directory hidden from him on his Android.
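As a rough illustration of why clearing browser history is not the whole story: pages a user explicitly downloads become ordinary files in a downloads folder, and wiping the browser’s history database does not touch them. The sketch below, in Python, just walks a hypothetical downloads directory copied out of an extraction and lists what is still sitting there with timestamps; the path is an assumption for the example, not the actual tooling or location used in the case.

```python
import os
from datetime import datetime

# Hypothetical path to a downloads folder copied out of a full file-system
# extraction. Adjust to wherever your working copy of the evidence lives.
DOWNLOADS_DIR = "evidence/extraction/Download"

if not os.path.isdir(DOWNLOADS_DIR):
    raise SystemExit(f"no such directory: {DOWNLOADS_DIR}")

for name in sorted(os.listdir(DOWNLOADS_DIR)):
    path = os.path.join(DOWNLOADS_DIR, name)
    if os.path.isfile(path):
        modified = datetime.fromtimestamp(os.path.getmtime(path))
        size_kb = os.path.getsize(path) / 1024
        # These files survive a "clear browsing history" because they are
        # regular files on disk, not rows in the browser's history database.
        print(f"{modified:%Y-%m-%d %H:%M}  {size_kb:8.1f} KB  {name}")
```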
Beau: Hmm. Does iOS work the same way?
Heather: it does.
Beau: Hmm. Good to know. We’re gonna get into how to make all that go away at the end of the show. No, we’re not, because if you’re listening to this for the wrong reasons, screw off.
C Break 2
Beau: All right. So, um, the, the, uh, other question I have in this world is like, what exactly did you do on the Osama Bin Laden digital media, and what did you have access to?
Beau: What were you looking at?
Heather: Hmm, that’s a good question. So that is my award. That’s my award right there; it says “the elimination of Osama Bin Laden,” and it makes me feel very important. Um, and I have a little coin that’s on my coin holder over there that has a little star in a circle of Pakistan, and that’s it. It’s just like the star where he was when SEAL Team Six went in and eliminated him.
Heather: Uh, I worked Bin Laden for five quiet years, so obviously I worked in a classified environment. You can’t talk about it. You just research everything you can about how they walk, where they are. Can you find locations? Can you provide that back to the military and, um, intelligence agencies on where we thought he was, who he was training, any potential threats that were coming to the United States or anywhere else in the world.
Heather: But it was a silent mission that I truly thought would go nowhere. I never thought they would find him, because it just seemed like it was dead ends all the time. But I will never forget: I had a really little house at the time in Tysons Corner, Virginia, and I was walking down my hallway and I heard on the news, Osama Bin Laden has been killed.
Heather: And I came running out. It felt like Risky Business, sliding down the hallway in my socks, and my phone rang, and I was like, no way, I’m gonna get to touch this media. I went into work and I worked all night long, and I was primarily responsible for his phones. He had multiple phones, and it was just going through all the decoding, making sense of it, and sending intel reports out.
Heather: But I think the reason this case became declassified was that as quickly as we were seeing it, CNN was reporting it, which blew my mind. I was like, how is this possible? We’re all locked in these classified caves doing this work, and immediately the media had all the information, so quickly. It was wild to me.
Heather: I have no idea. Clearly someone was saying something, is the way it seems.
Beau: You were presumably looking at this data in a SCIF or something.
Heather: Yes.
Beau: Now, uh, how many phones did he have?
Heather: I, so I don’t know how many he had in total. I saw five.
Beau: Geez. Uh, and all Android or everything.
Heather: You ready for this? Nokia
Beau: Of course, Scandinavia to the rescue. And, um, flip phones or smartphones?
Heather: Not smartphones, but not flip phones either. They were like the little bricks, the little block phones.
Beau: Got you. Final question on the Osama bin Laden phones. Were there any really average everyday messages like, don’t forget to pick up eggs.
Heather: Yes, but I couldn’t really read them, because I don’t know Farsi or Urdu, so we had translation. But normal behavior, medical stuff, he was sick, he had a lot of wives. So it wasn’t all just what you would hope, like, oh, it’s all the bad things he’s going to do. And that’s the hard part: sifting through noise and seeing what is normal, what is not, and what these things mean.
Heather: But normal behavior too.
Beau: But so, the noise, it’s the same thing with Bryan Kohberger. You’re looking for the parts that don’t fit a pattern, and that’s when something major happens.
Heather: And Bryan Kohberger is a strange bird in the fact that he almost had no noise. He had no friends, truly no friends. He talked to Mother and Father. He had a few, like, through Washington State University, like a group chat that he just seemed to lurk in. No friends, no text messages, no calls to other people. He was an isolated being.
Heather: It was creepy to watch his stuff, or, you know what, work his stuff. My son said to me... um, my husband and I, we have four kids, blended, from eight to 13. So, cybersecurity: their lives are wrecked because we’re always threatening them with the worst, because we’ve seen really bad things.
Heather: But my son said to me, he was like, mom, what if you testify in this trial and the jury doesn’t find him guilty, and he comes here and gets you? And I was like, I never, it never crossed my mind. It made me a little bit afraid, like, holy cow, should I even take cases like this? Do I wanna help? Do I wanna be involved?
Heather: Because what if that does happen?
Beau: Well, there are... you know where I work. So there are ways to make that a lot harder to have happen, and we can talk about that offline. But the answer is you can be unfindable, and you know that. Sometimes it’s harder, sometimes easier, but it’s doable. Um, you’re one of the leading experts in smartphone forensics. So what are the key challenges investigators face today when trying to acquire, decode, and interpret data from encrypted apps? You touched on it, but what’s the big problem set these days?
Heather: I think the biggest issue is not getting to the Android or iPhone soon enough, and the device is overwriting the data on behalf of the user. So the purging, that makes it really, really hard, and there’s nothing we can do about that unless Google and Apple stop it, which they’re not going to, because of user privacy. That is the hardest thing.
Heather: The tools are expensive. That’s another one that we hear all the time. But if you want full file system access, which means all the files off the device, you need that.
Beau: But so, you deal with a recency problem. The investigation needs to be happening in a timely way, within kind of eyeshot of an event.
Heather: It does, otherwise you’ll lose everything that’s deleted. A lot of location artifacts that can really help you. And if you’re working malware, there is a file, the privacy dashboard on Android, it lasts 24 hours only. Every 24 hours, it’s overwritten. But that thing would tell you everything you need to know about camera, microphone, and location.
Heather: Perfect for stalkerware.
Beau: Okay. Um, going back to your origin story with the person who targeted you using Nimbus, how is the application of AI making cybercrime even harder, um, you know, to sort through the noise?
Heather: AI makes... all right, here’s a good analogy that I heard. Someone said that AI throws hay on the haystack. We, as examiners, are always told you have to find the needle in the haystack. So when AI is throwing hay on it, it makes it really, really difficult. Um, where AI I think is scarier... because tools are getting better at saying, okay, is this AI or real?
Heather: The tools are helping us with that. The different generations are being targeted, so the victimology and the amount of victims, it’s through the roof and outta control, from teenagers with sextortion to our parents’ generations with AI deepfake scams stealing all their retirement, which is awful. That is running rampant.
Beau: And romance scams.
Heather: And the romance scams of like our typical generation,
Beau: You mentioned your children, and you mentioned sextortion, and children and sextortion sadly go hand in hand these days, um, because younger people do not know how to navigate that kind of situation.
Beau: They’re already addled by their hormones, and now somebody has figured out a way to weaponize that against them, for money. Um, what are the biggest cyber risks that the different generations face? We know what kids are looking at, so Gen Z, we have a sense of that. Millennials, a little different.
Beau: Gen X, we just don’t answer anything, so, like, you know, I think we’re safer. Uh, it might be a little lonely; the romance scams might start kicking in there, you know? So are there different approaches that each generation should take for Cybersecurity Awareness Month? Is it a generational thing, or are there some basics that are intergenerational?
Heather: A little bit of both, but it is generation-specific. So let’s start with Gen X. We are the first to have to deal with this. We’re the first with AI. We’re the first with kids having smartphones and devices growing up. My dad’s worst fear was, I’m gonna kick him off the phone when I’m trying to log in with AOL, and I’m gonna have a note with something inappropriate written on it in my backpack.
Heather: So we have to learn how to deal with it ourselves, and then also educate our children and educate our parents on what’s going on. At SANS, we have these tip sheets that were created to secure the generations, and you can go out to the website and grab them. It has every single generation and things that they should be aware of.
Heather: Also, kind of the tech knowledge that that generation should have, and it just reminds you of what they learned on and what they think is possible. But the greatest threat to the older generation, so if you think of parents, grandparents, is AI deepfakes targeted towards stealing retirement.
Heather: That is the biggest one. Um, they will be a familiar voice. The Microsoft one is so sad, like, Microsoft needs to remote into your system and help you with something. There are so many people that I know, even my age, that are falling for that Microsoft...
Beau: And they’re going and finding your financial apps, your financial credentials, and going in and emptying out your accounts.
Heather: Absolutely. A Gen X threat that is coming out a lot, other than that, is the sextortion email saying, hey, I have a remote access tool. Anytime you see “RAT” or “remote access tool installed on your computer,” it’s fake. Just delete the message and ignore them.
Beau: A hundred percent. And by the way, if anyone ever wants access to your computer, and it’s not me, ’cause I don’t care, honestly, and I’m not gonna help you either... no, but seriously, it’s just a no. I don’t even understand it. I think if a cybersecurity person were able to just pass one unilateral federal law, outlawing RATs might be the one that I’d pick.
Heather: Mm-hmm. Absolutely.
Beau: Go to a store if you really have that kind of problem, like go to a store.
Heather: Yep. Another one that’s happening with Generation X, and even a little bit with the millennials, is: your data has been breached, provide us with your financial information, your Social Security number, where you live, all this personal information, so we can protect you. And you’ve never been a part of a breach, but you just gave all your sensitive information to a stranger.
Heather: It’s terrible.
Beau: Well, you know, the funny thing is, among data brokers that sell, you know, the people-finder sites that we deal with, um, some companies use blind opt-outs. We don’t, but some companies do. And what that does is it sends out a blanket notice: this person doesn’t want to be in your database. But what that also does is it sends your information out to...
Heather: Yes.
Beau: ...a lot of people, right? And I think, like, there’s just gotta be some basic rules of the road, you know, which is: if you have a choice between providing your data and not providing your data... A classic example is at the doctor’s office, where you’re asked for your Social Security number.
Beau: The answer is no. They need it to bill you, they need it just in case your insurance company doesn’t pay; let them chase the money. The answer is no. They can’t legally make you give them the Social Security number. And no has to be... you know, one of the reasons why older adults are targeted successfully more than any other demographic is because they’re polite.
Heather: That is true,
Beau: They don’t wanna say no. They have manners. Has your career, um, and personal experience with harassment shaped the way you approach safety online for yourself and your family?
Heather: Absolutely. And I try, so I have social media buckets. I have people ask all the time, is it safe for me to post pictures of my kids? What if AI is used to make harmful pictures of them? And it could be, um, if I am on LinkedIn, I use that professionally. So I try to keep my kids and my personal life off LinkedIn as much as possible.
Heather: I sometimes like to humanize who I am and put things out there so people don’t think I’m just some nerd sitting behind a keyboard all the time. Like, I do have a life, and I am a mom, and I am a wife. But anything that’s Meta-related, I control who my friends are. I’ll do random checks, and it’s actually cleansing to your soul to sit there and spot-check and be like, who is this person?
Heather: If you’re not wishing them happy birthday, delete them. Seriously. If you see people and think, why am I still friends with that person? Delete them. Because what if they got hacked and they’re no longer that person? So if it’s not someone you’re constantly talking to, I recommend deleting them. But I try to control where I put any personal information out there.
Heather: I don’t let my kids share location at all, except with parents. That’s it. You cannot share location with friends. There’s no need. It’s usually used for bullying. I tell my son all the time, I’m like, you know how we knew where we were in the eighties? You would see a pile of bikes outside someone’s house, go ride your bike, go ride your bike.
Heather: No phones in the bedroom is another huge one. It’s the same rule. Like, think about it: as kids, you weren’t allowed to have boy or girl friends in your bedroom. You shouldn’t be able to FaceTime in your bedroom. Um, nothing good happens after dark on phones or devices for kids. Take them away. Keep them out of their bedrooms.
Heather: It’s okay for them to be mad at you. Your job is to protect them because the second we give a device, threats are entering.
Beau: Say it again. Nothing good ever happens after dark with a child and a device. Nothing.
Heather: It’s true. Even some adults, even our generation, you have a few drinks, you’re bored, you’re on your phone. That’s when the romance scams creep in. That’s where you get busy and you’re multitasking and you’re clicking stupid things.
Beau: Yeah. And if...
Heather: Nothing good happens after dark.
Beau: If that doesn’t resonate for you because you like having problems, um, but you like being healthy: blue light’s bad for you. It makes your sleep suck. So just get off your devices anyway and get a projector. Watch your TV on a screen. Um, so honestly, it’s totally true.
Beau: I have a projector. And you may have noticed, if you’re listening, that there were some bleeps in the beginning. Those bleeps were bleeping out Heather’s shoe size, my shoe size, Heather’s height, my height, and the reason we bleep those out is because I think a lot about social engineering.
Beau: It’s Cybersecurity Awareness Month. Social engineering is just being tricked online, and the more someone knows about you, the easier that is. And so if I say, hey Heather, I know you wear a size BEEP shoe, and I have these wonderful thick felt slippers that I just found online, I wanna send them to you, what’s your address? Now I know her address. Not cool. And Heather and I are now friends, but I bet you she doesn’t know what my birthday is, and I’m not gonna get anything through Facebook on my birthday, and that’s fine. Um, so what is a low-effort action people can take today to dramatically increase their digital security and resilience? Resilience is different from...
Heather: Can I give three things? Can I?
Beau: Yeah. Yeah.
Heather: Okay. Item one: log into your Google account and check how many devices have access to your Gmail and Google Cloud. It’s so easy: you log into google.com, click on your devices, and delete the ones that shouldn’t be there. Old phones.
Heather: Old child devices, get rid of ’em, delete them. It’s better to delete everything and re-add. Do that, I would say, twice a year. The next thing: if you use third-party apps for social media, which we all do, set multi-factor authentication. It’s one flip of a button, and it can save you from someone hacking you, very easy. And then the third one, I actually got this from a reporter when I was doing one of my interviews: have an AI deepfake password.
Heather: So if you get a call right now and I’m like, help me, someone has me kidnapped, and it sounds like my voice because AI can be trained on my voice, and you’re like, oh no... before I actually send $300 so you release my friend, what’s the passcode?
Beau: Phenomenal advice. Uh, in my family, we also just have a safe word in general, and that means if someone calls me up from my family and says, I have a problem, I say, what’s the safe word? And...
Heather: It’s like the eighties. I had a password if someone wanted to pick me up from dance class and drive me home. It’s just like that.
Beau: It’s the same old thing because it’s the same old story ’cause it’s the same old human race.
Heather: Yep. Maybe we should add a fourth. Don’t talk to strangers online. Everyone’s a stranger unless you can pick up the phone and meet them in person. Everyone is a stranger. Don’t
Beau: I’m gonna do one better. Imagine the scariest person, whatever scares you the most. Imagine that person, and now, every time you get a text from a number you don’t recognize, that’s who it is.
Heather: Hmm. For me it’s Michael Myers from Halloween.
Beau: Oh man. It is Cybersecurity Awareness Month; that does fall in October. All right, is there anything that I left out that we want to make sure folks understand as we trundle through Cybersecurity Awareness Month?
Heather: Hmm. I have the best job. I think I have the best job on the planet. Anyone that’s trying to pivot careers, or if you’re younger and you’re like, I wonder where my life should take me, do digital forensics. It is amazing. It’s fun. It keeps your brain sharp.
Beau: I love that. That’s the thing you left out. I love my job too.
Heather: I do. I love it.
Beau: Heather Barnhart, thank you so much for joining us. I really enjoyed talking to you.
Heather: Thank you so much for having me.
Tinfoil Swan
Okay, so now it’s time for our Tinfoil Swan, our paranoid takeaway to keep you safe on and offline. The Tinfoil Swan this week could not be simpler, and it’s not because they think you’re a criminal. Here it is: turn your phone off. Turn your phone off when you do things that you do every day, and leave it off for five minutes on either end, or 10, or 15, or whatever. You don’t need your phone all the time. Here’s the deal. Have you ever turned on your phone and it says, ready to go to Pilates or yoga? Or, sorry, I’m giving you a bead on me.
Uh, I don’t do either of those things, actually, but, um, “It’s time to go fishing, Beau,” whatever it is. The reason it’s doing that is because your phone knows where you’re going, because your internet service provider knows where you’re going. That’s not cool, and there’s a way around it. If you’re not a criminal... if you’re a criminal, leave your phone on.
’Cause I want, we all want, to know where you’re going. But if you’re just a normal person who doesn’t wanna share information with ISPs, and through them advertisers, about what you do and when you do it, the easiest way to do that is to change up your game. Turn your phone off every once in a while.
Remember that everything you do is tracked. That device that we all carry in our pockets, our phones: it’s haunted. It’s super haunted, and it’s not, you know, the fun ghost from Halloween.
It’s the weird ones that retarget us on Black Friday and Cyber Monday and haunt Christmas all the way through. They’re the ghosts of Christmas: buy me, buy this, buy this, buy this, now. So turn off your phone from time to time. Do it as part of your cyber hygiene, because the less people and companies know about you, in theory, the better.
Thanks for listening, and we’ll see you next week.
Our privacy advisors:
- Continuously find and remove your sensitive data online
- Stop companies from selling your data – all year long
- Have removed 35M+ records of personal data from the web
Exclusive Listener Offer
What The Hack brings you the stories and insights about digital privacy. DeleteMe is our premium privacy service that removes you from more than 750 data brokers like Whitepages, Spokeo, BeenVerified, plus many more.
As a WTH listener, get an exclusive 20% off any plan with code: WTH.

