Episode 21

How Cyber Criminals Exploit Human Behavior

EPISODE SUMMARY

Today the team dives into the human side of cyber security with Jessica Barker, Co-CEO of Cygenta and author of two books: Confident Cyber Security and the recently released Cybersecurity ABCs.

Jessica goes into the awareness, behavioral, and cultural factors behind malicious actors and human failure. Also, learn why we need to change the way we treat victims and how companies can reinforce positive security behaviors in their employees.


Mike Gruen

Mike is the Cybrary VP of Engineering / CISO. He manages Cybrary’s engineering and data science teams, information technology infrastructure, and overall security posture.


Joseph Carson:
Hi, everyone. Welcome back to another episode of 401 Access Denied. My name's Joe Carson, one of your co-hosts today, chief security scientist at Thycotic, and I'm really excited about today's discussion. This is a follow-up to our previous guest, and we have another guest, Jessica, today, who's going to take us through the human behavior side of cyber attacks, and really the true impact, the true cost of cyber crime. So I'm really excited about today's episode. Again, I'm joined here with Mike, my awesome co-host. Mike, do you want to give us a brief introduction?

Mike Gruen:
Yeah. Mike Gruen, CISO and VP of engineering here at Cybrary, located in DC. Jessica, thank you for joining us. Very much looking forward to this. To start off, why don't you give a little bit of background on yourself? Maybe mention the fact that you have a book coming out and a little bit about that as well.

Jessica Barker:
Sure. Thanks so much. It's a pleasure to be here, so thanks for the warm welcome. I'm Dr. Jessica Barker. I'm co-CEO of a cyber security company called Cygenta, based in the UK, though we work globally. With my husband, FC, I run a small and awesome team working on the human, technical, and physical sides of cyber security.

Jessica Barker:
I've always worked on the human side, my husband on the technical side and the physical side. So a few years ago, we thought, "You know what, let's bring it all together and see what we can do." So yeah, I work on awareness, behavior, and culture, and I absolutely love what I get to do with all sorts of different clients on that side of things. I also published a book in September this year with Kogan Page, Confident Cyber Security. It became an Amazon number one bestseller on the day of release, just saying.

Joseph Carson:
Very good.

Jessica Barker:
So people want to learn more about cyber security, which I think is great news. I have another book coming out in January, which I co-authored with Ciarán Mc Mahon, Bruce Hallas, and Adrian Davis, and that is Cybersecurity ABCs, all about awareness, behavior, and culture.

Joseph Carson:
That's fantastic. I think this is sometimes the forgotten part of cyber security. One thing that I usually get upset about is that, over the years, we always blame the humans. We always blamed them. They were the fault. They were the error. They were the ones that clicked on the link. I always get upset at that, because I see that they're the victims as well. In most cases, even when we talk about insiders, they are secondary victims. They're not acting intentionally.

Joseph Carson:
If they drop something, leave something in an airport, or click on something, in many cases, that's their job, and we victimize them too much. That always got me upset, and I think what you're doing in the industry is really important: raising the awareness that these are victims, these are the victims of crimes, and we should empower them. We should look at how we can use technology to help them, not place the blame on them. So is that something you're seeing as an upcoming awareness factor?

Jessica Barker:
Yeah, I think it's actually changing a lot, which is brilliant. When I first entered the industry, basically 10 years ago, I came in with a background in sociology, politics, and civic design, and I found it really weird the way that everyone blamed the user as well. We didn't even call them people. We blamed "the user," as if that is a different beast, different from us, as if we aren't users as well. Yeah, a lot of victim blaming.

Jessica Barker:
So I was really interested to draw parallels, for example, between some of the sociology research around victim blaming, some of the criminology work, and what was happening in cyber security. You do still see it. Of course, you unfortunately do still see a lot of victim blaming, both at the people level and at the company level.

Jessica Barker:
But I think we're moving away from it. I think in the last couple of years, we have started to see that shift, where many people are calling out this victim blaming. For example, one thing I do is chair ClubCISO, which is a group of over 500 information security leaders. In our research this year, we asked about a hundred of our membership what they were most interested in for the year ahead, and they said security culture, and we asked what they would be doing in terms of advancing security culture.

Jessica Barker:
One of the number one things they said was promoting a proactive, no-blame reporting culture. So really trying to move to a just culture where we don't blame people for incidents. So there's a definite shift happening, and I think that's really positive.

Mike Gruen:
Yep. Yeah. I think, in a lot of cases, it's people just trying to do their job the best way they can and making an honest mistake, or, as Joe pointed out, the sort of clicking on links. I'm curious what you've found: the number of actual malicious insiders is probably very, very small.

Jessica Barker:
Yes, it is. A malicious insider is really damaging when they strike, and the psychology of a malicious insider is fascinating. We can look at models around fraud, like Crowe's fraud pentagon, which looks at the five different factors that are usually in place when somebody carries out internal fraud or when you have a malicious insider. Usually, they are disgruntled. We'll have all seen these cases. Often it's somebody who has been in an organization for years or decades, and they just feel overlooked; they haven't been promoted; they feel they're not getting rewarded or recognized. So they're disgruntled. They usually have a pressure, maybe debt, maybe problems at home, whatever it is, something driving them there, and then a few other factors, like overconfidence, thinking they can get away with it.

Jessica Barker:
They usually are pretty smart, and all of these things come together, and they actually convince themselves that they are not a criminal, that they're not doing anything wrong. They convince themselves the organization is wrong and that they are getting what they deserve by doing whatever they're doing. But it is rare to see that. I think the figures vary according to the research, somewhere between 3% and 10% of insider activity, whereas the vast majority is non-malicious insiders, exactly as you've both described.

Mike Gruen:
Right. But I think the malicious one, it's just a much better story. So it tends to get more... Right? That's where typically-

Joseph Carson:
More immediate attraction.

Mike Gruen:
Exactly.

Joseph Carson:
An employee accidentally clicking on a link doesn't really make the headline news. But a malicious insider who purposely came back with their credentials and deleted the entire service, that's what makes news. So unfortunately, I always get upset as well. We tend to go after those big stories, and we don't get into the reality of the human side of this, that ultimately the internet and the browser were created to click on things.

Joseph Carson:
That's ultimately what many people's jobs are: to open attachments, to look at links, and to fill in forms. In many cases, what we do today is mostly in a browser, and we victimize people for not being able to detect threats. They're relying on technology in the background to do the job, to do what it's intended to do.

Joseph Carson:
Now, if you clicked a hundred links or a thousand links and the thousand and first is malicious, I don't think we should be blaming the humans. One of the things I always believe is that we need to empower people so they're not afraid. I remember a few years ago, I did a major workshop and educational session with teachers and the police force. They asked, "If there's one thing you could tell organizations that would make the difference, what would it be?"

Joseph Carson:
The one thing that came to mind is: never be afraid to ask for advice. When you accidentally click on something, don't be afraid to go and say, "I did something, and I'm not sure whether it's malicious or not." The organizations that have put that into practice are able to stop attacks much earlier, because the employees are more willing to actually speak up to IT. Jessica, do you feel that we're getting there, accelerating that type of mentality and culture shift?

Jessica Barker:
I think we are. It's slow to change. But just yesterday, I was speaking to a client in financial services, and they were talking about their phishing simulations and the fact that they're moving away from looking at click rate to looking at report rate. I have had that conversation with so many CISOs in the last year, compared to a few years ago when it was very rare that somebody was looking at that. So I think we are moving towards it. Of course, it takes time.

Jessica Barker:
But that positive reinforcement of the behaviors you want is so much more impactful than just blaming people for the behaviors that you don't want, behaviors that, like you say, Joe, are impossible to avoid anyway. Don't click on links? Well, I have to click on links. So be wary-

Mike Gruen:
That's right.

Jessica Barker:
Be wary of suspicious emails? Well, some of them are obvious, and some of them are much more sophisticated. So what are we even telling people when we say be wary of suspicious emails? That's incredibly vague and unhelpful.

Mike Gruen:
Yeah. That actually reminds me of... I saw a security researcher. I think I've told this story before. I want to say it was at Black Hat.

Mike Gruen:
Yeah, exactly. I only have so many stories. But she was talking about how... So she's a security researcher. She was like, "We tell people they have to be vigilant, 007-vigilant, all the time." She's like, "How hard is that?" So she decided she was going to hire a company to spear-phish her. She knew it was coming. She knew how many times they were going to try to attack her, and she knew over what time period they were going to do it. I think she said she fell for three out of four, or four out of five, or whatever it was, and it's impossible.

Mike Gruen:
Even as she clicked the link, as soon as she clicked the link, she was like, "Oh, nuts. That was definitely a phishing attack. That was them, and I screwed up." I mean, depending on what your job function is... I have an externally facing email. There are people who have to contact me who I've never been in touch with before. Right? Don't open emails? Don't click links? So I don't think awareness and being vigilant is a reasonable ask, and I'm curious what your thoughts are, Jessica?

Jessica Barker:
Yeah. I mean, I think awareness is really important. People being aware that this stuff happens, understanding some of the telltale signs, that's all good. But we can't rely on that as our only defense, and we also have to make sure that our messaging is precise and consistent, and that we recommend things people can actually action. Otherwise, we just reduce their willingness to engage with us. We reduce the extent to which they want to. We increase fatigue. We're just creating noise and making it much harder to get through with the messaging that actually matters.

Jessica Barker:
So I did a keynote for a security awareness summit a couple of weeks ago, and I paralleled some of the communications we've seen around COVID-19 with cyber security communications, and some of the criticism of the COVID-19 communications, where they have been imprecise, where they've been vague, where they're actually recommending things that people can't practically do, and where they say things like, for example, in the UK, "be alert."

Jessica Barker:
Well, what does that mean? Be alert all the time? That's exhausting beyond anything else. Focus groups around that messaging found that people's understanding of it was low, and actually in some age groups their flouting of the rules doubled. It's the same, I think, with cyber security messaging. So my husband FC and I did this parody video of some of the cyber security messaging, where it's kind of like, "Do this, but don't do this, and what about that, and what about this?" We end up being so contradictory that it just ends up impossible even for us to action, let alone people who aren't security professionals.

Mike Gruen:
Oh, yeah. No.

Joseph Carson:
That's right.

Mike Gruen:
I mean, the fact is I break the rules all the time. I have to. In order to do what I need to do, I sometimes have to click on links when I don't know who the person is, but I'm reasonably confident, and I have to use my brain to figure out what the risk is. I think you bring up a great point about awareness, and I just want to make sure we stress it: awareness of what to do is way more important than awareness in the sense of vigilance. The idea of vigilance is tough.

Mike Gruen:
The idea of that oh-shit moment, "I clicked the link, now what do I do?", is way more important. To Joe's point, we need to make sure that CISOs and people like me aren't like, "Oh, what did you do?" It should be, "Oh, okay. Let me help you. Let's try and figure this out. Let's get ahead of it. Thank you so much for letting me know early," rather than sweeping it under the rug and me having to find out about it later. It's sort of like my kids. They're not going to get into trouble if they come to me. But if I find out about it after the fact, that's where there's a problem.

Joseph Carson:
Yeah. Even that cartoon that's been shared many times: everyone has a Dave in the organization. We've got IT security on one side. You've got Dave on the other side. But to be honest, the reality is we are all Dave. All of us are. It just comes down to how criminals abuse our trust and at what point we can be abused. Even having to look out and be vigilant, to your point, Jessica, being alert all the time, that would just be so much pressure and fatigue, exhausting to do all the time.

Joseph Carson:
Interesting. I remember, years ago, implementing a security communication to employees. It was a large organization, over a hundred thousand employees, and we were doing this communication. We kept failing all the time, because putting it into very legal, very official policy-speak does not relate and does not communicate effectively to employees. So we changed. Eventually, we actually brought kids in to tell us how to communicate to adults.

Jessica Barker:
Nice.

Joseph Carson:
They told us that the best thing is comic books. Comic books tell a story, with imagery and graphics. We took specific use cases, and we actually put them into storyboards. That was the most crucial thing we did. One example was a USB drive: plugging it in, what impact can that have? It also meant we didn't have to translate it, so we avoided the language barriers and it worked across them. So what's your suggestion on some of the best ways we can communicate with employees effectively?

Jessica Barker:
Yeah. I think that's a great example, and we've used some comic book stuff as well with clients, and it works really nicely. People engage with it. You can have some fun with the graphics. I'm a big believer in telling a story. Stories are really powerful, and people engage with them way more than they do with, say, statistics or facts and stuff like that.

Jessica Barker:
So using emotion in the right way, empowering people; you definitely used that word earlier, Joe, and I totally agree. So moving towards empowering, rather than focusing on FUD: fear, uncertainty, and doubt. One thing I always talk to clients about is the difficulty of any awareness raising if you don't have an action attached to it, or if people can't do it. So for example, if a client wants to do a campaign around passwords, but they don't have a password manager, or they don't have single sign-on, that's really challenging, because then what are you asking people to do?

Jessica Barker:
You're asking people to remember complicated, unique passwords. That's just physically impossible. That's not happening. So I think making sure you have an associated behavior, and that that behavior can be supported, is really important for awareness raising, and also bringing this stuff to life. I find demonstrations of attacks are really powerful: getting anything interactive and hands-on, whether that is lockpicking or stuff like that, getting people engaged, incident response scenarios, tabletops, and things like that are all really effective. Anything that brings this stuff to life and is engaging.

Joseph Carson:
Yeah. I love the lockpicking because it shows it in a physical way. I always like to try and compare things in the physical world to the digital world, because sometimes in digital, it's hard to show the impact. With lockpicking, it really helps you show that visualization. It helps you bring it to life. One thing I've really loved, I don't know, Mike, if you've seen it, but a good friend, Ian Murphy, has put out this series of CyberOff videos over the last six months or so, putting it into songs and graphics and images.

Joseph Carson:
While sometimes maybe a little bit over the top, I get it. I love it. I think it's a really great way of showing the reality. So Jessica, have you seen Ian's series of CyberOff videos, and what are your thoughts on those approaches?

Jessica Barker:
Yeah, I have. I think humor is something that's often overlooked as a tool in this industry, and I think it can be really powerful. Obviously, it depends on the context and the client culture. Any awareness raising has to go with the culture of an organization. I sometimes see, and this is not connected to that, but moving on, just in general, I sometimes see infosec teams wanting to change the culture of an organization or put in materials that are at odds with the culture. You really have to go with your culture, even if, of course, you want to shift it. So when you get the right tone for your awareness raising in the right context, I think then you're looking at something kind of magic.

Joseph Carson:
Yeah. We can't change people. We have to take security and put it into what already exists. There's a metaphor that I've used to try to compare this: imagine I turn around and tell all my employees, "I know how you travel to work. I know you drive from A to B, and that is most convenient. But tomorrow I want you to take a train, a bus, walk, or cycle, because it's safer, it's more secure."

Joseph Carson:
But changing that habit is difficult to do. What your real intention was, maybe, is for them to go slow or be careful driving on the road. So what you're trying to do is not change people's habits; you're trying to embed security into their existing ways of doing things. It should also be somewhat in the background. It should not be something that forces them to completely change. It should just be embedding security controls into what they already do. Mike, I have a question for you.

Mike Gruen:
Yeah. My feeling is it's really about changing people's thought process, how they approach what they're going to do. When I got to Cybrary, for example in 2017, the company had been around since 2015, and so I came in, and I was like, "Okay. I have to change how we're doing things." But I didn't want to just impose changes. What I wanted to do was change people's thought process about why they're doing what they're doing, and then make it more obvious and more clear: "Oh, I should do it this way because X, Y, and Z." Not everybody needs admin access to this thing, because these are the bad things that can happen if everybody has access to the PayPal account, for example. So...

Jessica Barker:
Yeah. It's the hearts and minds, isn't it?

Joseph Carson:
Yeah, winning hearts and minds.

Jessica Barker:
Yeah, and helping people understand the why, which I think you've both touched on, is really important, because if we just tell people, "Don't do this, don't do this, don't do this," and give no idea as to why, then it's like, "Well, why should I? That's just making my life more difficult, and it's pointless." It's just imposed; they're coming up with some rules.

Mike Gruen:
My feeling is it's also about making it, not just the why, but making it easier for them to do it the right way than the wrong way, and I've learned that lesson over and over again across whatever it is in technology. It's a stream, right? The water will flow down the path of least resistance. So make that path as easy and as secure as possible. We're talking-

Joseph Carson:
Sure, make it easier.

Mike Gruen:
Right. We were talking about humor earlier. So for our security awareness training, that was one of the things I looked for, and I found a company that does it as a monthly module. They use a lot of humor. It's a little hit or miss sometimes. But overall, the feedback I've gotten from everyone was like, "As bad as security awareness training is, this is the best security awareness training I've taken," because it has that element of humor. And I thought it was pretty cool when our HR people came and said, "We want to do anti-harassment training, and we want it to be somewhat like how we're doing the security awareness training, with that same level of humor, but not cringey," which is very much more difficult to do with anti-harassment training.

Jessica Barker:
Yeah. That's a challenge.

Mike Gruen:
Exactly.

Jessica Barker:
You want to get that right.

Mike Gruen:
Absolutely.

Joseph Carson:
We want the Mr. Bean version of cyber security... but I have a question. So Mike and Jessica, one of the things I've noticed, with you based in the UK, Mike in the US, and me based in Estonia, is: are we seeing differences in cultures as well? Because I have seen in North America, in some cases in the past year or two, a much more aggressive approach to employees, with disciplinary actions. Is that something we're seeing continue, employees clicking a phishing email and actually losing their job as a result? Is it a culture difference? Are we just seeing an unusual spike in those events, or are we changing to seeing them as victims?

Jessica Barker:
So I'm really interested to hear Mike's perspective on this. I can certainly talk about one case from the UK which hit the headlines. I think I know the final outcome of the case. Essentially, a worker for a pretty small company up in Scotland received a phishing email. I think it was invoice fraud. She transferred the money, and then they realized it was phishing. Too late. The money's gone.

Jessica Barker:
We've heard that story a million times, unfortunately. But the company fired her, so she lost her job. Then they took her to court to try to sue her for the money that was lost, which I think was a couple of hundred thousand. In the end, and I think I've seen the final bit of the court case, because sometimes these things come back around again, the last I saw, the previous employer had lost the case because it was deemed that the employee hadn't had sufficient training. So that sets a precedent that organizations need to make sure they support people, that they train people, that they ensure there is a way to report these phishing emails, and that everyone understands it.

Jessica Barker:
That was the tense moment, I think: which way is it going to go? Because if the courts had sided with the previous employer, then that sets a whole different precedent, where suddenly we are blaming people on a whole other level.

Mike Gruen:
Yeah. And my-

Joseph Carson:
Yeah, terrible.

Mike Gruen:
Yeah, that seems terrible. I mean, my experience comes from working with different organizations. Prior to being at Cybrary, I was at a company called RedOwl, and we sold mostly to financial services. It was a user behavioral analytics platform, right? So trying to identify potential insider risk and insider threat, and potentially malicious insiders, which I found funny given how small a percentage they are. In any event...

Mike Gruen:
What I found was... I don't know if it's a US culture thing or an industry thing, but there were definitely big differences in how different organizations approached it. There were some that wanted to rule draconian, dictator-style, with an iron fist: anybody who screws up is out of here. Then there were others who were what I would classify as way more progressive: "No, no, it's on us. If a user does a thing and it results in bad stuff, that was us failing them, not them failing the company."

Mike Gruen:
I don't know if it's a culture thing by country or by industry, or if the stakes are higher at large financial institutions and therefore they feel they have to be more draconian. I don't know where the boundaries are. I do know that I worked very briefly at one, and within the first couple of weeks, there was a phishing awareness exercise. They sent me a spear phishing email, and I didn't handle it properly.

Mike Gruen:
But what's funny is I couldn't handle it properly. I was supposed to report it, but I was on a Mac, and they didn't support Macs. So there was no way for me to actually report it, and I got reprimanded for not reporting the email because I wasn't using Outlook on Windows. I was like, "I don't know what I'm supposed to do."

Jessica Barker:
That's a perfect example. Exactly. Perfect example. You have to make this stuff easier. But yeah, I see the same. I see a big split between organizations that are much more progressive, as you've described, and organizations that are still in what I see as a kind of older mindset that we're leaving behind, of blaming people. I don't really see it being split by sector, because I know some of the big banks are much more progressive, certainly from a UK perspective. So I haven't yet worked out what the pattern is, other than a sort of maturity level in terms of their security culture.

Mike Gruen:
It might just be who's in charge, the CISO, right? I mean, it just flows from them. I'm sorry, Joe. You were going to say?

Joseph Carson:
Yeah. So I've had experience with this in the past. I did a lot of work, probably a good 10 years ago, in the maritime industry. What I saw in that culture was that when you've got an incident, and I saw it mostly in shipping accidents, if you find a policy failure, insurance doesn't pay, because the company was in the wrong. If you find a human failure, insurance pays. So sometimes, as cyber attacks became a much more frequent occurrence, I found that the company wanted to find a human at fault, because then they were able to get the insurance money to pay for the accident, versus a policy failure, let's say the employee didn't follow the policy, or the policy was wrong. Then you get into things like PCI, and ISO, and these frameworks.

Joseph Carson:
Now I'm at risk of financial exposure from a regulatory compliance failure. So in many cases, the way the industry is structured, especially around compliance and regulation, companies want to find human failure, because otherwise they might be exposed financially from a regulatory failure. I get worried that this will continue, especially as cyber insurance becomes more popular and companies start looking at it. I'm hoping cyber insurance policies don't force us down the path I saw in the maritime industry. So that's a fear that I have, that we start looking for human failure because of this.

Jessica Barker:
Yeah.

Mike Gruen:
Yeah. The unintended consequences. It's always best intentions at the root of how things end up, and I'm sure the insurers had all the best intentions, but then you end up with the consequences: now we just blame people rather than actually trying to solve the problem.

Jessica Barker:
Yeah.

Joseph Carson:
So Jessica, one thing as well that I've seen a lot in the industry, and I'd like to get your perspective on it, is that my view is we are all here to defend, and my goal is to use my expertise for that. I realized a few years ago, when I did a pen test, that my job was not security. I quickly realized that my job is actually to identify business risk, whether on the human side, the technology side, or the process side, and use my knowledge to help reduce that risk. My experience in reducing that risk happens to be on the security side of things.

Joseph Carson:
Now, with that, all companies will have incidents. There's no hundred percent security. The incidents could be big or small. But what I hate is seeing this aggressive finger-pointing at organizations when they do become victims. Ultimately, organizations are also the victim here. They're the victim. Their employees are victims. Their customers are victims.

Joseph Carson:
So from your perspective, how do we deal with that in the industry? How do we improve, to make sure that vendors aren't just using these incidents as an opportunity to sell products, and that we realize we're a collective community, all with the same goal in mind? How do we overcome that?

Jessica Barker:
Yeah. It's really challenging. We see it, and you see it on social media, don't you, whenever there's an incident that becomes public knowledge. It's really unhelpful because, of course, it's just piling on. It's stressful for the organization that's the victim. There are peers working in those organizations, and they're being attacked by their peers in the community. And again, at a higher level, it just drives people to try and hide incidents, because reputational damage can be one of the most damaging things, and people don't want to talk about it, because they don't want the finger to be pointed at them.

Jessica Barker:
I do think the tide is sort of turning, though, and it's becoming less acceptable in the community to take that kind of approach. I think some people still struggle with it. I remember giving a presentation at a conference, maybe six or so years ago now, and I was talking about this kind of victim blaming of individuals and organizations, and it's the one time I've had someone literally heckle me in the middle of my talk and derail it. It was my fault; I was a pretty new speaker, and I allowed it to happen, and we got into this debate of, "To what extent do you blame an organizational victim of an incident?"

Jessica Barker:
The person's perspective was, they haven't done what they should to protect data. That might be my data. It might be personal data of people I care about, financial data. They should be doing more, and if they get attacked, then it's their fault. I was saying, "Well, that might be the case in some instances." There are often things organizations can do more of. But it's difficult. It's challenging. There's no such thing as a hundred percent security. People are out there working hard at it. If we're looking at a hierarchy of blame, the criminals have got to be at the top of it, right? Yet we often don't talk about them because they're often unseen, and as humans, we like to attribute blame.

Jessica Barker:
That sells from a news perspective, when you can have a bad guy, a person or an organization, when you can pitch good against bad. If you can't identify the attackers, then sometimes the victim gets pushed into that role of being blamed.

Joseph Carson:
It's part of the cause here that we can't point the blame at any one company, and we've also got to the point where we're blaming countries without having really good attribution. That doesn't provide any value either, other than actually creating more political strain.

Jessica Barker:
Yeah.

Joseph Carson:
Right.

Mike Gruen:
Yeah, no. I was going to say, it sort of brings it into the physical. A number of years ago, a friend of mine was walking on a street in DC and got stabbed and robbed. What could he have done differently? Yeah, he could have not been there. But he needed to be there. He was going to his car from... So I think there's that same sort of thing, where you have to bring it back to that person who says, "Well, obviously the company is not doing everything they could." Yeah, I guess my friend could have been wearing a Kevlar vest. There's any number of things. But there's this risk.

Joseph Carson:
Armored car.

Mike Gruen:
Right, exactly. There's only so much that you can really do to defend yourself against certain things, and sometimes it's just bad luck. Sometimes it's wrong place, wrong time. Sometimes you don't realize that you're a target for whatever reason and-

Joseph Carson:
Yeah. We have to call them what they are. One thing is that even... Jessica, I don't know if you have recommendations on the PR and communication side, because that also is a problem in our industry: people coming out and making cyber attackers out to be sophisticated hackers. In my mind, they're digital thieves. By calling them sophisticated and calling them hackers, we're putting them on a pedestal, and we should not be doing that. We're putting them up there.

Joseph Carson:
They actually enjoy that. We have to get down to really calling it what it is. It's crime. They're digital thieves. Those companies and employees are victims, and we have to make sure this is treated no different from going in and robbing a bank, or, to your point, Mike, stabbing someone in the street. There is a human impact here. There is a financial impact, and we have to make sure we call it what it is.

Joseph Carson:
They're not leaking data. They're stealing data. Our terminology is horrible. We like acronyms. They create new marketing ideas. But I think we really have to get back to the basics and start consistently calling these crimes what they are and really highlighting the impact.

Jessica Barker:
I completely agree. I think they're criminals. That's what they are, and that's what we should be calling them, and that brings a clarity as well. On the victim-blaming thing, just to go back to that for a second, part of it comes down to psychology, and it's the same with physical attacks when we see victim-blaming there. As with cyber attacks, if I can blame a victim, it makes me, even subconsciously, feel more comfortable that it won't happen to me. That's often what it comes down to when we see victim-blaming: people say, "Oh well, they did that wrong. I would never do that. So I'm fine. I don't need to worry about this."

Mike Gruen:
Is it that they feel more comfortable, or they just feel superior?

Jessica Barker:
Yeah. Maybe a bit of both, isn't it? It's pointing the finger and just making us all feel better.

Mike Gruen:
Right. When I walk around, my eyes are always up. I look from side to side. That would never happen to me.

Joseph Carson:
You've got your masculine walk.

Mike Gruen:
Yeah, absolutely. Right, right.

Jessica Barker:
Yeah. Tom Cruise in disguise. I'm fine.

Joseph Carson:
Exactly. So one thing I find is that organizations should help the victim as well. I find that the best people to speak out are those who have been victims, because they really know the true impact. I do find that even turning them into spokespeople, or sharing their story with the organization, helps. It reminds me, going back, of one thing that I think changed the whole perception of the industry: when ... were the victim of an attack, they created videos. They showed the real people. They showed people working around the clock, spending time away from their families, ordering pizza and trying to recover the systems.

Joseph Carson:
I think that was one that all companies should probably take a good lesson from. When we do become victims as organizations, we have to really start highlighting the true impact, what it means for people. Ultimately, criminals are criminals. They're not going to have any kind of ethical standards.

Joseph Carson:
I think we have to get to the point where countries cannot provide safe havens for them. We have countries that are providing safe havens for those criminals to operate. We have to work together as a collaborative community.

Mike Gruen:
I think though-

Jessica Barker:
Yeah. I completely agree.

Mike Gruen:
Yeah. I'm curious what you think, Jessica, but I would say that doing that would also help. Right now, all of the publicity is around the hackers. You look at TV shows, whether it's Mr. Robot or whatever, that make them seem like this Robin Hood type of person. If we were to do more to show the response, what we should be doing is highlighting those people. They're the heroes in the story. They're the ones we want people to emulate. I wonder if it would also help tip the scales, not just by showing the real cost of this, but also toward, isn't this the type of person you want to be, the person who gets that call at 3:00 AM and saves the day, as opposed to the person hacking into some system, stealing some money, and getting away with it? So it's probably a double win.

Jessica Barker:
Yeah. I totally agree, and I've seen that approach work internally in organizations as well, from an awareness-raising point of view, from a cultural point of view, of humanizing the infosec team. People like to have heroes. When you showcase the actual work of the infosec team, what they do in terms of incident response, and show that they are people, that can really help if you've got an organization where the rest of the business feels a bit distant from infosec and isn't really engaging.

Jessica Barker:
Actually, speaking to a client the other day, I asked them, "What's the most successful thing you feel your team has done over the last year or so for awareness raising?" They actually spoke about an incident. It was someone who had clicked on a link in a phishing email, and that person agreed to be a case study in a really positive way. Exactly what you're talking about was showcased in a newsletter, and because they'd reported it, they used that angle and talked about how this person did exactly the right thing in reporting it. They talked about the follow-up, what it meant for the infosec team, and it had a massive impact. People are still talking about it now. So we can use those stories internally as well as externally.

Mike Gruen:
Yeah, I agree. I mean, I was at a company a number of years ago where they hired somebody to run various attacks, whether it was USB drop attacks or all these different things. I still remember the story: there was one woman, and I guess the company had set up a fake website that said, "Oh yeah, we're beta testing our new HR system. Please go ahead and log in with your Windows credentials." She was sort of half paying attention, started logging in, hit the login button, and realized right away, "Oh crap, what did I do?" She called security, and they were like, "Hey, not a big deal. This happened to be a simulated attack."

Mike Gruen:
She and the security team made a big deal about how she responded and how well it went, and I still remember that story now, coming up on probably 10 years later. I remember it being such a positive experience that I've done similar things at Cybrary. For example, I have security footage of somebody tailgating into our office, and one of the employees noticing, stepping up, and asking, "Who are you here to visit?"

Mike Gruen:
He just happened to be going from one part of the office to another, saw this person come in, stepped up, and did the right thing, and I was like, "That's what we need to make sure we're highlighting, that people do that." I think those messages stick with people way more than "don't do this, don't do that, always make sure nobody's following in behind you," that type of stuff.

Joseph Carson:
We need to show success in these kinds of things, when people really are making a difference. I don't mean that a company that has become a victim of an attack, or its employees, should be condemned. But those successes... I think the ... one made a really impressive, positive image for me. So Jessica, one thing I'd like to also ask about, because I think it's really important: some of the work that you and FC are doing is amazing, especially with the next generation of kids coming through.

Joseph Carson:
I'd like to talk about that. I've done a lot of work with education myself, and Mike and I continue to talk about how we make kids safe online. Some of the work that you're doing is amazing, educating the kids of the future and really getting them involved. Can you talk a little bit about some of those activities?

Jessica Barker:
Oh, yeah. It's something we're really passionate about at Cygenta, and actually the whole team gets involved. They don't have to; we don't force it on them. But I guess it's part of the ethos of the people that we work with. So everyone is really committed to it, and it's something we find really rewarding personally as well. At Cygenta, we're a partner of the NCSC CyberFirst Schools scheme. We've been supporting that for years and, through that, run loads of activities. We obviously moved it online this year, so we ran a YouTube series where we answered questions and talked about different parts of cyber security, all aimed at school pupils and students.

Jessica Barker:
We just ran a cyber security writing competition. With the schools in the local area, we said, "Do you want to write an essay on what cyber security means to you, 300 or so words, and the winners will get a copy of my book and some stuff like that?" We were inundated. We had over 900 entries in the space of two weeks.

Joseph Carson:
That's awesome.

Jessica Barker:
So we have an amazing team of industry judges who are-

Jessica Barker:
... judging the entries as we speak. Madeline on my team has done an amazing job of whittling down the entries as well. So we do all sorts of stuff. We do some work with TeenTech, which is a fantastic organization in the UK that focuses on tech in general but has quite a strong cyber security stream to it. We did a virtual event a couple of weeks ago for over 800 school kids in the Northeast of England, which is where I'm from, and it was very fun. Yeah, we do all sorts of stuff, and we love it, and we love seeing that impact, seeing kids really start to get what this subject is, so that they're more secure and so that they think about it as a profession as well.

Jessica Barker:
Seeing some of them decide they want to become an apprentice, or that they're maybe going to study it at university, is incredibly rewarding. They're so bright that you just think, "Okay, we're going to be all right." Actually, the kids have got it.

Joseph Carson:
That's great. I mean, for me, it's amazing, and we really do need to make sure that we engage with the next generation of the world's future defenders and protectors, and what you're doing is amazing. Your book, how do people get it? Can you tell us a little bit about it, because-

Jessica Barker:
Sure.

Joseph Carson:
... I think it's amazing. I know how difficult it is to write. So tell us about it and how the audience can get a copy.

Jessica Barker:
Sure. Yeah. This time last year, I was frantically trying to finish it, and I'm really pleased with it, actually. I have written this book essentially for the me of 10 years ago: someone coming into the industry, or thinking about the industry, just getting started and wanting an overall picture. There are so many resources and great books out there, but I wanted something that brings together the human, the technical, and the physical; that talks about social engineering, technical vulnerabilities, cyber war, and what cyber security means to different industries, everyone from pop stars to footballers to big banks and small businesses; what individuals and organizations can do; and profiles of the people in the industry.

Jessica Barker:
I really wanted to tell the stories of people working in different roles. It's something we find with the outreach, and I remember this at school: trying to understand what people actually do in a job can be really tough until you're in the world of work. So yeah, I've had great feedback from people in the industry, executives, and from students and people just starting out. There's a chapter on cyber security for board members. So it's Confident Cyber Security, available from all good booksellers. If anyone reads it, let me know what you think.

Joseph Carson:
Yeah. We'll make sure we put the link in the show notes, and we'll make sure that, region-wise, people can get access to a copy. So absolutely. Jessica, it's been awesome having you on. This is fantastic, and I think it's really important. It's really humanizing security. That's what we need to do: we need to show that technology is here to help us do our jobs, not to replace us, and ultimately it needs to work together with us.

Joseph Carson:
We definitely need more people like you, more people coming into the industry, more people with a different perspective. I'm one of the old retro guys here, and Mike as well. We're old school, and we have a very binary kind of process. But we definitely need more perspectives, more about human behavior, more about how we can make security work for people. How do we make it easier? So it's been a pleasure. Any final thoughts? Anything you would like the audience to know, anything you'd like to change in the world?

Jessica Barker:
To change the world. To change the world in one sentence. If anyone wants to-

Joseph Carson:
No pressure.

Jessica Barker:
Well, I prepared this earlier. No, it's been a real pleasure. I really enjoyed speaking to you all. What I would say is, cyber security is very broad, and I focus on the human side. Luckily, there are people heavily focusing on the technical side, and people focusing on the physical side. The more we can all work together and recognize how broad a discipline it is, the more successful we will all be.

Joseph Carson:
Yeah, absolutely. This episode has been a real pleasure. Again, for the audience: every two weeks, join us on 401 Access Denied for these fun conversations, and please give us your feedback and comments. It's been a pleasure as always, Jessica. Make sure to stay safe, and we look forward to speaking to you all again soon. Thank you.