Episode 73

Cybersecurity Government Task Force with Jen Ellis

EPISODE SUMMARY

Wondering how cybersecurity policy is created and enforced among organizations?

In this 401 Access Denied episode, Jen Ellis, founder of NextJenSecurity and board member of several major cybersecurity companies, chats with Delinea’s Joseph Carson about the ins and outs of cybersecurity policy. You’ll hear how entities like corporations, governments, and even individuals can be impacted by these policies, along with expert insights from Jen.


Jen Ellis: To me, for vendors to make that choice for me by withholding information is really anti-customer. And so, to me, it became a consumer rights issue. This whole thing around security research became a consumer rights issue. And so I got really indignant and I learned all about the CFAA and the DMCA, and I went to the CEO of the company and I said, this is a huge problem. It's affecting our industry, but it's also affecting our society. And he said, "You should go do something about that." And I looked at him and laughed and went, are you insane?

 

Joseph Carson: Hello everyone. Welcome back to another episode of the 401 Access Denied podcast. I'm Joe Carson, the host for the episode. I'm Chief Security Scientist and Advisory CISO at Delinea. And it's a pleasure to be back, and I'm really excited about today's episode, because we have an awesome and amazing guest on the show. And today I'm joined with Jen Ellis. So passing over to Jen. Jen, do you want to give us a bit of an introduction about who you are, what you do, and how you got into the industry?

 

Jen Ellis: Sure. And thank you for having me. It's great to be here. It's always nice to have a chance to nerd out with friends. I like it. I don't know how much other people want to listen to that, but I enjoy it. So yeah. So I'm Jen Ellis. I have a fairly unusual role in the security industry, I think. I sort of sit at the intersection of community, public policy, and sort of awareness. So my goal, if you will, is to create change. I basically have a view that cybersecurity is a societal issue because connected technologies are in everything that we do.

 

Joseph Carson: Absolutely.

 

Jen Ellis: And so the risk that comes with connected technologies relates to everything that we do. And we've seen this play out massively in the past couple of years, through the pandemic as we've seen hospitals being attacked. And just last week we saw the Royal Mail get attacked. We've seen attacks against critical infrastructure sort of ramping up over the past several years. And so my question is how do you create change that reduces risk for everybody on a societal level rather than just being about reducing risk for those who can afford to buy solutions?

 

Joseph Carson: Absolutely.

 

Jen Ellis: And I think the answer to that is you have to create behavioral change amongst large-scale technology manufacturers, buyers and operators. And so that's going to be a combination of the top down, which is where the public policy piece comes in, and then the bottom up, which is where that sort of groundswell of awareness and of peer-to-peer education and of standards adoption all comes in. And I sit sort of in the middle of those things trying to create connection points, I guess.

 

Joseph Carson: In the toughest part, the toughest part of the industry.

 

Jen Ellis: I don't think so. I think I sit in the eye of the storm. So everything where I am is calm, everything around me is chaos. It's definitely an interesting spot to sit in.

 

Joseph Carson: Absolutely. And just to add... So I mean, absolutely, same for myself. This is always my favorite part of the week, recording these episodes, because it does get me to nerd out. It gets me to talk to great friends and people in the industry, and also just sometimes get different perspectives on things.

 

Jen Ellis: Yeah.

 

Joseph Carson: Because when you're in your own rut, you always get stuck in that continuous kind of feed. And for me this is the greatest part of my week, where it's basically allowing me to just get a bit of the view from a different part of the world or a different part of the industry than I would typically, normally get in my day job. So it's always one of my fun parts. And for me, so I've always been sitting... So compared to what you do, I've always been on the techie side of things in the industry. And I remember quite a few years ago, it was probably going back about 10 years ago, even before that, even late 2009 or so, I started getting involved in this little thing called GDPR and stuff like that. And-

 

Jen Ellis: I've heard of that. Yeah.

 

Joseph Carson: And doing the technical reviews. And I was always kind of brought into it from the technical side, not so much the policy side, but I did get a lot of exposure to those who were making decisions. And I'll never forget GDPR because it was always one of those things where I was trying to get 28 countries to work together and come to the same conclusion. And the moment you introduce many different countries and societies, it gets more complex, and more and more complex. So then I remember doing some policies and working with different governments and different agencies, and it was always a one-way street. It was always me sharing information and then not hearing anything back. So can you tell me about some... You've been working in a lot of these areas and advocacy, bringing policies and bringing governments and industry together. Can you share some of the task forces and things that you've worked on that were interesting?

 

Jen Ellis: Sure. Actually. So on the point that you're making, if I can go into story time a little bit and kind of take a step back about how I ended up in policy because I didn't start working in policy and, in fact, I don't think anybody would've predicted that this would've been where I would end up going. And for me, so I was living and working in the US. As you can probably tell from my accent, I'm a Brit, but I was living and working in the US for a security company and I worked really closely with the research team. I had helped champion and build a research function within Rapid7. And there you go, Rapid7 gets some free advertising. And worked really, really closely with the researchers, and they were doing technical research. And our then head of research got threatened with legal action for a piece of... It was port scanning research.

And it was a legit project. I mean, we'd put out findings. We had a WHOIS lookup that enabled you to opt out. And we honored all opt-out requests. It was as bona fide as a research project could look, right? And he got threatened with legal action for three months. And at the end of which he said, "I'm glad that this has been dropped, but it took a huge toll and I don't know how much I really want to do research anymore when I'm dealing with this."

And in the meantime, I had looked into it and found out that he wasn't the only one, that lots of researchers felt that this was the case, that they were getting threatened with legal action under either the US anti-hacking law, which is the Computer Fraud and Abuse Act, or a state equivalent of said anti-hacking law, or the Digital Millennium Copyright Act, the DMCA. And so one of these things was being used basically to frighten off a lot of researchers and threaten them. And I kind of started to really think about it. And it's interesting you say that you come from the technical point of view, and I really don't. I have a really strong passion for the intersection of technology and society, but I'm not particularly technical myself, which I know is a huge crime in security and I apologize sincerely.

 

Joseph Carson: It is not.

 

Jen Ellis: It certainly can be.

 

Joseph Carson: On social media. Yes. I think that, unfortunately, the social media community does come down harsh on those who may not have a technical background. But you don't need a technical... You don't need to be a devotee and you don't need to be a techno-

 

Jen Ellis: I think it depends how you're applying yourself, right, and what you're claiming ownership and knowledge of.

 

Joseph Carson: It is how you're applying yourself, absolutely. I mean, we need to be better at communication.

 

Jen Ellis: I 100% agree. 100% agree. So I was thinking about this and I sort of ended up in this view that security research to me, obviously, is fundamental to the research community. It underpins everything that we do. It underpins all of our knowledge. But I think it's actually a way bigger issue than that, because here's my view: I may not be super technical, but I have every right to make an informed decision to manage my risk and to balance how much risk matters to me compared to other things. So when I'm buying a new phone, I might go, I really care about the best camera. Or I might go, I really care about having the same platform as all of my family so that we can integrate things. But I might go, what I really care about more than anything is not being put at risk and not having my data shared or being hackable.

And that informs the choices I make over which phone I buy. To me, for vendors to make that choice for me by withholding information is really anti-customer. And so, to me, it became a consumer rights issue. This whole thing around security research became a consumer rights issue. And so I got really indignant and I learned all about the CFAA and the DMCA, and I went to the CEO of the company and I said, this is a huge problem. It's affecting our industry, but it's also affecting our society. And he said, "You should go do something about that." And I looked at him and laughed and went, are you insane? And I was like, so number one, I'm not a lawyer. Number two, I've never done anything on policy before and, number three, I'm British and I don't know how the US law system works.

And he looked at me and he said, "Is it the right thing to do?" And I was like, yeah. And he's like, "Then you should go do it." And so I went off and I don't know if you've ever seen that Family Guy episode where Peter goes to Washington, DC and he ends up working with a smoking lobbyist, and he's like in his dress with his little hat and he's doing twirlies. I basically was that. I was in DC, wide-eyed, innocent, going to these government buildings, had no idea what the hell I was doing, getting lost everywhere I went. And I went and met with staffers on Capitol Hill and they would say to me, it's so fantastic to meet you. You're the first person we've ever spoken to from the security community.

And I would be sat there going, if you go to congress.gov and you put in cybersecurity, look at the list of bills that get returned that have some cybersecurity component in them. And I'd be like, you guys are actively legislating on cybersecurity. And this is back in the days when they were trying to pass the Cybersecurity Information Sharing Act, CISA 2015. And I said, you're actively working on encryption backdoors and hack back and prosecution authorities and cyber information sharing and breach notification and cyber hygiene. And you are not talking to people in the security industry. Who are you talking to? And they said, well, we hear from the banks and we hear from the defense contractors and we hear from big tech, but we don't hear from the security players at all. And I was horrified. Horrified. And so it became a mission for me.

 

Joseph Carson: It's not that shocking, because a lot of infosec people are like, they're always concerned about how much they interact with the government because of some of the things that they're doing. So there's always that wall that they've built up over years.

 

Jen Ellis: Completely agree.

 

Joseph Carson: But it's needed to come down. That wall has needed to be demolished.

 

Jen Ellis: And so if you look at the clock, that was probably nine or ten years ago now, right? I don't know, because the weird vacuum of time that was the pandemic means I feel like I stopped counting before the pandemic, and I haven't started counting again. But yeah, so it was probably nine or ten years ago. And then, since then, the landscape has changed so hugely. So at that time, we really didn't have a lot of nonprofits that were doing anything around cybersecurity. And when I say nonprofits, in this sense, what I'm talking about is the kind that interact with governments. So you have trade associations and you have think tanks and you have research divisions and you have all sorts of civil society. And there were very few that were really doing anything about cybersecurity. A few that had focused on privacy were doing a little bit here and there.

EFF, who I know are very beloved in the security community, they were doing little pockets, but they don't really lean forward with policy engagement being their number one thing, although I think they were a bit more active back then maybe. And so now, fast-forward the clock to now, there's actually quite a few of them. It is quite a big section. And one part of that was that the Hewlett Foundation... What are they? What's their full name? No, it's not flowing out of my mouth right now.

I'll have to think about what they're called. But they're a bit like if people are familiar with the Gates Foundation. They're like that, they're the huge endowment and they find sort of worthy causes to give money to and they created an endowment specifically for cybersecurity. They said for this set number of years we're going to fund projects that help advance security on a societal level. And so that meant that a lot more nonprofits could start engaging in it. And that's helped with what I would call kind of the democratization of cybersecurity because before that, the way the governments talked about it, it was all about national security. And national security, as we know, tends to be something that gets talked about behind closed doors, in very closed groups. It's very invitation only.

 

Joseph Carson: Or classified. It's all classified.

 

Jen Ellis: Exactly.

 

Joseph Carson: It's not for public view. You have to go through a whole procedure to be able to get involved or see any of the data.

 

Jen Ellis: Absolutely.

 

Joseph Carson: So it's not something that works on a civilian level.

 

Jen Ellis: No, exactly.

 

Joseph Carson: So it's very kind of, let's say, behind the closed doors, as you mentioned. You don't get to see it.

 

Jen Ellis: And then I think a few different things happened. So one, there was a massive upscale in the level of cybercrime. And that made it something that people had an awareness of. It was something they saw in headlines, it affected their lives. And so that moved into a sort of civilian domain much more as something that people talked about. It also meant that governments had to start thinking much more about how are we going to respond to this and what are we going to do? There was a prediction that I saw last year that cybercrime was going to cost the global economy, last year alone, $7 trillion. And to put that into context, the GDP of the UK is about $2.6 trillion.

 

Joseph Carson: It's bigger than many countries.

 

Jen Ellis: So it's insane. It's insane. And so governments had to respond, they had to come up with a response. And in that year of 2021, when we'd already had such a terrible year the year before-

 

Joseph Carson: The one that didn't exist, the one we can't remember.

 

Jen Ellis: Right, exactly. I feel the same way about '20 and '21. But we had that run of Colonial and then two weeks later HSE and then a week after that JBS. And so within a month, you had three of the biggest, most high-profile attacks, ransomware attacks against national critical infrastructure, or not national, but critical infrastructure-

 

Joseph Carson: Supply chains, very, very big supply chains. Which-

 

Jen Ellis: Exactly.

 

Joseph Carson: You had this domino effect, once one's impacted everything that's downstream-

 

Jen Ellis: Oh, yeah.

 

Joseph Carson: Can no longer do business or function.

 

Jen Ellis: 100%. And then we had Kaseya maybe, I don't know, two months later. And so I think what we've seen over the past few years is a huge sort of shift in how we talk about the stuff and how governments approach it. But what still happens is there's an access problem. And we exist in echo chambers. We talk about this a lot in security, because I think in security people are quite aware of it-

 

Joseph Carson: We're used to working in silos. We work in our own little silo as long as it doesn't impact everything else. Because in the background, we're techies. We like communicating with technology and not so much people.

 

Jen Ellis: I think that's true. I think that's true. But I will say, I've worked in a lot of sectors of B2B tech and I've never worked in a sector that is such a community as this. And actually, when we say that what we actually mean is it's multiple communities layered over the top of each other.

 

Joseph Carson: Yes.

 

Jen Ellis: There's like the threat intel community and the pentesting community and there are... There's blue team and purple team and red team, and they all sort of layer on top and mix together. But it is a community. I have never seen a B2B tech sector that has as many events in its annual calendar as security. For people who exist online, security people love to get together in person. Love it.

 

Joseph Carson: Yep.

 

Jen Ellis: Absolutely love it. And as a community we may have a certain reputation for partying hard and that might factor into why people like it. But I don't just think it's that. I think it's also the fact that people who work in cybersecurity, often, it's a real passion and that ability to come together and learn from each other and do what we are doing right now, which is just nerd out together and exchange war stories, I think is just a really big thing for people in security to do. And so we have this really amazing super active community, but it's an echo chamber. It's a bubble and we all sit in it. And actually, the government sector is its own bubble. It's a whole other bubble. It's a whole other echo chamber. And so when the government comes to being in a position where they've got to create policy on something, they know that they need to talk to experts. They know that. They actually want to talk to experts, but they actually don't know how to break into the echo chamber.

And because we are so bombarded with messages every day, we also don't really know what to pay attention to, to go, okay, that's the thing I should respond to. And we don't know how to break into the government echo chamber. So you've got these bubbles, I'm now describing them as if we're basically sort of terraforming Mars and we're living in our bubbles. And that's a little bit what it's like, right? It's like we're all trying to do the same thing and work together towards a common goal, but we're sitting in these bubbles and we don't actually know how to reach each other's bubbles. And I think that's where things like task forces or working groups or policy groups can really be helpful. I think some of the events that we have in the sector can help give people access. We had the Department for Digital, Culture, Media and Sport, which is the UK ministry that basically sets public policy around cybersecurity.

We sort of call them the "any other business ministry" because they cover such a wide range of things. Which, by the way, if you're trying to cover, if you're trying to follow what they're putting out about cybersecurity, super hard, right? Because you go to the website and you've got to wade through museums and theater and sports. And then even when you get to tech, you've got to wade through 15 internet things and AI and quantum, before you get to cybersecurity. Not that those topics aren't relevant, they are, but they're not necessarily going to be focused on security. So it's actually really hard to keep up with what they're doing. But DCMS came to Black Hat Europe in December and they gave a talk, a sort of 101 on here's what's happening in policy and how you can get involved. And I think those kinds of opportunities, those kinds of moments are super, super powerful, and they give people the opportunity to create those relationships or start creating them. DEF CON did the same thing last summer. They had this Policy at DEF CON track and they had people from all over the world come.

 

Joseph Carson: And they've been doing it slowly at DEF CON, which has been great, because it all started, of course, with the voting machines. It started with bringing in the voting machines. They started actually having senators in Congress speaking-

 

Jen Ellis: I will say, actually-

 

Joseph Carson: At the event as well-

 

Jen Ellis: It started before the Voting Village. It started way before the Voting Village, having the govies come over, and there have been some really-

 

Joseph Carson: Attending in person. Yes.

 

Jen Ellis: Yeah. I mean, I think Congressman Langevin, who is retiring and who we are all going to miss desperately because he's been an incredible, incredibly positive force for cybersecurity in Congress in the US. He has now been out to DEF CON a few times, and I think the first time that he came he actually did a trip with Will Hurd. And that was sort of organized, I think, at the time by the Atlantic Council. And so there've been these pockets of activity and opportunity, but now we're seeing a sort of much greater buildup of momentum and groundswell. And I honestly still think we've got a way to go. There's still so much more that can be done to create those connection points, because I think the vast majority of people who are listening to this podcast are probably thinking, I've got no idea how to get involved.

 

Joseph Carson: Yeah, I remember my first-

 

Jen Ellis: And we have to change that.

 

Joseph Carson: Yeah, I remember my first involvement was I did a lot of advisory. So from my kind of background and what I did with previous companies, I would get a lot of government agencies coming and asking for technical advice or architecture design. So whether it was to do with patch management or whether it was to do with vulnerability assessments or to do with migration processes and so forth. And they would come. And the obvious thing was that I wouldn't know how it was being used or the way it was intended to be used. They would just come in with these limitations of scope on what I was allowed to be informed of, and I would provide my expertise back. It was-

 

Jen Ellis: And then you'd hear nothing back.

 

Joseph Carson: I'd hear nothing. For me, when you were actually talking about these little bubbles, it just got me thinking about, it's almost like the movie Contact, where I've sent these signals out into space, and I'm sitting here with my satellite dish waiting for responses back. And years go by and I never hear a thing. And you get to a point like, was I helpful? Did I... Because we always want the feedback, we want to know what we did. Was it adding value? And that happened so often. And I think I remember one of, there was an event... I attend these kind of CERT events, which are all about incident response, where you hear what worked and what didn't work, and we hear about some of the major incidents.

And it was the first time, I remember, we had a round table, it was law enforcement and it was ethical research, security researchers, hackers, all on the same team. And that was the first time where we started having back-and-forth interaction, where we started hearing what they were proposing. And it was the first time, and it was probably around 2014, I think, at that point. And I think that's where I started seeing slow changes happening, where we started having much more communication. And to the point, one of the things that I think the last time I met with Chris Krebs, I think it was maybe RSA last year. And I said this-

 

Jen Ellis: This is Director Krebs, former Director of CISA.

 

Joseph Carson: Former Director Krebs from CISA. Because that was the first time when Chris went into CISA.

 

Jen Ellis: Yeah.

 

Joseph Carson: And as far as CISA was... It was the first time I started seeing bidirectional communication. And it was the first time you started seeing communication proactively coming out-

 

Jen Ellis: So it depends, right? Depends on the government you look at. So when Chris went in... So CISA was born out of what was NPPD, right, at DHS. For those who don't spend all their time talking to governments, yes, they really, really, really like alphabet soup. So I'm not going to be able to remember what NPPD stood for. It was something like national something protection, I don't know. But it became CISA, which is the critical infrastructure, no, Cybersecurity and Infrastructure Security Agency. And it's part of the Department of Homeland Security in the US. And NPPD becoming CISA happened, I want to say, in about 2018-ish?

 

Joseph Carson: It was around 2018. Yeah, it was around there. It was not long after the elections, right? The 2016. Yeah.

 

Jen Ellis: And not super long before that was when NCSC, the National Cyber Security Centre in the UK, was formed, where the UK basically said, we've got all these different parts of us dealing with security and we're going to kind of pull together and centralize it under this one organization, NCSC. And one of NCSC's core tenets has always been transparency. Which is hilarious, right? Because they come out of GCHQ. And so that, I think, is actually why they made it a core tenet: this is a core tenet for us. It's like, we don't want to have the reputation that GCHQ has of being about secrets. That's a reputation they necessarily have, but we don't sit in the same spot. And so we are going to be transparent about what we're seeing and how things are going and that kind of thing.

And I think a really big part of the reason they did that was because when NCSC started, they identified 16 core areas of activity and basically the private sector was involved in every single one of them. They basically were able to execute and move quickly through partnerships with the private sector. And everything that they were doing involved that. And so they kind of had this view of, well, if we're partnering with you, we have to tell you how things are going. We have to share that information with you. And actually, we live in a democratic nation, we represent the people, we should share that information broadly, and we should improve the general level of knowledge in the security community. When Chris came in-

 

Joseph Carson: That's how you build trust. That's the whole-

 

Jen Ellis: It is how you build trust.

 

Joseph Carson: It's the whole foundation.

 

Jen Ellis: And trust is something that is massively lacking between the public and private sectors in the US, where there has been a sort of cultural norm of the private sector not trusting the government. I think you can actually date it back to when they were paying taxes to the British government. So they've never really got past this view that you can't trust the government. And so whether they're an overseas government or an in-country government. And so Chris, I think, when he started, had a strong point of view of, one, wanting to be able to build that trust and build that collaboration.

 

Joseph Carson: Yes.

 

Jen Ellis: Because he recognized the value of it. And so he, I think, paid very close attention to what NCSC was doing and what was working and what wasn't. And started to... And I'm not saying... One of the things that Chris and I talked about a lot, and I've talked to various people at NCSC about a lot as well, is that you can't really compare CISA and NCSC, because it is an apples-to-oranges comparison. What the UK deals with in terms of scope and scale is considerably less complex than what the US is dealing with.

 

Joseph Carson: And you're not having to deal with the multiple states and legislation across it, because that gets a whole level of... You've got a federal side, which is one thing, but when you have to deal with states and local level, that changes the whole-

 

Jen Ellis: Right. And I mean-

 

Joseph Carson: In the UK, you don't have to deal with that.

 

Jen Ellis: No. And you are talking about a population of sort of 70-ish million versus I think 320 million or something?

 

Joseph Carson: Over 300 million. Yeah.

 

Jen Ellis: So that's a pretty huge difference in terms of scale. And then this relationship with the private sector was a hugely significant difference. And I think what's interesting is if you look at those years, you've seen NSA launch a public-private partnership scheme, you've seen CISA actually really, really lean into it. So even CISA post-Krebs has continued to do that, leaning in on how you build that, and has built various schemes that involve the private sector working with the government. And I think we will see more of these develop over time. I think it's really important that we do. They are very valuable.

 

Joseph Carson: Even the FBI, I've seen a lot of cooperation with the FBI, which is surprising. I remember meeting with... Doing some of the events, and we cooperated together and we exchanged... And their view was like, we've struggled to do this because the FBI is so used to asking questions and listening and not so much sharing. And of course, even within states, within different departments and stuff, that's always been challenging. And they're now getting to the point where it's a whole different type of culture that they're also... That's a cultural change for them, to also start actually communicating and having bidirectional communication. And I think CISA really kind of was the step-up, where they can all look in that direction as the one that was making those steps. But I think they're slowly coming around. I think it's a great thing because it's how we build... It's going back to that trust side of things. It's how you build trust and how you know that value's been created.

 

Jen Ellis: Yeah, I mean, I think you need, in order to be able to do that, you need to have a sea change at the very top to say that this kind of sharing is okay. And I think, actually, if you go back to before, because remember, CISA started in 2018, but the groundwork that was laid to create CISA started far earlier. And so actually, you can trace it back to the Obama administration and some of the changes that were made through the National Security Council and the recommendations that came out of that White House around how you create better partnership. And it's interesting to me that even though the Trump administration massively downsized the level of investment in cybersecurity and took away various roles in the White House administration around cybersecurity, that the good work continued, and it is through the leadership of people like Chris Krebs and others. So yeah, I think-

 

Joseph Carson: Is it important, to your point, from a leadership perspective... I noticed Estonia has been doing this for many years. We've had a CIO and we've had a digital... So they've always had somebody who was the central person who would coordinate everything and have that... It's always been about transparency and communicating with the citizens and making sure that the public was informed. And I've seen other countries starting to have some type of security leadership. Of course, they've had the advisors and stuff in the White House. Is it really time to have somebody, like a CIO type of role, that every government should have, that allows them to have more coordination between multiple agencies? Is this something that we need to have, or is that just adding too much complication?

 

Jen Ellis: It's not that, it's just that the challenge I have is I've never met an organization that defines the CIO role the same way, let alone a government that does, right?

 

Joseph Carson: Yeah.

 

Jen Ellis: The US actually has CIOs, the British government, I believe, has CIOs, but they are treated as very much operational roles, not policy.

 

Joseph Carson: Not from security on the policy side.

 

Jen Ellis: And so it depends what you're trying to achieve by having that role. I think if what you're talking about is a cyber czar, then-

 

Joseph Carson: A cyber czar is more what I'm referring to, somebody who's more-

 

Jen Ellis: So I think that exists in various different places and various different roles. So this year... Not this year, because we're now in a new year, I'm still in the time vortex. Last year, we saw, and actually now that I say I'm in the time vortex, it might not even be last year, but I think it was last year, we saw the introduction of the Office of the National Cyber Director, and we saw the introduction of the national cyber director role, which is basically a cyber czar. And so that's Chris Inglis.

 

Joseph Carson: Yes.

 

Jen Ellis: At the moment. And I don't think it's a big secret that he's planning on exiting, and so it'll be someone else soon. But Chris is there right now, and they only hire Chrises to work on security at senior levels in government. Yeah, I think when you look at the Obama administration, he had a couple of people who had very, very senior roles. So Michael Daniel was one, who was his chief sort of advisor and sat on NSC, the National Security Council. There is a question about what ONCD and NSC, how they go together or how they differ. And I think the White House is still figuring out exactly how to answer that question in a very crisp, articulate way. And some of it, I think, will be revealed by the new national cyber strategy that they're working on and are going to bring out. I can give you my very rough explanation.

 

Joseph Carson: Absolutely.

 

Jen Ellis: NSC is not focused on cybersecurity purely. They are the National Security Council, so they cover any topic relating to national security that needs to be focused on or looked at from a White House level, that incredibly senior level, and they advise the president on it. The ONCD is specifically focused on cybersecurity. And so you can expect that they will work very, very, very much hand-in-hand. But ONCD should, I think, be the ones that go super deep, and then they advise NSC. And then there's also questions about how ONCD sits separately to CISA. I would say CISA's focus is mainly the critical infrastructure piece, but also CISA is very operational. They're about doing things, taking action to protect, either by driving better preparedness and awareness.

 

Joseph Carson: Yeah, even the best practices for ransomware and-

 

Jen Ellis: Exactly.

 

Joseph Carson: The major vulnerabilities, they've become almost like the alerts for CVEs now.

 

Jen Ellis: Right? The KEV list, the Known Exploited Vulnerabilities list, which, if you guys are working in cybersecurity and you're not aware of KEV, go check it out. Known Exploited Vulnerabilities. It's getting longer and longer, start now. But they're looking at that, and they offer free services for certain types of organizations. So they're really looking at the operational piece with a heavy focus on critical infrastructure. Whereas ONCD is sort of across everything on a strategy level, and it's a big old government, they have lots of different pockets of things.
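
For listeners who want to act on that right away, here is a minimal sketch of pulling the KEV catalog programmatically and listing recently added entries. The feed URL and the field names (vulnerabilities, cveID, vendorProject, dateAdded) are assumptions based on the public JSON feed CISA publishes, so verify them against the current schema before relying on this.

```python
# Minimal sketch: download CISA's Known Exploited Vulnerabilities (KEV) catalog
# and print entries added in the last N days. Feed URL and field names are
# assumptions -- check the current CISA schema before depending on them.
import json
import urllib.request
from datetime import date, timedelta

KEV_URL = "https://www.cisa.gov/sites/default/files/feeds/known_exploited_vulnerabilities.json"

def recent_kev_entries(days=30):
    # Fetch and parse the KEV JSON feed.
    with urllib.request.urlopen(KEV_URL) as resp:
        catalog = json.load(resp)
    cutoff = date.today() - timedelta(days=days)
    recent = []
    # Keep only entries whose dateAdded falls within the window.
    for vuln in catalog.get("vulnerabilities", []):
        added = date.fromisoformat(vuln["dateAdded"])
        if added >= cutoff:
            recent.append((vuln["dateAdded"], vuln["cveID"], vuln.get("vendorProject", "")))
    return sorted(recent, reverse=True)

if __name__ == "__main__":
    for added, cve_id, vendor in recent_kev_entries():
        print(f"{added}  {cve_id}  {vendor}")
```

A simple use of a list like this is to cross-reference it against your own asset or vulnerability inventory, so the entries CISA flags as actively exploited get patched first.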

 

Joseph Carson: So what they were looking at some-

 

Jen Ellis: They all work together.

 

Joseph Carson: Yeah. So would they be looking at something similar to what, for example, the EU came out with, basically the ratings for cybersecurity for certain IoT devices, such as also what you have in the UK, where they were coming out with the policy about not having default weak credentials, right? Is that what they would be responsible for, or?

 

Jen Ellis: Yeah. Yes and no. So the way that it works in the US is, if you are looking at creating policy, you're looking at Congress. Congress creates legislation, right?

 

Joseph Carson: We'll just say the law.

 

Jen Ellis: Right, they create the law. But then you can have laws that basically get created by Congress that then say X entity is responsible for figuring out the details. And you see this happen. So for example, the US passed a law called the IoT Cybersecurity Improvement Act. And what it was basically doing was saying, hey, we recognize there's a lot of risk around IoT, we want to be able to change things. We're going to use the immense buying authority that the US government has as a huge entity and say that if you are a US government entity, when you buy IoT, you have to look for certain things. And because we want the law to stay up to date and not time out really quickly, and because we recognize that as Congress, we're not experts in this, what we're going to say is that NIST, they are going... NIST, who are basically the president's advisors on tech, the very technical aspects of what should be done-

 

Joseph Carson: Defining the standards.

 

Jen Ellis: Standards bodies. Yeah.

 

Joseph Carson: Standards. Yeah.

 

Jen Ellis: Yeah. That NIST is going to come out with what it is, and they're responsible for keeping updated what it is that these government agencies should be looking for. But we, as Congress, will pass the law that says governments have to do this, right? And so you've got two roles: you've got Congress passing new law, and you've got then specific sectoral or relevant agencies that have a specific role who then can help with the regulation. So say that you are the FDA, you are in charge of coming out with what the requirements are for medical device security, pre- and post-market. So you've got those bits. And then the last area is the White House and its administration can issue rules within the framework that already exists. So that's where you get executive orders. Executive orders can't pass new laws, but they can leverage what already exists.

And part of that also means that you get, like, CISA can do something like issue something called a binding operational directive, a BOD. And that basically is them saying, we, as the cybersecurity sort of guardians, operational guardians, can go to any civilian federal agency and say you must do X, you must have a coordinated vulnerability disclosure program. You must have DMARC. These are all things that we have seen. One of the ones they're working on at the moment is around supply chain. They pushed that out a while back and now we're seeing some of the stuff come out from it. So that's sort of how the structure works there.

But it differs country by country, right, and who does what. In the UK it's a different setup. And I think the key here is to make sure that you always have... I think you always want to have the people who are creating the policy, the people who are in the government as the technical leads. And then the third branch needs to be the people who work with the stuff day in, day out, either engaging directly or through advocacy initiatives like task forces or nonprofits or whatever.

 

Joseph Carson: Which is one of the next things I was going to ask. All of this is cross-country, and cybercrimes are... That's the thing, when we work in cybersecurity, it's not within national borders. Very seldom do you get criminals basically operating... You may get certain elements of it, doing tax fraud and credit card fraud, because of course that has to have some local element at some point. But when you get into things like ransomware, which is almost all cross-border, you're dealing with criminals in other countries where it may not be considered a crime in those countries.

And one of the things when I also look at things like the complications as well is, similar to the EU, one of the things that tends to happen is, for example, the EU AI Act, which is all about artificial intelligence acceptable use and so forth, and policies and strategies. We have the Tallinn Digital Summit, which is every year, which is all countries coming together. And what they end up finding is, in a lot of cases, whatever country has made the most progress on those policies, everyone else comes and wants to hear, can you share the policies with us? And they just clone them and modify them slightly to work in their own societies. So sometimes that accelerates it, when you have some of that cooperation and they're willing to share the policies and allow them to be modified. But it seems... how are things like the joint ransomware task force working?

 

Jen Ellis: But the thing is, right, we want to see this happen because what you don't want... What we want to support is a continuation or expansion of the global digital economy, because it actually benefits everybody and it allows countries that have traditionally perhaps had a harder time creating opportunity to create opportunity. So we want to see this continue. And in which case, for global entities doing business around the world, they don't want to have 57 different things they have to learn about. In the US, there are 57 different breach notification rules for each of the states and territories. Great that everybody has one. And actually, for the most part, the core elements are very similar-

 

Joseph Carson: And you end up choosing the most-

 

Jen Ellis: They're different enough that you have to go and look at all of them, or at least look at each one for the areas where you operate. And wouldn't a federal one be nice, because then it would stop that from having to happen?

 

Joseph Carson: Absolutely.

 

Jen Ellis: So we have the same thing in Europe, where we have this huge complexity over having multiple member states, and there's a difference between a directive versus a regulation. Directives are open to interpretation in country, whereas regulations aren't. So there's a whole level of complexity there. You had mentioned earlier on about the UK IoT thing, and what was interesting with that was, yes, the UK developed a code of practice for consumer IoT, and it's 13 principles that are, I think, pretty... They're pretty common sense principles for those who work in security and spend a lot of time on it. It doesn't feel like a huge overreach to me. It feels like these are very solid principles. So after the UK developed that, they went and worked with ETSI, which is a European standards body. And they took those principles, the 13, and they basically built them out and refined them and talked more about how you would do them. And they then made it what's called an EN. So it's E-N... Jesus, 3-0-3-4-6-5 or 6-4-5. Numbers are hard.

 

Joseph Carson: They're always so similar. So you're always going to mess them up into something else.

 

Jen Ellis: What the EN means is that basically each of the member countries of the European Union basically then takes it on board in some way-

 

Joseph Carson: And they ratify it, some kind of local version depending on what laws.

 

Jen Ellis: So then as the EU looks at building legislation, they have this standard now... It's voluntary, ETSI standards are voluntary, but they have it to look at and to draw on. And then what happened with that was India adopted it and then Australia adopted it. Now the UK has actually built the Product Security and Telecommunications Infrastructure Act on top of it, where they're not going to take all 13 principles, but they're taking the first three, as you said, one of them is don't use universal default passwords. And so you see these things sort of propagate, right? And that's what you want to have happen, because it actually does simplify the situation for those who are trying to adhere to them but also do business internationally. So that's very good. So I'm sorry, you asked about the task force and you've asked me a couple of times and I keep diverting.

So we formed the ransomware task force at the end of 2020, because we were sitting around, and after we'd all learnt to make bread and our own cheese, we thought, what should we do next? I honestly don't know who these people are who had time to do that stuff, but I'm very envious of them. In any case, actually what happened was the Institute for Security and Technology, which is a West Coast think tank in the US that sits at the sort of intersection of technology and national security, saw there were a lot of ransomware attacks against hospitals. And I don't know if people know, but there's a pandemic.

And so they basically went, that elevates this to being on a level of national security. And they said something needs to be done because what we're doing isn't working. And so they said, we're going to create this task force. And what was really savvy is, if you think about the timing, they looked at the situation. They said, we've had a new administration come in this year. They've had their hands really full with everything going on. But in spring of next year, they've already signaled that they want to do more on cybersecurity, which is not hard because the bar had been set very low.

And in spring of next year, they're going to be looking at what they should set as their priorities. And if we can get a set of recommendations to them in their hands at that point, then we've got a chance of them looking at them and factoring them into their thinking. So the guy who put it together, Phil Reiner, pulled together a group of a hundred of his closest friends, and it focused on four main work streams. So how do you deter and disrupt ransomware attackers, and how do you prepare... How do you help organizations prepare for and respond at scale? So this isn't about issuing the advice of like, hey, you should use multifactor and you should have offline backups.

This is about what governments can do. It was all about advice for governments. And the task force, Michael Daniel said in the launch that we had all sprinted a marathon, and it did feel like for all of us, like we had a whole other job, like full-time job while we were doing it, because we created the thing in a sort of three-month window. But we came out with these 48 recommendations aimed at governments around the world. And it just so happened, completely coincidental, the report landed and a week later Colonial happened. And as we've already talked about, Colonial, HSE, boom, boom, boom.

 

Joseph Carson: It was a whole kind of domino effect at that time. And it was where, of course, criminal organizations had started ransomware as a service, and you had a lot more, let's say, criminal organizations out there who were moving into this kind of crime and basically hiring groups of engineers in order to get some type of financial participation in the ransomware side. And a lot of organizations, as they've trended toward digital transformation, have very much depended on digital in order to conduct their business. And absolutely the impact is so very... Interestingly, last year has been a quieter year, I would say, from my side on this response, I've seen a bit of a decline. I don't know if it's to do with sanctions or to do with the devaluation of cryptocurrencies in some regards, or?

 

Jen Ellis: If you want my opinion and my opinion is unpopular because everybody would like to declare this a victory, but I think it's got a lot more to do with Russia invading Ukraine than it has with anything else.

 

Joseph Carson: That's also my opinion as well is that I believe that they... Russia is a safe haven for many criminal gangs and they've been-

 

Jen Ellis: So is Ukraine.

 

Joseph Carson: And Ukraine as well.

 

Jen Ellis: A lot of their hackers are busy with other things right now. And actually, but it's not just the fact that their hackers are busy with other things. It's also the fact that, I don't want to sound hyperbolic, but we actually do kind of stand on the precipice of the potential for there to be a pretty major war. And actually in that context, you have to remember that one of the biggest things that allowed these groups to thrive for so long was the fact that the governments that they operated under either couldn't prosecute them or didn't because it actually-

 

Joseph Carson: Benefited them.

 

Jen Ellis: It benefited them. It furthered their political aims, right?

 

Joseph Carson: Because it's the cyber mercenaries, it's the cyber-

 

Jen Ellis: Absolutely.

 

Joseph Carson: Basically, you're holding them on a leash and ultimately you're allowing them to carry out criminal activities and be financially rewarded for them. You might even be getting paybacks, you might be getting financial back payment for that. But ultimately, when you want something done for you, then basically you're giving them campaigns to carry out. And this is where the whole cyber mercenary thing is, we know that a lot of those well-known criminal ransomware gangs have been also carrying out other activities as well, whether they're giving access or-

 

Jen Ellis: And so if you think about the run-up to the invasion, at the same time that we were seeing tanks en masse at the border, we were also seeing Biden say to Putin, you've got to stop attacking our critical infrastructure or else. And he delivered that message not once, but multiple times. And then we saw this thing where Russia made some arrests around REvil and there were some take-downs-

 

Joseph Carson: It was a PR show.

 

Jen Ellis: Yeah, we don't believe that. But here's the thing, the reason it matters isn't because it's going to meaningfully change the landscape. You're right. They were scapegoats. They were sacrificial lambs. It isn't because it's meaningfully changing the landscape. It's because of what it's signaling, about the fact that Russia wanted to demonstrate that it was willing to play ball, which shows that actually the attacks against Western infrastructure have a perception piece. It plays into the way that the dialogue around these attacks was developing. And so I think when you are in a situation where escalation can happen through just sort of shoddy execution, and escalation can be profoundly impactful, then I think you say to people, hey, you know what you are doing, maybe focus a little bit more on Latin America for the time being and not so much on people who are part of NATO. And that's what we've seen. We've seen a massive upscale in the level of attacks against Latin America.

 

Joseph Carson: Costa Rica was the big one, but they had, I think it was the pension scheme that was hit, if I remember?

 

Jen Ellis: Their entire... Not just the pension scheme, they got absolutely hammered, and it has massively impacted them. But I mean, not just them. I was at an event the other day and there was a chap there from the National Bank of Peru talking about being hit. I mean, just a huge effect in many places in Latin America. And so I think it isn't that it's gone away, it's that the way it's being handled has shifted, for sure.

 

Joseph Carson: Yeah, absolutely. So we're just going on to the policy side. What's the direction forward? Where do we need... Where's the ideal place that you see us getting to when it comes to cooperation?

 

Jen Ellis: Yeah. Yeah. So I mean, I have this thing about the fact that I think that we're in the second generation of cybersecurity as an industry. And I think that every generation looks at what the generation before did and thinks about how it can build its own path forward. And the reality is context changes and what you do in reaction to that context changes. And the reality is that cyber risk is now too big and too costly and too impactful for our industry to remain the way it has been, where it's been niche and largely actually unbothered because people see us as so niche and so complicated, and they don't want to get involved. That's not going to fly anymore. What's going to happen is things are going to come our way. We are going to end up with professionalization of some kind being put on us. We're going to end up with regulation. These things are going to happen, and it's going to impact our industry very profoundly.

 

Joseph Carson: We're going to move into the whole financial industry type of really kind of strict boundaries and checks and balances. I guess that's... We've seen regulation coming in certain areas, but I think it's going to become-

 

Jen Ellis: And look, we like to pout and whine about it and say it's so unfair and no other area of technology has it. I mean, frankly, I think we won't be the only area of technology that has it. I think AI will probably get it. But the reality is this: actually, professionalization is not uncommon. Lots of industries have it, and it all comes down to the risk associated with what you do and the level of complexity. And so I kind of take a view of, if it's good enough for doctors, it's probably good enough for us. I don't know why we think we're special in that regard, but what we have as an opportunity is we can either be part of the process or we can not. And that means that we as a community, we have to stop the infighting. We have to stop... People in security come from a background of pointing out problems. We're not always great at pointing out solutions or suggesting solutions or having the patience to be part of creating a solution.

 

Joseph Carson: And we're also not great at listening many times as well. The other side is that we do have to listen to the industries and the businesses in order to know how we can help rather than-

 

Jen Ellis: Well-

 

Joseph Carson: Because we've become very much... To your point, we're the enforcers: here's what we need to be doing. But sometimes we're not as flexible as well.

 

Jen Ellis: Yeah, I mean, going back to that original thing that I was saying at the beginning about it's such a crime to be non-technical in security. We have a habit of having a view that if somebody has different experience to us, that makes them intrinsically less knowledgeable rather than just differently knowledgeable. If I go and talk to somebody in the policy sphere and they're an expert on security, what am I doing there? I want them to be an expert on policy, which is not something that I am an expert on. I'm not an expert on the mechanics of how you create law. And I think probably most people in security aren't really experts on that. But what they are expert on is the piece of security that they work on or whatever they've got experience of. And if they can bring that to the table and we can get other people who have expertise in law together, then we can create something potentially very positive.

So when it comes to looking at things like certifications for our industry, we're the people who are going to be affected by them. So we are the people who should figure out which ones we like and which ones we should take forward or it'll get thrust upon us. So in terms of where the future's going to go, my view is we all, everybody who works on security, has a choice to make about whether they want to play a role in determining their future or not. And I think if you do, it doesn't have to be massive changes. You don't have to suddenly decide you're going to do my job. Please don't, I would like to be able to continue to do my job.

I don't want to compete with everybody. But what you can do is go, what are the little things that I can do? How do I communicate with non-technical people? How do I communicate with people outside security? How am I teaching people to fish rather than... And I'm not talking fish with a Ph-, don't go that far, that's too far. But how do I, rather than just being home IT for everybody in my family and rolling my eyes and tutting and making disrespectful comments.

 

Joseph Carson: How do you get the society around us more secure, more thinking about it-

 

Jen Ellis: You said it at the beginning, you said we need better communication.

 

Joseph Carson: Yes.

 

Jen Ellis: That's it. Like how do we do a better job of having empathy and communicating with people in a way that rather than them going, oh, it's too complicated, it's too technical and I don't want to know about it. Instead, they go, oh, that matters to me. Tell me more. Because it matters to me. And that's the thing I think everybody can play a role in.

 

Joseph Carson: Yeah, I think it was well said when you talk about Mikko Hypponen, talking about how we're no longer... We are in a society. We're no longer just protecting computers anymore. We're protecting, basically, society. And I think that's the mindset we have to get into, and start thinking beyond the scope of where we're coming from in the past. That what we do has a much greater societal impact today than it's ever done before. And it's not just a computer at the end of it, it's a service. And then that service could be keeping people alive. It could be providing electricity to people's homes. It could be funding the whole financial industry. And a lot of it, we have to understand that it's no longer just a computer or a server. This basically has societal impacts as a service. And we are there keeping the lights on, in many cases, and we have to look beyond the code and look beyond our techie kind of... I'm a geek at heart, but I always have to remember that, at the end, this is providing some type of function or some type of service.

 

Jen Ellis: I could not agree more. And I think some people listening to this might feel like, well, I just want to do this as a job. I don't want that responsibility on my shoulders. And fair enough. That actually is fair, you can't judge people for that. Fair enough. But the way I look at it is, isn't it also an extraordinary privilege to feel that you personally can have an impact in society in that way? And actually, yes, it's a heavy mantle to be a protector, but it's also... It's an extraordinary thing to know that that's what you do with your time is that you protect people. I think it's incredible. So I basically say to anyone who spends their time doing that and gives their time up to protect other people, that you guys are awesome and we appreciate it and keep digging in. Find the ways to make people care about what you do.

 

Joseph Carson: I think that's a fantastic note there to start closing the podcast on. Jen, it's been fantastic talking to you, and again, I've learned a lot of things as well. As I mentioned, this is my favorite part of the week, is getting to talk to awesome people. So many thanks for spending the time. Any final words of wisdom that you would like to-

 

Jen Ellis: No because we got to... I've talked too much already and I've taken too much time, so I'm just going to leave it where it is and say thank you so much. This was awesome. Thank you.

 

Joseph Carson: Thank you, Jen, for being on the podcast. It's fantastic. And for the audience, really the perspective here is that, at the end, it's all about being transparent and working together and finding ways that we can basically make sure that we communicate and have a common understanding of the impact we have on the world's digital society. So as I've mentioned, Jen's been awesome. For everyone on the podcast, again, tune in every two weeks for 401 Access Denied. I'm the host of the episode, Joe Carson, it's been a pleasure. Stay safe, take care, and I'll see you again soon.