Episode 8

International Cyber Warfare: How Real is the Threat? Part 2

EPISODE SUMMARY

Josh Lospinoso joins us again for part 2 of our international cyber warfare conversation with Joseph Carson and Mike Gruen. This time we focus on zero-day vulnerabilities and the moral complexities that need to be considered. How do you train personnel to know where the line should be drawn and when they should hold back some of the cyber tools in their arsenal?


Mike Gruen

Mike is the Cybrary VP of Engineering / CISO. He manages Cybrary’s engineering and data science teams, information technology infrastructure, and overall security posture.


Intro:
Invest in yourself today with our Insider Pro product, which gives you the career path to reach the next step in your cybersecurity journey. Join today on cybrary.it using the discount code, podcast.

Mike Gruen:

You're listening to the 401 Access Denied podcast. I'm Mike Gruen, VP of engineering and CISO at Cybrary. Please join me and my cohost, Joseph Carson, chief security scientist at Thycotic, as we discuss the latest news and attempt to make cybersecurity accessible, usable, and fun. Be sure to check back every two weeks for new episodes.

Joseph Carson:

And it also gets me into my next topic, which is a bit of a controversial one, and one everyone's always talking about: the use of zero days, cyber weapons, and cyberattacks. In the past, I spent 11 years at Symantec working on patch management, essentially helping patch zero days once you had some type of configuration or hardening in place.

Joseph Carson:

And it also got me thinking: when you're using a zero-day, you mentioned burning it. If you're willing to be noisy and you have to move fast, sometimes you burn something quickly. And it gets into what the risks of zero days really are. Because when we look at zero days, let's say it was a Windows vulnerability.

Joseph Carson:

And two, if you look at the masses, how many people in the world are using Windows? Where is the mass population? Is your target mostly using those operating systems, or is it just a minute, let's say, campaign target? And it gets into where we really sit: When is it okay to use zero days? When should it be responsible disclosure? When is the risk too high?

Joseph Carson:

Let's say, for example, it was Facebook. Most of the population using Facebook is in Western countries. If I use a zero-day against a target, then at the same time, once that zero-day is burned, it gives the target the ability to turn it around, weaponize it, and shoot it back again. Unlike conventional kinetic weapons, it's like ping pong: when you hit the ball over to one side of the net, that ball can be turned around and hit right back at you. So what are your thoughts on the use of zero days when it gets into responsible disclosure, the impact, and the risk assessment of when it's okay to use them?

Josh Lospinoso:

Yeah. It's another really complicated topic. I mean, I think first off, if you are a legitimate offensive cyber actor, you should try as hard as possible not to have to use exploits. You should lean on your unfair advantages as, say, a government agency to find other ways in. You know what I mean? The big one is credential stealing, right? If you can just steal credentials and masquerade, do that. That's way less noisy. You aren't compromising any kind of system at a fundamental level. Basically, use the principle of the least invasive technique you can to achieve your objectives. Right?

Mike Gruen:

Also, regarding the ethical moral issues around disclosing that to … .

Josh Lospinoso:

Exactly. Exactly. Right. So I think generally speaking, that's the way it goes, and I don't think that's controversial. I think that definitely is how things go. Where you get into really sticky territory is, okay, should we keep a treasure chest of really critical vulnerabilities? If you're up against an adversary who's got their shit together, and you can't masquerade, or they've got really stringent controls around the things that you need to get into, do you keep a sledgehammer around so that if you've got to break into something, you do the risk-reward calculus? Yeah, it's that important. Let's use the zero-day, right?

Josh Lospinoso:

That's where we get into territory where not everyone's going to agree. Right? Where my feelings are is that I love the way that the security community keeps us safe, right? And I love that we have responsible disclosure. We've changed the mentality so that manufacturers and people that publish software no longer have a litigious response to you disclosing something. It's, "Oh, thank you for doing that. Here's some money."

Josh Lospinoso:

We've come a really long way. My default position is: disclose the vulnerability, keep us safe, right? Because if you talk to someone in government, someone who's lived in government their whole life and all they know is cyber operations, they're going to be focused on their little view of the world. And they're going to say, "No, that RCE against Windows Remote Desktop is super juicy. We need that. We'll pay for that. Keep it on the side. Please don't disclose that." Because that means a lot to them in their operations.

Josh Lospinoso:

But you think about the millions of people that are now all of a sudden at risk, right, because maybe they live under a dictatorship, or they're journalists and they're getting suppressed, right? Is the juice worth the squeeze there? I don't know. I go back and forth on it because I've seen both sides. You know what I mean? I don't really have any good answers.

Mike Gruen:

The same people who have no problem keeping a nuclear arsenal around follow the same logic. I mean, it's the same thing: we have this deterrent, this sledgehammer thing, that we all know we should never use, but yet we don't want to get rid of it. I'm sure if it was 100% up to civilians, we'd probably be doing something different than what we do.

Josh Lospinoso:

Yeah. Sorry. I was just going to say the additional complexity there is that I think making a nuclear weapon that you can put on an ICBM is orders of magnitude harder than finding some of these vulnerabilities. You know what I mean?

Mike Gruen:

Oh yeah, absolutely.

Josh Lospinoso:

Yeah. Yeah.

Joseph Carson:

It's the capability. I don't think you need to store thousands of these things around; it's the capability. Knowing that you have the capability is sometimes enough of a deterrent. And knowing that, if you needed to, you could have hundreds of them in a short amount of time. I think it's more the capability versus having it sitting around. Because having it sitting around just means that you have to maintain aging, let's say, very unstable, rusting stockpiles, a very costly infrastructure to do so. I think knowing that you have the ability to produce something quickly is enough of a deterrent, when you get into the really big kinetic-impact side of things.

Mike Gruen:

I think that also goes with what Josh was saying earlier about using your unfair advantage, right? As a nation state, I imagine if we really needed to find a zero-day quickly, there's no reason to stockpile. We probably could find one pretty quickly.

Josh Lospinoso:

Or find another way in. Use human-enabled operations, right? We're a government; we can pay people off to do stuff. You know what I mean? There are other ways around it. And I just go back to, I mean, we've seen a couple of really devastating worms in the past couple of years that ostensibly came from a compromised nation state actor. You think about EternalBlue, that remote code execution vulnerability. Having that on the shelf would be, I imagine for whatever agency it was that got compromised, incredibly valuable.

Josh Lospinoso:

If you've got to crack into a hard target, and there's no other way. You've got to get into that domain controller, or the IT administrator's workstation, or whatever. And you've been sitting there for three months, and there's some critical stuff going on. Yeah, bring out the hammer; we've got to get in there. I can see how the temptation is there. But look at the damage, right? Was the juice worth the squeeze on that? I mean, I don't know. That's not for me to say.

Joseph Carson:

So its value can truly decay over the space of only a few months. The way I look at it is that at some point in time you have to let go of things from the past. You have to keep moving forward. And for those exploits sitting on the shelf, we should have a shelf life. I think it should also be based on a risk-assessment approach: for this particular exploit or zero-day, there's an 80/20 rule. Maybe 80% of my own side, the people I'm protecting, are exposed to this, and only 20% of my targets. When do you make that decision: okay, this has reached its shelf life? You'd prefer "nobody but us," but how do you know no one else has it until it's been used? Which is the … .

Mike Gruen:

And the longer it sits around, the more likely it is that somebody else has discovered it anyway.

Josh Lospinoso:

Exactly.

Mike Gruen:

So it's probably better to disclose.

Joseph Carson:

Correct. So there should be an expiration, a shelf life, for those. And you should always be looking for the next one, the one you find next, and keep moving forward, especially to later versions, and keep it updated. And this even gets into, I remember there was a situation a few years ago. It was in DC, and it was all about private companies doing this as a business and selling it to governments. This is even where my ethical, moral compass gets strained.

Joseph Carson:

I hate the idea of a private company whose business model is finding zero days and selling them to governments, because they're not going to sell to one government. They're going to sell to many. And overnight, those governments they sell to could turn out to be a foe. So where is it okay for private companies to be involved in this type of activity? This gets into export compliance, because there are a lot of companies doing this, and there is no export compliance for these types of activities. For zero days there is no export compliance.

Josh Lospinoso:

It's complicated. Yeah. I mean, look, private companies are motivated by profit, right? And that's neither good nor bad. It really depends on what the company's doing. And I mean, I think we're seeing things like the Zero Day Initiative and bug bounties starting to get us toward a place where people who do this kind of work of reverse engineering and vulnerability research can make a really good living exposing vulnerabilities and getting them patched. But it pales in comparison to certain kinds of ...

Josh Lospinoso:

If you have a full-blown jailbreak for an iPhone, from the Safari mobile browser all the way through to the kernel, it's over a million dollars, right? That's probably what a government would be willing to pay for it. I'm unaware of any private company ... I don't know what Apple would pay you for that, but I'm pretty sure it's not over … .

Joseph Carson:

Less than a hundred. I think their maximum has been 10, 25, maybe 50K. So it becomes a private, government type of customer. The thing is, those companies don't just want one customer. They want multiples. So that's where you get into the challenge.

Mike Gruen:

Right. … to Apple and three government agencies, three other governments.

Joseph Carson:

Well, for me, that's actually perfectly fine, because it creates an equal race. It's an equal playing field where everyone has the same starting point, if they sold it to all of them at the same time.

Mike Gruen:

Right.

Joseph Carson:

It gives everyone an equal opportunity. For me, I'm perfectly fine with that kind of disclosure, meaning that we're racing: I'm racing to find a fix and patch it, and others are racing to exploit it. So it gives an equal opportunity. That's what responsible disclosure sometimes is about: giving everyone an equal opportunity, rather than attackers having the knowledge and abusing it before everyone can actually harden or decide and take a risk-based approach. And it gets into-

Mike Gruen:

… responsible disclosure is making ... My opinion is that you're trying to give the defender a little bit of a leg up. You don't release it all at the same time. You say, "Hey, this is the thing. I wanted you to be aware of it." And if they don't take care of it within a reasonable period of time, then yeah, you do have to force their hand. But you want to give them at least some period of time to address it.

Joseph Carson:

Absolutely. … what do they really give. Notifying the vendor first and giving them the opportunity to fix it. But at the same time, when it gets into what we were talking about, selling it to governments, I think it's important to give everyone an equal playing field at that point.

Mike Gruen:

Right.

Joseph Carson:

Should we end up with, let's say, a safety net for security researchers? Should they get some type of protection, like whistleblower-type protection, for this as well? Because I remember doing a panel a couple of years ago, and it was an interesting panel: we had security researchers and pentesters on one side, law enforcement on another, and companies on another.

Joseph Carson:

The outcome of the scenario was interesting, because law enforcement said, "If you, let's say, find an exploit or some type of vulnerability in those companies' infrastructure, tools, or solutions, whatever it may be, and you do it in a way that you might be exposing yourself to some legal issues, then disclose it anonymously."

Joseph Carson:

And then the other side of that: it's your hard work, and you don't get recognized for it. And the second part was that when those companies find out that you did find the vulnerability, and that they're potentially putting many customers, companies, people, and citizens at risk by not fixing it, they were suing the security researchers for actually doing that activity.

Joseph Carson:

And there are a lot of cases right now. There are several companies that are actually suing security researchers, and even journalists, for reporting it. To really move forward, we should give some type of protection to those who are not doing it from a criminal perspective. They're not profiting from it. It's more about getting to that equal playing field, where we all have the ability to know about it and decide ourselves to fix it. So, Josh, do you have any thoughts on that, and Mike as well?

Josh Lospinoso:

Yeah, definitely. So it's certainly a scary thing, right? I've worked a little bit with the EFF's Coders' Rights Project. I mean, these guys are amazing. They give pro bono legal advice to people that disclose vulnerabilities. So I've worked with them a couple of times on some things, and that's given me a lot of comfort that I was on firm legal footing in disclosing vulnerabilities, especially ...

Josh Lospinoso:

In one of them, I found a vulnerability in the DNC's donor database. You could dump their entire donor list. You could unsubscribe people. It was pretty horrible. It was a URL enumeration thing. Obviously, given their history, I was really concerned about what was going to happen if I disclosed this thing.

Josh Lospinoso:

So I think number one is just having resources available for security researchers that are doing good work, to make sure that they feel like they're on firm legal footing when they go in and approach people. But then, of course, those lawyers are giving legal advice about US code. And so we need to make sure that US code is also unambiguously and firmly outlining what is and is not allowed. Because another part of it is, especially when you're dealing with ...

Josh Lospinoso:

If you're dealing with a binary or some code that you have in a mocked-up environment, that's one thing. But oftentimes when you're dealing with SaaS vulnerabilities, you're dealing with a live server that's in production. And so when you're in the process of discovering what's going on with that service, there's always something in the back of your mind like, "Where's the line? Where's the line here where I'm good?" And, "Oh, wow, okay. I found a SQL injection and now I just dumped a bunch of customer data. Is this the line? Where's the line?"

Josh Lospinoso:

So I think we need to go a long way in illuminating that for people. And the challenge is that it's really hard to explain this stuff to the people that are experts in crafting that code. So I don't know what the way forward is there. Yeah. Mike, I'd love to hear what your thoughts are.

Mike Gruen:

It's just a tough issue, right? I mean, my thoughts are all over the place. But yeah, I think about our own disclosure program that we've released. Yeah, I do want people to participate. I want to know about those vulnerabilities. And to your point of what's too far: we were actually having this discussion the other day. We have the platform, and people can launch virtual machines and do labs and all these things.

Mike Gruen:

Well, that actually runs up our bill. So at what point does doing vulnerability ... I appreciate you letting me know, but I also don't want to have our AWS bill, or our vendor bill, go up and put us out of business because you didn't realize the impact of this thing that you're doing to test. And so, I think there's a lot of ... Yeah, it's a difficult, difficult area.

Joseph Carson:

To your point, sometimes it only becomes an incident when there was actually some type of financial impact as well. And when there is some financial impact, I go back to The Cuckoo's Egg, which I always love reading, and there was, I think, a 73 or 77 cent discrepancy, which actually became a criminal investigation, even though it was such a minute cost.

Joseph Carson:

And it gets into, Josh, what you were saying. One of the things I remember from doing a lot of penetration tests in the past: I could walk up to the door and the door is open. I could look through the door. I could stand there. I could watch people coming in the door. But the moment I put my foot inside the door, that's when that gray legal area became the challenge.

Joseph Carson:

And that's where you can sit and watch whatever data's going in and out. The door is lying right open, and other, let's say, criminals could decide to go in through that same door. But the point is that you want to know: what's the risk of that door being open? What's behind it? You end up getting in and you start looking, and now it's a database, and then you've got all the sensitive data. So now you know the risk.

Joseph Carson:

But at that point in time, to identify the risk, you put your foot in the door, and this is where it all gets gray. The EFF, I think, is definitely fantastic to have as legal backing for a lot of security researchers. But I do think there needs to be some type of protection, a whistleblower type of status, security whistleblowing, some other protection for the future, to make sure that they're not exposed. Because a lot of these individuals, most hackers, are good citizens looking to use their skills and help. And sure, they're not-

Josh Lospinoso:

Especially if they're telling you about it.

Joseph Carson:

Exactly. And even work with them to help fix it, because sometimes they're the ones that know how to close the door.

Mike Gruen:

And I think one of the other challenges is the international aspect, right? There's US law, and then what happens with researchers in some other country, or vice versa here. So I think there's probably got to be some sort of international framework. Earlier, you were mentioning selling it to nations, to multiple nations. What ran through my head is: do we need a WHO, a World Health Organization, of cybersecurity, something actually UN-run that will give you $10 million? They can put up more money than anyone else for that vulnerability and make sure that there is that even playing field. Something along those lines.

Joseph Carson:

And this gets into what I was saying earlier about the Tallinn papers, which were one of those first steps, really looking at international cooperation and really defining how we actually mitigate these types of incidents in the future. And I really liked it when Brad Smith of Microsoft brought up the term. I was a bit against it at the beginning, because I wasn't quite sure of his opinion: a Geneva Convention style of cyber cooperation, which is really that WHO of the cyber realm, about how you make sure that if somebody does decide to go for a cyberattack against some nation state, civilians are avoided.

Joseph Carson:

And it gets into the big gray areas, even in Ukraine. When you attack a power station and you take out electricity, there were human life losses. Because I worked in [Numbers 00:21:17] Service hospitals in the past, and if I have no electricity, especially on a December winter night when it's really cold, then you are at risk of people's lives being lost.

Joseph Carson:

So this gets into where I really think, after those events and those areas, there has to be some protection. Because sometimes what's even missed is civilians working on aircraft carriers, civilians working in war zones and military locations, making sure that they're not at risk. We do need to have some protections, especially with things like the Red Cross out in a lot of these areas. But we really do need to have some type of convention to protect civilians from being secondary victims of accidental cyberattacks.

Josh Lospinoso:

Right. Yeah. And I mean, this is totally not a plug for Shift5, I promise. We've seen a ton of security activity in the information technology space over the past couple of decades, obviously. We've seen a lot of really promising work in defending these systems, to the point now where you and I do banking transactions on the internet, and we're more or less reasonably confident that it's secure against the casual attacker, at least. That's pretty incredible progress. Right?

Josh Lospinoso:

But if you look at operational technology, things like planes, trains, automobiles: your car, and this isn't news to you guys, has dozens of computers in it, microcontrollers running firmware, communicating with each other. And in a lot of cars these days, when you move the steering wheel, it's not directly driving the wheels. It's fly by wire. There are computers between you and your wheels.

Josh Lospinoso:

So there's software that is conveying you down a highway at 75 miles an hour or whatever, right? That's terrifying, because I will tell you, after looking at these systems, they are nowhere near the level of security that IT systems are. And that is terrifying. You know what I mean? There's not an OT system that I've seen, or come across, or been part of a team that's penetration tested it, where we haven't compromised it at the lowest level.

Josh Lospinoso:

These things were designed with physical security in mind, and that's just completely not the case anymore. Either, A, you can very readily gain physical access to these things, or to the supply chain that is putting these systems together. Or, B, very often there are telemetry systems in your car nowadays. You've got your OnStar. I think GM, for example, not to call them out, every car as of 2017 and 2018 has it. You have no option in the matter. It is going to phone home. You know what I mean?

Josh Lospinoso:

Yeah, I think on the IT side, we've seen that information systems can cause loss of human life. There was a hospital, I can't recall where it was, but I think NotPetya hit them, and CryptoLocker hit a bunch of systems, and they couldn't schedule surgeries. And there were patient management issues that undoubtedly caused loss of life. Right? Now imagine we do that to OT systems, where the whole point of these systems is to manipulate the physical world. It's too easy for us to go to the DC Metro and run a train into a block at 80 miles an hour. Right? I don't know. That's where I'm just absolutely terrified: we haven't made progress on the OT side.

Joseph Carson:

I agree. That's probably one of the areas that keeps me up at night, when people's lives are at risk. When I worked in Numbers Service years ago, my systems had an SLA of 23 minutes. And if my systems weren't running within 23 minutes, then I knew I was responsible for people's lives. That was your SLA. From then till now, that's the only thing I remember in my entire career that really gave me a lot of pressure and actually kept me up at night, wanting to make sure that those systems were operating all the time.

Joseph Carson:

And to your point as well: when you're buying equipment today, if you're buying a TV or a car or whatever, they call them IoT devices. But really it's just a computer with a dedicated purpose, a fixed function that cannot be reprogrammed that much. When you buy them, you're actually not buying the car. You're renting the service of the car. And it's the manufacturer or vendor who now owns the data.

Joseph Carson:

I remember I even did a project on autonomous shipping, and the engines that we were using for the ship: we purchased the engines, we owned the hardware itself, but the manufacturer owned the data that those engines were actually creating. So it gets into even the cars. If you get a Tesla, the manufacturer owns the data that those cars create. There's a difference in the contracts that we have today, which even increases a lot of risks that ... And it's ease of use, simplicity.

Joseph Carson:

I remember somebody once had this computer system. It was a funny scenario: this was the most secure computer system ever, and access to it was very limited. And one of the guys on the team came up, turned around, and said, "Oh, so if I take the chassis off, the alarms and stuff, I can't interfere with the computer's electrical equipment?" "Absolutely." And he took a drill and drilled a hole in the top of the computer, took the top off, of course not setting off any alarms, and then tried to access the electrical equipment.

Joseph Carson:

Sometimes it gets into perception. And we have to make sure that, realistically, when we look at it from a security perspective, we're doing a proper assessment, a proper risk assessment. And that should be part of security by design. That will definitely help us become more resilient in the future. And ultimately, the more resilient you are, the less attractive you are to cyberattacks. And it goes back to ...

Joseph Carson:

Even Estonia, as I mentioned at the beginning: when Estonia decentralized its data repositories across multiple countries, it became less attractive, because in order to attack Estonia, you needed to attack five countries. And that becomes less attractive. As an attacker, you might be attacking your own ally as a result, because you might have good relations. It also means that small countries are almost acting like a bigger country, because they're cooperating, collaborating, and relying on those other countries' infrastructures.

Joseph Carson:

So I think that really gets into one of my next and final points: the defensive side of things, where civilians get involved. I know the article that you wrote was actually heavily getting into this area. Even Estonia, after the attack, developed what's called the Cyber Defence League. This is really where civilians get involved in defending the country, leaving the offensive capabilities and offensive operations to the government.

Joseph Carson:

So what do you think, especially in countries around the world and even in the US, the future of cyber defenses is, whether you call it a cyber corps, cyber reserves, or the guard? What do you think it is today? And where do you think it needs to go in the future?

Josh Lospinoso:

Yeah, I mean, I've seen a lot of really promising progress on this front. So there are a number of initiatives at the National Security Agency and Cyber Command to share ... I mean, there are details to work out here, but information sharing between corporations and the US government. I think we're building some strong foundations there, and that is absolutely critical. I think corporations need to trust the government: if we're tipping them off to something we're seeing at the government level, using our unfair advantages, they need to trust that information. Right?

Josh Lospinoso:

So it's going to be a long-term relationship-building thing. There's a technology solution as well as a people solution that we need to build in there. So that, I think, is a really important foundation. I think we're also seeing promising progress on the civil-military divide. We need to go farther, but, for example, national guards are standing up these cyber units that can go and do incident response as appropriate. And these are people whose day job is being a CISO at a company, or working in a SOC, or something like that. And then they can transition that experience directly into their military role and go do an incident response on a power plant or something like that.

Josh Lospinoso:

So we're seeing a lot of really promising progress there. And then on the government side, we're still having a hard time attracting ... Not even attracting, but retaining really talented individuals: keeping them satiated with a high stream of quality work, and giving them the tools. They signed up to do a mission. Most of the time they're not doing that mission. They're doing military ceremonial stuff, rather than digging into an incident or something.

Josh Lospinoso:

I think it's a pretty easy sell to say to someone, "Hey, you're going to go do hunt missions on submarines that are deployed." Or, "You're going to go out to Afghanistan and do hunt missions on helicopters." You know what I mean? The helicopters come back, and you're going to dissect all kinds of memory dumps and do hunting and surveillance and analysis. You're going to attract top talent for that kind of mission. But we throw up a lot of barriers to getting them there.

Josh Lospinoso:

I'm a lot more confident, though, on the defensive side, because people can do the defensive mission in their civilian capacities in a lot of ways. On the offensive side, the closest analog you have is pen testing or threat emulation. And guess where a lot of those threat emulators come from: they were my coworkers until they got fed up with government service. And then they go and write rootkits for whoever. Right? So yeah, it's kind of scattershot, but I'm relatively bullish on where we're headed, especially on the defensive side. I see a lot of promising progress.

Joseph Carson:

And that's great to hear. Because over the years, I've been seeing a lot of different discussions of these points, and I've been sharing what Estonia has been doing in those areas, hopefully so that it gets replicated and those lessons learned feed into cyber defenses.

Joseph Carson:

And then even here, we have the Locked Shields event every year, which does the more governmental, let's say, gamification side of things. And it also gets into a lot of the reservist side of things. So that's always great to see. And I think those types of engagements and activities will definitely increase both defensive and offensive capabilities over the long term.

Joseph Carson:

So there's a lot of positive direction on this, I think. And overall, there are a lot of thorny areas, of course, on the OT side, and with zero days and protection and so on. But I'm glad that we haven't seen anything major in the last couple of years. And I guess, reading the latest Solarium report, it's meant to be a ten-year thing.

Joseph Carson:

So maybe we're due one sometime soon, and we should be on our toes. Because definitely, being in Estonia, we have a noisy neighbor that keeps us, let's say, vigilant and not complacent. We're always observant and alert. It's about making sure that you don't become complacent over time. I think that's important as well.

Josh Lospinoso:

Yep. Yep. For sure.

Joseph Carson:

Awesome. I think this has been interesting. I think the audience is going to have a lot of different things to take away from this, a lot of different discussions. And for sure, we could probably go on about each topic for hours and hours.

Josh Lospinoso:

Oh, no doubt.

Joseph Carson:

But Josh, it's been fantastic having you on the show. It's been amazing. I'd definitely love to pick one of these topics, or even one of these events, and go into a lot more discussion on the details.

Joseph Carson:

Yeah. Mike, awesome having you. From your side, I even researched a lot of what Cybrary is doing in regards to things like pentesting and secure design and responsible disclosure. So I think maybe we got two or three students out of this one. I don't know. But it's been a pleasure. Thanks for having me on the show, everyone. I really look forward to follow-ups in the future.

Josh Lospinoso:

Great. Looking forward to it. Thanks so much, Joseph. Great to meet you. Good to see you, Mike.

Mike Gruen:

Yep.

Outro:

Learn how your team can get a free trial of Cybrary for Business by going to www.cybrary.it/business. This podcast is also brought to you by Thycotic, the leader in privileged access management. To learn more, visit www.thycotic.com.