Intro:
Invest in yourself today with our Insider Pro product, which gives you the career path to reach the next step in your cybersecurity journey. Join today on cybrary.it using the discount code "podcast."
Mike Gruen:
You're listening to the 401 Access Denied Podcast. I'm Mike Gruen, VP of engineering and CISO at Cybrary. Please join me and my cohost, Joseph Carson, chief security scientist at Thycotic, as we discuss the latest news and attempt to make cybersecurity accessible, usable, and fun. Be sure to check back every two weeks for new episodes.
Joseph Carson:
Hello everyone. Welcome back to another awesome, fun show. This is the 401 Access Denied podcast. You have myself, Joseph Carson, located here in Tallinn, Estonia. It's been a long time since I've traveled; this is actually, I think, the longest period I've been in Estonia nonstop without travel. But I'm really happy to have another exciting show for you today, with an exciting guest whom we'll introduce you to shortly. My name's Joseph Carson, chief security scientist at Thycotic and advisory CISO to several governments and critical infrastructure organizations around the world. I'm based here in Estonia, and my cohost for the day is the awesome Mike. So Mike, do you want to give us a little bit of background about yourself?
Mike Gruen:
Sure. Mike Gruen, VP of engineering and CISO at Cybrary, located in DC, and I'm really excited about today's episode and we have Chris Kubecka joining us who helps out around the place with Cybrary and our content as well. So really, really glad to have her on with us.
Chris Kubecka:
I am Chris Kubecka, and I love OT. I got interested at an early age because my mother was a robotics programmer for assembly-line manufacturing in the automobile industry. I love the idea that you can take technology, bits and bytes, and make things move. Who doesn't want to move the world, right? I do. Some of the things I'm working on right now: I do advisory work for several governments, as well as NATO and the United Nations. I'm currently working on a very interesting project for part of the European Union to set up a purely proactive security team, to hopefully avoid major incidents from occurring, because once you get to the CERT level it's very reactive, right? Yay, emergency, great. The goal is to hopefully minimize some of those incidents.
Joseph Carson:
Interesting. Proactive is a key word there, because in the security industry we are so used to being reactive people. We are the firefighters of security. We tend to only respond when things happen. Every year I attend the CERT symposium events here in Estonia, and one thing that's always missing from those events is the proactive side of things; we never even talk about the successes, because we don't want to put ourselves out there as a target. So I'm really interested. A very key area is proactively looking at defenses and mitigating risk against OT. So, Chris, do you want to explain what the proactive side really is? What does it entail?
Chris Kubecka:
Well, what it entails is gathering a lot of different pieces of information and putting them together, verifying the data that you get, as well as using your trust networks, both formal and informal. I'm a big believer in informal trust networks, such as various volunteer technology groups. One in particular is the CTI League, a threat intelligence league, where they've been proactively defending hospitals and medical workers because CERTs just don't have the capacity. Some countries may have a CERT, but it's not a very mature CERT. For instance, Tanzania, where I used to spend a lot of time: they've got a CERT website, which isn't encrypted, and they don't have a CERT team. Yay. That's awesome.
And they do a lot of mining of things like gold, rare earths, plutonium, and uranium. Little things. Little things that other people might want to get ahold of. Most CERTs don't really have the time to be very proactive because, again, they're putting out the fires. If you can flip the script and alleviate some of that, then it's my personal belief that you will see a lot more countries, as a whole, be able to make their cyberspace safer.
Mike Gruen: So maybe, for those who don't know, we've mentioned a couple of terms, OT and CERTs. Maybe we should explain what those are. Chris or Joe?
Joseph Carson: Absolutely. Chris, go ahead.
Chris Kubecka:
Operational technology. Now, it might seem like a foreign, strange set of words and you're like, "Oh, but I don't know what any of that is." Well, it can be something like a power station. It could be your water system, so that when you get water out of your tap it is actually clean (just not Detroit). It could be planes, trains, and automobiles. For instance, one of the things I'm glad about living in the Netherlands is we're the second-largest food exporter in the world, for a country of 17 million. There's a reason why: our agriculture is top-notch technology-wise, but it's OT. You've got robotics, you've got drones, you've got automated systems, all functioning in a very, very small space (because trust me, the Netherlands is small) and able to do these magnificent things because of technology. That's operational technology.
It can be security stations and police stations and hospitals and private security companies. It could be portions of a lab as part of a smart hospital (because everything's got to be smart, right?). All of these different systems are set up to control things and do things with a series of programs. If you have a fancy-schmancy espresso machine like I do (I got one a few years ago, best choice ever), it's set up to control, move, and do things with a series of programs. So I've got a piece of OT in my house right now. It can be anything from really big to really, really small; it's all around us.
Mike Gruen: Yes.
Joseph Carson: Absolutely. Something people don't realize as they walk around doing their day-to-day jobs and lives is all the things around us. From the street lights...
Mike Gruen: Yeah, traffic lights.
Joseph Carson:
Traffic lights: when they come on, how frequently. Even here in Estonia, during the summertime only every second street light is on, to reduce cost, and in the wintertime they change it based on the season because it gets darker here. So there's an automated program to deal with the lighting depending on the time and season. Even things like conveyor belts: when you go to the airport and your baggage comes out, all of that in the background is automated, basically through OT. Elevators... I think it's escalators and elevators here, and lifts in the UK. (I'm just trying to find the right term so our international audience knows what we're talking about.)
But even when you get into an elevator, all the controls, everything from the stop to the floors, is all through OT. Everything we do, from basic car-to-car communication to weather systems and sensors, it's all around us. We sometimes take it for granted. But what's happening now is that the impact and significance of it is growing: it's not just providing the city around us and its functionality, it's also coming more into our homes, with things like your espresso machine, your vacuum cleaner, your light bulbs, your home security system, your automation. Everywhere we look there's automation as well, and that's really playing a part in OT. I remember some of the projects I worked on from the early days, back in Ireland in 1999.
So yeah, I'm pretty old in this industry; I go back even further than that. But in 1999, one of the things I was working on, it was when Nokia phones became really popular and they first came up with a data connection you could connect to them with. One of the first things we did with that technology... I was responsible for the ambulance service in Northern Ireland. We put defibrillators and EPDs into the ambulances, and we connected those to mobile phones. When an accident happened and the patient or victim got into the ambulance, they were put on those reader machines, and all the heart readings...
...were all sent back through fax, through those mobile phones, into the emergency room, so that doctors could already analyze the patient before they arrived. These are some of the lifesaving benefits: doctors getting to analyze the patient prior to arrival. Five, ten minutes can make a big difference in saving someone's life. So this is what we see a lot, and I think it's really important. One other thing: Chris, you mentioned the CERT teams. Do you also want to explain what CERTs are?
Chris Kubecka: So a CERT is a computer emergency response team. And somebody just set off fireworks.
Joseph Carson: Very late for 4th of July, isn't it?
Chris Kubecka:
Well, you never know with the Dutch, they love their fireworks. Love them. Pull up a picture of Amsterdam during New Year's; it looks like a war zone, basically. But CERTs are very interesting. It started with Carnegie Mellon back in the day, and nowadays... The UN loves these acronyms. The GGE, the Group of Governmental Experts, wrote a report in July 2015, five years ago now, that said, "Hey, we like this idea of these computer emergency response teams, but we also like the idea that states have to take some active role to defend their ICT and critical infrastructure. So we're going to get the member states to agree that every member state should have a computer emergency response team." If there's some major issue with critical infrastructure and one country could use the help of another country, whether it be data or technology or people, we'll use our formal networks under various lovely treaty names to get that information, those people, and those resources. That looks great on paper, right?
Mike Gruen: Yes.
Chris Kubecka: Happy, happy, fun time, and it's all magic and it will-
Mike Gruen: Yes, the world just loves to cooperate with each other. All the countries really love cooperating.
Chris Kubecka: All the countries.
Mike Gruen: Yes, all.
Joseph Carson: We like sharing information.
Mike Gruen: Especially ones that share borders. There's a lot of cooperation.
Chris Kubecka:
Right? By the way, what you hear right now is our automated street cleaning system, which has decided to park outside my window. I love living in the city. But a couple of weeks ago the UN asked me to look at that report and speak on what some of the current challenges are and how they should more rapidly share information. Again, this can be difficult. If Iran is on fire, which it is, right? I don't think Saudi Arabia is going to go, "Yay, let's help out."
However, if you use informal networks, such as the CERT networks and technologists, you can actually get through that level of bureaucracy. What I told the UN was, "A CERT is fantastic. Step one." Step two, obviously, is to try to mature those CERTs, right? Awesome. But by the time something gets to the CERT level, you've already got a major problem. So you also need a preventative focus, and you need to already have those trust relationships built, because it's one thing to go through this lovely formal treaty system; it's another thing if someone can pick up the phone and say, "I have the information that you need," and it doesn't take two weeks of paperwork.
Joseph Carson:
One of the things I got involved in really early, being based in Estonia: back in 2007, of course, we had the famous nation-state cyberattack against the government here, and even local companies were targeted. What that involved was the community coming together. Back then we created what became the Cyber Defense League, which was basically citizens who were experts in the field coming together in defense of the country, really helping with things like DDoS prevention and hardening of websites and companies' defenses. Out of that, the Cyber Defense League became a continuous operation. Now it's an official group in Estonia that's there to come to the defense of the country when it comes under attack.
Of course, in Estonia that evolved together with the Estonian CERT as well, which is the more official group that's there to react to things that target the country, with the volunteers as additional resources when they become inundated during a major attack. So I guess it's really important. One of the things I always stress is that this defense group should not be part of an offensive team; that should always stay with the official government. But they should always be there to help the country out when it comes under attack. This is also what we've done recently during the pandemic: I'm part of Cyber Volunteers 19, which is there to help defend medical infrastructure as well. This really gets into your point, Chris, about sharing information.
If you see something early, somebody else can make use of that in order to mitigate it. I think one of the ultimate responsibilities we have is to make sure that if there's a zero-day or a vulnerability, other people have the ability to make the decision to eradicate that risk. That even gets into some of the things... I don't know whether the policy or standard you're working on has resulted from it, but I know that in 2007, 2008, we worked on what were then called the Tallinn papers, and ultimately Brad Smith from Microsoft later came out with a Digital Geneva Convention proposal along the same lines. I agree, one of the most important things we should have is sharing of information.
Because it means that if we have more cooperation and transparency, then cybercriminals, real cybercriminals acting on malicious motivations, have fewer places to hide and fewer places to operate. If you have some type of Geneva Convention, then you have some type of accountability: you can hold countries responsible for failing to prevent their citizens from carrying out attacks in other countries. Some of the work you're mentioning really seems to be the later stages of what those early discussions were.
Chris Kubecka: Yeah. Prevention is important, so that we don't have a critical infrastructure, digital-tech pandemic.
Mike Gruen: Right. I think that's the important part. We started talking about all the different systems that OT touches, right from your latte machine. If somebody were to attack your coffee maker, that's one thing, but somebody...
Chris Kubecka: I would die.
Mike Gruen: Well that's true. You would. But what would happen if they hit your power station, and then all of your neighbors. So then your coffee machine still doesn't work, and you don't have electricity.
Joseph Carson:
And that's happening. The thing is, we've failed to act on some of the biggest OT incidents of the past. I understand that from a Stuxnet perspective it was very targeted. But look at Ukraine. Ukraine has basically been the testing ground for a lot of OT-type attacks. Look at December 2015: around 200,000 citizens lost electricity. There's always an indirect impact. One of the things is, of course, I've categorized OT into two types. You've got OT which is data-driven and OT which is kinetic-driven. The data-driven OT has sensors gathering things like weather or population movement, the data-analytics side, to make quick decisions that can improve efficiencies.
Whether that's water flow, water pressure, or electricity. Then you get into the kinetic classification, which is OT that actually has moving parts, things that can do damage, whether it's a furnace that is releasing steam, opening a valve so that it doesn't overheat and cause damage. So those are the two classifications. Of course, when you have electricity, that has a kinetic output as well: you've got hospitals that no longer have electricity. When I was working in the ambulance service, the one thing that kept me up at night was the potential of my systems not running and ultimately having a death as a result, having human fatalities. So Chris, what are some of the things we should be worried about? What are some of the risks?
Mike Gruen: Yeah, scare us. Please.
Chris Kubecka: Yes. All right. Scary time.
Joseph Carson: … our kids tonight.
Chris Kubecka: Yes. So we have come to the point where we can no longer function as a modern society without OT, ICT, and the digital world. Even when I've visited different parts of East Africa, where you wouldn't think these things apply, in order to do money transactions you'll use the M-PESA system, because most people don't have bank accounts, right? So how do you get paid? So we're absolutely dependent on these things. Some of the research I've been doing, especially since the pandemic, I've got to say it's scary.
Joseph Carson: Oh, damn it.
Chris Kubecka: Is-
Joseph Carson: … We'll do that and post.
Chris Kubecka:
It's the fact that it has accelerated both ICT and OT, forms of OT, being turned into remote services more and more. That sounds like a great thing. However, it's not a great thing if these things are rolled out very quickly and they're using protocols they shouldn't be using, technology they shouldn't be using. Obviously, we still want our power to run if people have to be isolated, but we also don't want there to be a problem with a nuclear power plant or something along those lines. Some of the recent scans use ZMap, the ZMap project, where I can scan the internet in about 15 minutes, depending on how nice my ISP is. What I found is there's been a stark increase in these connected systems being set up highly vulnerable, with known exploits, in a very, very hasty manner.
One of the recent scans I did looked at the top 10 countries by ICT and OT systems, one and all, connected to the internet. The United States has almost 48 million assets, and China does not have as many, but it's close. Zeroing in on just known-exploitable, vulnerable remote protocols in use, 59% of China's internet-connected infrastructure has these known exploitable vulnerabilities. The United States is sitting at 26%. So it's disturbing, because unfortunately human nature is, "Go, go, go, go, go. Oh, something happened. Whoops, sorry about that."
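For listeners curious what a survey like that looks like in practice, here is a minimal sketch of a ZMap-style sweep, assuming the zmap CLI is installed and that you are authorized to scan the targets; the port list, rate, and target range below are purely illustrative, not Chris's actual methodology.

```python
"""Minimal sketch of a ZMap-style exposure survey (illustrative only).

Assumes the ZMap CLI is installed (it typically needs root) and that you
are authorized to scan the ranges you target.
"""
import subprocess

# A few remote/OT protocols that commonly should not face the public internet.
PORTS = {
    23: "telnet",
    502: "modbus",
    3389: "rdp",
    5900: "vnc",
}

# Documentation range as a stand-in; replace with scoped, authorized targets.
TARGET_RANGE = "198.51.100.0/24"

def scan_port(port: int) -> int:
    """Run a single-port ZMap sweep and return the number of responsive hosts."""
    out_file = f"zmap_{port}.txt"
    subprocess.run(
        ["zmap", "-p", str(port), "-o", out_file, "--rate", "1000", TARGET_RANGE],
        check=True,
    )
    with open(out_file) as fh:
        return sum(1 for line in fh if line.strip())

if __name__ == "__main__":
    for port, name in PORTS.items():
        print(f"{name:>7} (tcp/{port}): {scan_port(port)} responsive hosts")
```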
Mike Gruen: But I think a lot of it is, people are... There are these systems that were never designed to be on the internet. They were designed either before that was really a thing or with the idea that there'd be an air gap or all of these other things and so-
Chris Kubecka: Air gap. Sorry, I was coughing.
Mike Gruen:
I know, exactly, because I think we talked to Joshua Espinosa about cyber warfare, and I can't remember... I've talked to him several times; he's also a friend. But we talked about how, "Hey, if your protection is an air gap, that's awesome" from an attacker's perspective: great, all I have to do is close that gap and I win. But in any event, because of all of that, these systems that were never designed to be on the internet, there's this rush to suddenly get them on the internet. And in that rush, people don't necessarily even think to look at these things. What are the protocols? What are the things that are suddenly going to be one step away from being attacked now, whereas maybe it was two or three steps before?
Joseph Carson:
We have to remember, there is no such thing as an air gap. Where humans go, we can be the link. If we look at how … got into the Natanz reactor, it was basically, most likely, a human walking in through the door. So I would say that where humans work and where we operate, there's no such thing as an air gap. You might assume there's network segmentation, but as humans we can actually bring things in.
This reminds me of a fun story. One of the search engines we use, of course, is Shodan, which is basically there to search for machines on the internet with different ports and protocols. I remember, a number of years ago, doing a penetration test on a shipping company. One of the things really surprised us, shocked us: what we actually found in the search was one of the ships' navigation systems appearing on the public internet, and this should not be possible. They're meant to go through the VSAT systems up to the satellites and back into headquarters. We were just wondering, "How is this even possible? It should not be possible. It should not be happening."
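For those who haven't used Shodan, a minimal sketch of this kind of search with the official shodan Python package might look like the following; the query string is illustrative, not the one from that engagement, and you need your own API key.

```python
"""Minimal sketch of a Shodan search (illustrative only).

Assumes the official `shodan` package and a valid API key in the
SHODAN_API_KEY environment variable; the query is a made-up example.
"""
import os
import shodan

api = shodan.Shodan(os.environ["SHODAN_API_KEY"])

# Hypothetical query for maritime/navigation gear exposed to the internet.
results = api.search('"navigation" port:5900')

print(f"Total matches: {results['total']}")
for match in results["matches"][:10]:
    print(match["ip_str"], match["port"], match.get("org") or "unknown org")
```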
So eventually we ended up finding out what had happened. During these long voyages at sea, the captain of the vessel (I won't say his nationality in case it gets him in trouble) wanted to spend more time in his cabin, on the internet, speaking with his family. But he also wanted to be able to see the route the vessel was taking, the speed, navigation, and so forth. So at one of the ports he went and got a long network cable, plugged it into the Actis navigational system, ran the wire all the way down into his cabin, and plugged it into his laptop. Because there was a law in 2009, a seafarers' welfare law, that meant they had to have internet at sea, so they had to have connectivity and communication.
So the crew on the vessel had had internet access since 2009. Every time he made a Skype call to his family, he had actually created a crossover from his laptop onto the ship's navigation and engine room systems. He could sit and see the destination of the vessel while communicating with his family. This was ultimately breaking that air-gapped setup, and these were systems that were not designed to be on the public internet. Through human error, and of course the human desire to simplify things and make our lives better, he ultimately put a vessel at risk, because any malicious actor could have done that search, and ultimately the only thing protecting it was a four-digit PIN, and we know how easy those are to get past.
Ultimately it could have meant an attacker making that vessel look like it's somewhere else, or changing its direction. And these vessels, these ships, take one to two miles to turn and stop. So these are things that increase the impact, and I think it's really important. A lot of this comes through human error, as well as configuration mistakes. That's one of the things we have to watch for. A lot of these systems are also meant to have long lifespans. I remember watching a satellite decommissioning. That was impressive: basically, there was one button designed 25, 30 years ago, and they were hoping the button still did what it was meant to do when they pressed it. They also had to make sure the fuel was emptied at the right time and they moved it to the so-called satellite parking lot in space, out to a certain orbit. It was really interesting that this satellite was designed to operate for 20 years, and it had already passed 25. And-
Mike Gruen: Don't forget that they have a technology cutoff date of several years prior to it even launching or going into... Who knows how old the actual technology was.
Joseph Carson: It was 30-plus years old, because you're absolutely right: before it even got sent off into space, there was a timeframe of testing and validating. That whole life cycle took years. And this is what we really face with OT: it tends to be old, vulnerable, out of date, not patched, using legacy systems. I can't tell you how many times I've seen XP still running, even Windows 2000 and NT running on some of these systems.
Mike Gruen: Oh, wait, should I stop running XP?
Joseph Carson: Depends on what you're using it for. So Chris, what are some of the risks here, some of the risks that you see in OT? I know the potential damage could be significant, but what are the failures we have out there?
Chris Kubecka: Well. I think we also have to look at the supply chain and vendors. Now I'll tell you a funny story. I think it was my fourth or fifth nuclear event.
Mike Gruen: Starting off hilarious.
Chris Kubecka:
Awesome. I was called in to look at a nuclear facility because part of the system that automatically raised and lowered nuclear fuel rods had failed. What they had discovered was that, although it was a pretty new system at the time, the vendor had actually installed a 2G modem into the system, without telling them, with a public IP address, and they were obviously very scared that somebody had hacked into it and caused the system to fail. Luckily there were no rods in the system. Great. It turns out the vendor's response was, "Well, yeah, we didn't document it, but basically we thought it would be a value add."
In other words, "We were testing it on you, in this nuclear facility." Whoops. They also told the facility that if they disabled it, they would void their warranty. Yes. Yes. Because, fun fact, a lot of vendors, if you have a maintenance warranty, want the data from your system, right? They will go, "Hey everyone, let's tie all of our customers into this really old version of VNC to remote in, with no password, no encryption, or just use the same credentials of admin/admin or tech/tech on all of our customers."
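As a rough illustration of how trivially that kind of vendor setup can be audited (or abused), here's a toy inventory check; the inventory format and credential list are illustrative assumptions, not any real vendor's documented defaults.

```python
"""Toy audit of a device inventory for shared vendor default credentials.

Everything here (hosts, services, credential pairs) is an illustrative
assumption, not real vendor documentation.
"""
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("tech", "tech"),
    ("", ""),  # no authentication at all
}

inventory = [
    {"host": "plc-01.example.local",  "service": "vnc", "user": "",      "password": ""},
    {"host": "hmi-02.example.local",  "service": "vnc", "user": "admin", "password": "admin"},
    {"host": "hist-03.example.local", "service": "ssh", "user": "ops",   "password": "rotated-regularly"},
]

for device in inventory:
    if (device["user"], device["password"]) in DEFAULT_CREDENTIALS:
        print(f"DEFAULT/BLANK CREDENTIALS: {device['service']} on {device['host']}")
```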
Mike Gruen: Admin password is my favorite, but yeah.
Chris Kubecka: Awesome. You're like, "But can I update that?" "No, you'll void your warranty," you're like...
Joseph Carson:
I've seen that with so many manufacturer contracts these days, especially working with power stations and shipping, where you've got the engines you're purchasing (even Tesla has the same model). When you buy that equipment, you own the physical equipment, but the warranty and the maintenance are tied to them owning the data. If you actually look into these contracts, you must provide them data, and in many cases, as you're saying, it's online, connected data, rather than it being offline and extracted and handed over when needed. I've seen the same with a ship management company, where the third-party vendor was using VNC to remote in and look at the systems. You're just going, "What? Where does your security start and end?"
Chris Kubecka: Yeah, it was just never a thought. We also have to remember that the internet was never designed for security in the first place, and many times with these systems, whether it's the hardware, the software, or the integration, the people who design them are designing them for valid, legitimate use. They're not thinking about people like us who might want to be a little bit evil. I sometimes wonder why I'm on this side of the discussion instead of being, perhaps, a Lamborghini-owning cybercriminal, because there's so much potential, and I could have paid off my house by now.
Mike Gruen: It's called morals and ethics.
Joseph Carson: That's right. That's my weakness. My evil has been moral and ethical.
Mike Gruen: Yeah, to your point about never being designed for this: I remember, a long time ago, I read Dealers of Lightning, which is the story of Apple and Microsoft and the internet. Actually it's really the story of Xerox PARC and all of that. They were talking about email, and I remember the story in there: it never occurred to anyone to secure email, because what evil monster would impersonate someone else in a message? Because it was designed by academics, for academics, these are just things they didn't think about.
Chris Kubecka: They don't think about it.
Mike Gruen: Exactly. I'm at least heartened by the fact that I think things have changed. I think we do think about security from day one on a lot of stuff. Unfortunately, we're building on top of so much stuff where that wasn't the case.
Joseph Carson: Yeah. I think part of it is that security gets added at the wrong stage. Sometimes it's not by design, it's … . Even if we look at email: when I started using email back in '93 or … it was really just a simple message. You took a note and you sent it to a colleague. But today it's so much more than that. It's your digital footprint, it's everything you do, everything you use. It's your location information, your address, your password manager, your passwords; you send reminders to yourself with the password saying, my pass... Oh, that's for the password.
Mike Gruen: Oh no, I just meant for the forgot password.
Joseph Carson: Reset password.
Mike Gruen: Password reset. Right.
Joseph Carson: Some people are storing their passwords in emails.
Chris Kubecka: Awesome.
Joseph Carson: But it's always an afterthought. Sometimes even the tools and solutions that we create change purpose as well, which means they sometimes bypass the traditional security methods we apply. So Chris, what do we recommend? What is the way forward? What are the solutions, or the ways to reduce the risk here?
Chris Kubecka:
I would say education, especially for the larger companies that do a lot with critical infrastructure, or even property owners who own lots of buildings and things of that nature, because they've got a lot of these different systems. Their project managers, procurement, and legal departments should get a basic level of security education, so that they can look at it and say: yes, they want to go for the lowest bidder, but what is the risk associated with that? Yes, they want to sign this contract, but how is it going to bite them in the bum? Those parties need to start getting included in the discussion, because otherwise a lot of things won't change. I've heard this before from a project manager at a particular bank that would create algorithms for fast trading in an exotic market. The discussion was, "Hey, we actually need to security test these." The project manager is like, "No, no, I didn't budget for it. I don't see what the need is." Then the next question is, "Do you know what these algorithms do?" "No. Nobody knows anymore." Awesome. Awesome.
Joseph Carson: You forget the purpose.
Chris Kubecka: Right? So in the beginning there isn't that discussion going on with, "Oh, maybe somebody could do this. Maybe we'll get our stuff hacked or we'll be put on the news or we'll have to hire an outside PR firm or we'll go out of business and we'll have to lay off all of our employees."
Mike Gruen: That's what I think is interesting: the larger organizations, I think, can deal with it better. They have more money to put into marketing and other things. History has shown that they don't get punished nearly as much as companies that are a little bit smaller and just don't have that; those are the ones that go out of business. The larger ones can solve it with PR, which is, I think, even more unfortunate, because it doesn't put the responsibility in the right place.
Joseph Carson: But Chris, do you think things like GDPR and the regulations or standards that are coming out have an impact on larger companies? Now, if they do have a breach, and we've seen GDPR hit the likes of British Airways and Marriott Hotels over their reservation systems with serious financial penalties, do you think regulation will influence these companies to make a change?
Chris Kubecka: I think there needs to be a certain level of regulation, but at the same time, when you put forward regulation, you also have to ask: how much do you want to pay for your water, to secure that water infrastructure? Is a hospital going to have to choose between ventilators and securing its ICT? So it also has to be done in a way that's economical, where people understand what the end costs are actually going to be. Now think-
Mike Gruen: I'm also super cynical because in the United States, a lot of that legislation is going to be written by not the people you want writing the legislation. So that's how you end up with...
Mike Gruen: Right. Or technologists who work for the companies. That's how you end up with these situations where it's totally legal for a company to say, "No, you can't tamper at all with this device; you do anything and it voids the warranty." It's because there are laws in the US that allow for that, and who wrote those laws? You can trace it back. It's just...
Joseph Carson: Even recently there was news around making sure vendors don't ship easy-to-guess passwords, and I'm just going, "Who's putting these together? Who's coming up with these policies and rules and regulations and standards?" Because what it means is that, okay, the vendor will create a more difficult-to-guess password, but it's going to be the same on all devices. It's going to be the one…
Mike Gruen: Or it's going to be just a standard pattern that's pretty easy to just …
Joseph Carson: Yeah, and it will be publicly available.
Mike Gruen: Right, within minutes.
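To make that concrete: suppose, purely hypothetically, a vendor "hardens" its defaults by deriving each device's password from its MAC address with a fixed scheme. The sketch below shows why that is barely better than a shared default: once the scheme leaks, anyone who can see the MAC can recompute the password.

```python
"""Why a per-device default derived by a fixed scheme is still weak.

The derivation scheme below is a made-up illustration, not any specific
vendor's; it exists only to show how easily such 'unique' defaults are
recomputed once the pattern is public.
"""
import hashlib

def vendor_default_password(mac_address: str) -> str:
    # Hypothetical scheme: first 8 hex characters of MD5(MAC).
    return hashlib.md5(mac_address.lower().encode()).hexdigest()[:8]

# The MAC is broadcast by the device itself (ARP, Wi-Fi beacons, printed on
# the label), so the "unique" password offers little real protection.
print(vendor_default_password("00:11:22:33:44:55"))
```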
Joseph Carson: Exactly. So one of the things I think we definitely need is better collaboration with experts: having people like yourself, Chris, as part of those discussions and helping direct some of the outcomes.
Chris Kubecka:
Yeah. Well, one of the things I've been advising several governments here in the European Union, and also the UN, is that it's great to have, and obviously we need, health boards made up of experts that advise the government. But you also need technology boards that have an advisory role with an actual voice, and that aren't just academics. I have to stress that, because they don't think the way an evil hacker does. You have to have subject matter experts. You also have to have components of legal and procurement and things like that, because all of them tie in. Just as we plan for various other emergencies, we also have to face the fact that we have to plan for some magic bingo-word "cyber emergency," right?
Because otherwise... Joseph and I have seen what happened at Saudi Aramco, and there were wide-ranging effects beyond just the company. They actually provided two-thirds of the mobile internet infrastructure; when they disconnected, it affected the rest of the country. They also provided, on their flat network, internet and connectivity to a lot of hospitals, police stations, and emergency services like fire departments and ambulances. Guess what? That was turned off. So we have to realize that something that happens in one company can have these wide-ranging effects, and we have to-
Joseph Carson: … rippling effects on others.
Chris Kubecka: Yeah. We absolutely have to consider those.
Mike Gruen: I think there also needs to be planning for the inevitable. There's no way to protect against that 100%, so make sure you have some government action plan: what are you going to do when the power shuts off and you have all these hospitals, and how do you respond to that quickly?
Chris Kubecka: Yeah, I think one of my favorite ones is, I think it's the CDC and the US Army that both have public plans on how to plan against zombies. They … right?
Mike Gruen: Right, right.
Chris Kubecka: So you can read both of their plans and they're pretty funny and cool, but they figure, if you plan for the absolute worst, you should be prepared.
Mike Gruen: Do they take different approaches for the zombie? I imagine the army is more about just-
Chris Kubecka: Yeah, a little bit-
Mike Gruen: A little more offensive.
Chris Kubecka: Right.
Mike Gruen: Versus looking for the cure, looking to repurpose them.
Chris Kubecka:
Right. But they don't have any of these plans put forward for what happens if there's a worm that goes around and hits a lot of these legacy devices using some of these protocols. I remember the … worm that did something like this back in the day, way, way back in the day. It must have been before I was born. Some of these things can operate on their own, like a virus going everywhere and infecting everything. Then suddenly you don't have internet. So how are the rest of us, who are being safe and trying to work from home, going to be able to work from home if we don't have any internet? How am I going to get other services? I'm in the Netherlands; really, there is no place in the Netherlands I cannot get internet, and I will die without it. But literally, we have to consider all of the OT that is also absolutely dependent on the internet.
Joseph Carson: Yeah, absolutely. If you look at the most necessary needs everyone has today, it's battery and internet access. Where would we be without Google answering all of our questions?
Mike Gruen: DuckDuckGo, we should be using DuckDuckGo.
Joseph Carson: DuckDuckGo, we should be using.
Chris Kubecka: Oh, yes.
Joseph Carson:
For privacy. But absolutely, I think it's crucial to realize how dependent we are, how entwined it is with everything we do. Most people don't have landline telephones anymore; we're all dependent on mobile communications, and without the internet, mobile communications would not function. It reminds me of a great book I read a number of years ago, one of my favorites, Cyberstorm, which is all about the catastrophic event of being without electricity. If electricity fails in a major city, with just-in-time delivery and just-in-time manufacturing, they would run out of food and supplies within 24 to 48 hours.
Even in the recent pandemic, when you don't have logistics systems working and able to operate with efficiency, things in shops go out of stock very quickly. Hence people probably sitting on mass amounts of toilet paper right now as a result. But this is where it connects: the whole just-in-time supply chain has become so critical, from fuel supplies to medical supplies, and we're facing a lot of those challenges right now. They're all connected. This book actually goes through some of those examples; it's a great read and I highly recommend it. It's fictional, of course. But it gives you that-
Chris Kubecka: Yay, for now.
Mike Gruen: Right. For now. I think that's one of those things: how much OT relies on other OT. You were talking about the split between the kinetic side and the sensors, right?
Joseph Carson: Yes. Yeah.
Mike Gruen: Even if I were to secure some of the kinetic side of things, I could go after the sensors and change, say, the weather information being fed in. For instance, around us we get an incentive to let the power company control the thermostat in our house to save energy: if you're not home, they can do these things, and so on. But if I can change what the power station thinks the ambient or outside temperature is, I can get it to do things. I can have actual kinetic, connected effects just by affecting those sensors and the data being supplied into the system.
Joseph Carson: Absolutely. We did a test a number of years ago into data poisoning. So data poisoning.
Mike Gruen: Right. Data poisoning.
Joseph Carson:
So this is really where you get into it. We actually did a simulated DDoS attack on a data center, and it was done by playing with the HVAC system: the servers were set up to shut down once they hit about 35, 36 degrees Celsius. Ultimately what we did was simply increase the heat. That ends up having a DDoS effect, because all of the servers that started hitting that temperature had an automatic shutdown to prevent failures, overheating, and burned-up chips and boards. So yes, there is a connected effect. I remember we even looked at airline entertainment systems; I love Chris Roberts's explanation of a lot of the entertainment systems from the airlines, and of course the challenges he had flying after his comments and research.
But where you get into trouble is the sensor connections, the data-sharing portion. You might air-gap systems, but sometimes that data sharing can have influence. So take airline entertainment systems: if they're on the same part of the network as the safety systems that detect pressure loss, and the cabin air pressure reading drops, that can have an effect on the main system that controls the plane, because it says the cabin air pressure has been lost and therefore we need to descend to a lower altitude. It doesn't mean you have the ability to control flights, but it allows you to have an impact on the main systems through data poisoning, data manipulation.
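To illustrate the data center example, here is a small sketch of the kind of thermal-protection logic that made the HVAC trick work; the threshold, polling interval, and simulated sensor are illustrative assumptions (real servers do this in firmware or the BMC, not Python), but they show how poisoning the input turns a correct safety feature into a denial of service.

```python
"""Sketch of threshold-based thermal protection (illustrative assumptions only)."""
import random
import time

SHUTDOWN_THRESHOLD_C = 35.0  # roughly the cut-off mentioned in the discussion
POLL_SECONDS = 1             # short interval so the demo finishes quickly

def read_inlet_temperature(setpoint_c: float) -> float:
    """Simulated inlet sensor: the room drifts around the HVAC setpoint."""
    return setpoint_c + random.uniform(-0.5, 0.5)

def thermal_watchdog(hvac_setpoint_c: float) -> None:
    """Shut the server down if the inlet temperature crosses the threshold."""
    while True:
        temp_c = read_inlet_temperature(hvac_setpoint_c)
        print(f"inlet temperature: {temp_c:.1f} C")
        if temp_c >= SHUTDOWN_THRESHOLD_C:
            print("threshold exceeded: initiating protective shutdown")
            return
        time.sleep(POLL_SECONDS)

# A normal setpoint keeps the room safely below the cut-off:
#   thermal_watchdog(hvac_setpoint_c=22.0)   # would loop indefinitely
# But an attacker who can raise the HVAC setpoint (or poison the sensor
# reading itself) turns the servers' own protection into a shutdown:
thermal_watchdog(hvac_setpoint_c=37.0)
```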
Mike Gruen: Yeah, and that was one of the attacks on cars as well: the entertainment system was connected to the … and then you're able to poison what's going on in there. We'll probably talk about cars some other time.
Joseph Carson: Yeah, I've got my ... sitting behind me.
Chris Kubecka:
Yeah. But that's one of the big challenges. When I was able to get into the Boeing systems, into the flight control software and the aviation ID system, both live and test, they had already had a problem with a poorly coded sensor that most likely brought down at least three aircraft. There was actually one earlier here in the Netherlands. The problem is, if you can get into those systems and do any poisoning or manipulation, then you can mislead sensors; you can change the way things go. Our world is so automated that even large aircraft are driven, or flown I should say, a lot more in an automated manner, not involving a pilot or copilot. They try to do all these lovely automatic things, just like my lovely espresso machine, but the chances of my espresso machine killing me are much lower than, for instance, one of these modern aircraft.
Mike Gruen: So it can load its own ingredients and grab the cyanide from the cupboard.
Chris Kubecka: Right? … just that.
Mike Gruen: I think I insulted your espresso machine earlier, calling it a latte machine. I really apologize.
Chris Kubecka: It's okay. It's all right. It also has the milk thing. It's that fancy.
Mike Gruen: It's very fancy.
Joseph Carson: We're talking about espresso machines, but you can even get into medical devices that do automatic prescriptions, and even IV treatments or whatever you might be getting in hospitals-
Mike Gruen: Oh, I had a friend who was telling me about, I think there's heart-
Joseph Carson: Yeah, the pacemakers.
Mike Gruen: The pacemakers that you can connect to them.
Joseph Carson: Bluetooth connected.
Mike Gruen: Yeah. That's a scary proposition.
Chris Kubecka: Johns Hopkins made one of those devices and was eventually fine, which also brings up a good point. I live in the Netherlands, and we had the first cyber laws related to the responsibilities of companies and also the responsibilities of security researchers. If, for instance, I happen to find a wide, gaping, open hole in your systems because you didn't apply general basic best practices, depending on the level of criticality, then I am not prosecuted and you can't sue me.
Mike Gruen: That's awesome. Whistleblowers need to be protected. Absolutely.
Chris Kubecka: Yeah. Exactly.
Mike Gruen: Is there any movement towards also holding you somewhat responsible for disclosing it, if you find that stuff?
Chris Kubecka:
What's in the guidelines is that you attempt to responsibly disclose it, but you have to do it in a secure manner. If the company does not have a vulnerability disclosure program... this is one of the issues I ran into with Boeing, because they take advantage of our Dutch tax system and push all of their money through the Netherlands. They did not have a vulnerability disclosure program at the time, so the Dutch government is currently looking at fining them for that. But at the same time, a security researcher can attempt to contact them, obviously in a secure way, so you don't want all of your proof-of-concept material out there, or personally identifiable information. And then we also, luckily, have a mature CERT system that can intervene in the matter.
Joseph Carson:
Right. That's awesome. I remember, going back a couple of years to my days of daily penetration testing, we had a massive round-table discussion on responsible disclosure for these types of OT and system vulnerabilities. In the room we had Europol, we had CERTs, we had law enforcement, and we also had businesses. I'm really happy to hear that at least you have a proper whistleblower type of scenario, because all too often, when security researchers report these types of vulnerabilities, they end up becoming the target of lawsuits. They end up being criminalized by some of the gray laws around this and become victims of being targeted because they disclosed potentially damaging vulnerabilities.
We see this all too often. I'm so happy that there's protection, because at the end of the day we need security researchers' help in finding these. There are many amazing people using their skills for good reasons, and in many cases they're putting themselves at risk to find these vulnerabilities and disclose them. I think it's important that they get acknowledgment, because when we had this round table, law enforcement said, "What you're doing is of course in the gray area of the law. The best thing to do is disclose it anonymously." But these people are putting in their hard time and effort, and they're not going to get any money; it's more about recognition and acknowledgment of their efforts. I think it's really important that we actually use their help and work together: security researchers, government agencies, officials in law enforcement, and the companies, working together to provide a much more responsible disclosure effort without victimizing those who are doing the work.
Mike Gruen: Yeah. I find it mind-boggling that you wouldn't have some responsible disclosure program, some way for people to report things to you in a controlled way that you can handle and that has a process. It was one of the first things we did after I got here; we set something up, and it's been great. For all of the annoyance of people reporting things that really aren't an issue, it's definitely paid off. I find it hard to believe that a large company wouldn't have that set up and wouldn't want to know. You were also talking about the laws, which brings up the gray area again; that gets back to what I was saying earlier. It's illegal to tamper with a DVD player because there's encryption in there, and so on. Because of the laws, at least in the US, some of the work you want to do is actually illegal, or criminalized in a way. I find that frustrating from that perspective, and I'm sure many people do.
Joseph Carson: That's how I look at GDPR: it's digital rights management for citizens' personal information. That's ultimately the intention. DRM has been abused against citizens for many years, and now GDPR is a DRM for the people. I think that's definitely something to leverage. And Chris, I think about some of the IP that you create, that we create. It's things that are in our heads; those are the things that we own. You can't print the intellectual property in people's brains.
Chris Kubecka: Not yet.
Joseph Carson:
Maybe the content that gets outputted, the final result, but the method of getting there, the data, that's something we have to protect, and it's something that … people out. So bringing it to a wrap-up, a summary: one, OT is all around us, and it's been all around us for many years. It's technology innovation that's really driving the communication, how these devices talk to each other and make decisions, and it's also making our lives a lot better. But I think we have to take this with accountability and responsibility. It's really great to see governments and vendors around the world coming together to provide some type of transparency and a sharing mechanism to reduce vulnerabilities and exploits. Chris, some of the work you're doing on the proactive side, I think that's crucial. We can't live in a reactive world; we have to think about the direction we're going. I think that's really important.
Mike, to your point around responsible disclosure and the laws that protect it, I think that's really critical. Organizations, especially OT companies, should have a responsible disclosure capability with an open-door policy for researchers to provide the information they find, and protection, as in a whistleblower scenario, for what they're finding and discovering. So Chris, any closing thoughts? Anything you want to share with our very international audience? Where's the future, what direction are we going in, and what will it be like in a couple of years' time?
Chris Kubecka: Well, I think more and more of us will be working remotely and having that option, which means that more and more things are going to be incorporated into our daily lives. We're going to be looking at more drone deliveries, robot deliveries, things we can digitize to make our lives as easy as possible. It's going to accelerate because of our current circumstances in 2020, which seems very determined to try to kill us. So we're going to see a very fast acceleration. You had mentioned buzzword bingo: I would be more concerned with the security of that acceleration of the OT world than I would be with artificial intelligence, Bitcoin, and quantum computing. Did I miss a buzzword?
Mike Gruen: Machine learning.
Joseph Carson: Machine learning.
Chris Kubecka: Machine learning, well it's part of artificial intelligence.
Mike Gruen: It's part of artificial intelligence, right.
Joseph Carson: Quasi prime numbers.
Chris Kubecka: Oh, yeah. Forgot about them from last year.
Joseph Carson: Snake oil. That was... I still love them. Sometimes when I need a good laugh, I go back and watch it again.
Chris Kubecka: Oh, it's scary hilarious. Yeah, that was something. But there are certain things obviously we do need to be concerned about. But realistically we need to be concerned about the things that touch right now our everyday lives, and that is different types of OT.
Joseph Carson: Absolutely. I did see something recently. We have the automated deliveries, the Starship delivery robots, and we also have the smart connected scooters. There was a great image, I'll have to share it with you, of them fighting with each other. It was almost like Robot Wars, two devices going at it. Funny. So Mike, yourself, any closing words, anything you think we should have as a reminder or summary?
Mike Gruen:
No, I think we've summed it up pretty well, but I do think there's a lot of responsibility on companies to set up, as we said, responsible disclosure programs, to do testing, and to bring security people into every step. It's not enough to just rely on your supply chain to say, "Yeah, we've got it secured," or, "Here's the documentation." I've looked through plenty of SOC 2 reports and been like, "Okay, that's all well and good, but it doesn't explain how specifically you do this, this, and this, and I have questions and concerns that are not addressed here."
So you can't just say, "Oh, well, we have all the documentation, we have everything we need." There need to be people really looking through that and advising all the way across: technologists and security people, that cross-section of security technologists who look at things and ask, how does it break? As an engineer, I know how to build things really well. I've found most software developers are good at building things; very few are good at thinking about how to knock them down, or how they might get knocked down, and so forth. So there's definitely a different mindset, and getting those people involved in these discussions and decisions is critical.
Joseph Carson:
Absolutely. Collaboration and communication, and privacy and security by design, are the crucial things we need to get to. Awesome. It's been a pleasure. This has been a fantastic show, and hopefully our listeners have been educated and had a lot of fun. Tune in; sometime soon we'll have the buzzword bingo for sure. We'll definitely … for the buzzword bingo and see who wins. Maybe we'll have a little prize at the end of it, and maybe the audience can even join in. But again, it's been a pleasure having you both. Chris, it's been amazing to chat with you and get your insights; it's been very educational for me. This show comes out every two weeks.
So tune in, subscribe, and find a way to get this podcast automatically onto your phone or whatever device you prefer. This is going to be one of the first in a series on OT; we will do another … shows. We'll get into things like elections and other really interesting and fun topics. But again, Chris, many thanks for sharing; we hope to have you on again soon and chat more with you. It's been an awesome time, and hopefully the audience has had lots of fun and been educated.
Chris Kubecka: Happy, happy fun time.
Mike Gruen: Hey Carson, great talking to you, Chris, I always enjoy it.
Chris Kubecka: Great talking to you.
Outro: Learn how your team can get a free trial of Cybrary for business by going to www.cybrary.it/business. This podcast is also brought to you by Thycotic, the leader in privileged access management. To learn more, visit www.thycotic.com.