Episode 89

Smart Hacking with Ken Munro

EPISODE SUMMARY

Hear how hackers target everything from airplanes to talking dolls. Pen testing expert Ken Munro discusses ways to close security gaps and protect embedded systems and connected devices.


Joseph Carson:

Hello everyone. Welcome back to another episode of the 401 Access Denied podcast. I'm the host of the show and my name is Joseph Carson, Chief Security Scientist and Advisory CISO at Delinea, and it's really fantastic to have you listening to today's episode. I'm always looking for the most amazing, awesome, talented and fun people in the industry. And today we have a very special guest, somebody I have known for quite a long time now, going way back to our shipping days on the maritime industry side of things. So welcome to the show, Ken. Ken, if you want to give our audience a bit of background on what you do, what fun things you get up to, and your background.

Ken Munro:

Sure. My name's Ken Munro and I'm part of the team here at Pen Test Partners. One of the things that we're particularly interested in as part of our pen testing is embedded systems, whether that is, I don't know, the navigational system on a ship's bridge, or an engine controller on a car, or something connected in your home. It's all about embedded systems, so understanding how technology that we hope is making our lives smarter and better actually could lead to some pretty serious security problems.

Joseph Carson:

Absolutely, and the theme of today is all about hacking smart, or hacking smart devices, in reality. One of the things I always get into is the question, and this goes back to KO's book and other things: when we talk about IoT, what makes a device smart? What's the difference? We call them smart devices, we call them IoT devices. What really makes them smart, and are they really smart? Is that the right term we should be using?

Ken Munro:

Well, I like the term connected, actually. We use the term smart, but of course smartphones are connected and so is a laptop. So there's actually been some really interesting negotiation and legal wrangling around certain bits of regulation over the last few years about what a smart product is. But I think for the purposes of this, we like to think about connected devices, devices for remote access and telemetry, stuff that's now connected like our CCTV, so we can see what's going on remotely in our house. We've got remote access, it's connected, it's smart. Maybe that's the way to look at it.

Joseph Carson:

I agree. For me, it's always about connectivity, but the question is what really makes it smart? And you're absolutely right, it really just means that we can do things in a much more automated way. We can do things when we're not present. We can keep an eye on things, we can get alerted when an alarm goes off. We can control devices, or maybe you forgot to do something, or maybe you just want something to operate, like you want to be able to do the laundry when you're away, or you want to be able to control the heating system so you turn it on when you're getting close to home. So for me, it's really about that connectivity and the ability to do that in an automated way, to make our lives better in the end. But for these devices, what are some of the most interesting vulnerabilities that you've seen? You've been doing this for a long time, and I've seen your sessions quite often. What are some of the most fun ones you've come across?

Ken Munro:

Well, yeah, I think there's definitely a difference between fun and scary, isn't there? We've seen everything over the years. We first started looking at smart stuff, I suppose, in 2014, 2015, when the industry was just getting going. I think the term Internet of Things hadn't long been coined at that point, and everybody was trying to get to market really fast, because you get first-mover advantage and you've got a certain amount of seed funding to burn through. So there was a lot of rushing going on, and as part of that rushing, a lot of cybersecurity controls just didn't even get thought about. A lot of the firms that were rushing to market were typically mobile app developers, so they were used to building something and updating it later, which is easy in a mobile app but actually really difficult in a smart device if you haven't thought about it first. There are some great stories of crazy devices that just didn't get it right and ended up falling flat on their face. I've got a few of those here with me today.

Joseph Carson:

You've got quite a few. I can see the doll in the background there.

Ken Munro:

Should we do the dolly?

Let's do the dolly. To me, she is everything that was wrong and right about smart devices, and the story of this lovely little dolly, My Friend Cayla, goes on and on. She's still going now, amazingly. For those of you who haven't seen this wonderful dolly, she's called My Friend Cayla, and she hit the market in about mid-2015, I think, and she was the first interactive kids' dolly. Really good idea. You might remember Teddy Ruxpin, the smart version of him, gosh, 10-plus years ago. But this is truly interactive and she's quite cool. She's got a speaker and a microphone and a Bluetooth connection, and you can talk to the dolly. The microphone listens to you, it then takes the audio and offloads it over Bluetooth onto the smartphone, where it turns the audio into text and then can look up what you asked it and come up with an answer.

Bluetooth, speaker, microphone. Yes, she's essentially a Bluetooth headset, so I kid you not, you can genuinely pair it to your phone and you can actually make phone calls on the dolly if you wish. It'll get you some very weird looks. But anyway, what was wrong with her? Well, the Bluetooth connection. When you pair your smartphone with your car, you're asked to compare two numbers. It's called numeric comparison; it's a component of Bluetooth Low Energy. She, however, uses Bluetooth Classic, BR/EDR, and unlike your car, she's got no PIN. And what that means is there's basically no authentication or authorization of any of your requests to the dolly, which means that anyone within Bluetooth range can connect to her. And of course that's your phone connecting to the microphone and the speaker. She's effectively a spying bug, while your child is merely sat there in the house.
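To make concrete what "no PIN" means, here is a minimal sketch, assuming Linux with a Bluetooth adapter and using a placeholder address rather than the doll's real one, of how little an attacker in range needs in order to open a channel to an unauthenticated Bluetooth Classic device: there is no pairing prompt to refuse the connection.

```python
import re
import socket


def looks_like_bt_addr(addr: str) -> bool:
    """Sanity-check a Bluetooth MAC address like 'AA:BB:CC:DD:EE:FF'."""
    return re.fullmatch(r"(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}", addr) is not None


def open_rfcomm(addr: str, channel: int = 1) -> socket.socket:
    """Open an RFCOMM channel to a device that enforces no PIN.

    On a device like the doll there is no pairing step to stop this:
    any radio in range can simply connect. Requires Linux with a
    Bluetooth adapter; the address passed in is the target device's.
    """
    if not looks_like_bt_addr(addr):
        raise ValueError(f"not a Bluetooth address: {addr}")
    sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                         socket.BTPROTO_RFCOMM)
    sock.connect((addr, channel))  # no PIN, no numeric comparison
    return sock
```

The only thing standing between a device like this and an eavesdropper is radio range; numeric-comparison pairing, as used when you pair a phone to a car, would force both ends to confirm a code before that `connect` call could succeed.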

Joseph Carson:

Right. And that's about a 20 to 30 meter distance for the Bluetooth range.

Ken Munro:

Yeah, you get more than that with a clear line of sight, but yeah, you're talking the flat next door, the neighbor's house, the street outside.

Joseph Carson:

For most people, that's a good distance away from your house when you think about it, that people can connect from.

Ken Munro:

Yeah, who's the last person you want hearing what you're saying about them? Your neighbors. Right, exactly.

So this was great fun, and actually we tried to tell the manufacturer about the problems they'd made. We were nice about it, but they just ignored us. They weren't interested, they didn't have any understanding of cybersecurity, they didn't have any money for cybersecurity. So in the end, reluctantly, we took it to a journalist at the BBC who was very interested in it, and then we finally managed to get them to respond, and in the end the dolly was taken off the market. The company seems to have disappeared, which is a real shame, because I thought the concept of an interactive talking kids' doll was a nice idea, and it didn't take much to do it securely, but they didn't, which is a real shame.

Joseph Carson:

The thing is, even when you go to market fast, to your point, you don't really think about how to update it later. And I think that's where many organizations struggled: how do you apply those security updates to a device that only communicates through Bluetooth? If they were to do this, they'd probably have to do basically a recall, and in some instances you would actually have to update the embedded system itself. And that's one of the big things I've seen over the years, especially for organizations who were really quick to enter the market: they didn't really think about that ongoing long-term maintenance. How do I keep it updated? I even saw light bulbs having to have firmware updates, and how do you update that light bulb? You have to unscrew it, plug it into your laptop, flash the firmware, put it back in again. There are a lot of steps along the way. Organizations were going fast but not really thinking about how to maintain these devices long term. You sell it and you forget about it, but these are devices which you can't sell and forget about. You actually have to think about the long-term side of things. What's the sustainability? And to your point, I think that's why a lot of these companies struggled in the early days.

Ken Munro:

Yeah, they rushed, and there just wasn't any mechanism to do a firmware update on this dolly, so there was no way to fix the Bluetooth bug. Amusingly, we actually went a bit further. We went into the mobile app and reverse engineered how she operated and discovered that, well, she had this lovely feature that if the child swore at the dolly, it would tell them to go and speak to their parents, which was kind of nice. But there was a logic flaw there: hang on a minute, if she knows that's a swear word, she must know swear words. So we unpacked the mobile app, discovered a SQLite database, and then discovered all the swear words she was looking for, over 1,500 of them, really good swear words as well. So we just deleted them, recompiled her, and now she swears like a docker.
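That trick is easy to reproduce against a toy version of such a block list. The schema, table name, and (tame) words below are invented for illustration; only the idea comes from the story: a client-side SQLite list that can simply be emptied.

```python
import sqlite3

# Recreate a toy version of the app's banned-word list. The names,
# schema and words here are invented, not the real app's.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE banned_words (word TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO banned_words VALUES (?)",
                 [("darn",), ("blast",), ("heck",)])
conn.commit()


def is_banned(word: str) -> bool:
    """The doll-style check: is this word on the block list?"""
    row = conn.execute("SELECT 1 FROM banned_words WHERE word = ?",
                       (word.lower(),)).fetchone()
    return row is not None


print(is_banned("darn"))                  # True: triggers "go talk to your parents"
conn.execute("DELETE FROM banned_words")  # what the researchers did
conn.commit()
print(is_banned("darn"))                  # False: the filter is gone
```

Because the filtering happened entirely on the client side, anyone who could repack the app controlled the filter.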

But it's back to your point, though: the manufacturer rushed to get the product to market and hadn't thought about dealing with any sort of update, or even functionality updates, in the future. So yeah, that all came crashing down. She got banned in Germany. Then a bunch of European countries also banned her for privacy violations, and it's still going on. She was cited as the catalyst behind an IoT cybersecurity law in California in 2020. That's the reason why. So it's kind of nice. Companies got it wrong, regulators realized, and regulation is often a bit behind in terms of time, but she's inspired change, which was great.

Joseph Carson:

That's always fantastic. These examples really show that when organizations do make mistakes, we can learn from them. And I think that's really important: we actually have to make sure that we're learning from all these different types of lessons. Especially when I talked about the light bulbs: out came the bridge, the hub which allows you to connect to them, and that was the way to then maintain and update firmware, and also to prevent unwanted light bulbs from connecting to your network, because that also becomes a problem. So they do learn from those, but it has been a long journey. It feels like it's been slow moving, but of course with embedded systems it does take time to improve and to update and to make revisions. And what are some of the differences? What other types of devices have you seen that have surprised you? What's the one...

Ken Munro:

One that irritated me, actually, was this one. It's a Fisher-Price device, so it's a big brand. They're Mattel, right? And this product, do you remember these Chatter Telephones you used to have as a kid, made of wood? Well, they've now made them smart, so you can buy them. It's a bit of a gimmick. They sold extremely well last year. But what irritated me is they had fundamentally the same security vulnerability as Cayla had six, seven years before.

Joseph Carson:

Seven years ago. That was...

Ken Munro:

2015.

Joseph Carson:

Yeah.

Ken Munro:

We talk about organizations learning, and many have. If you look at Google Nest, Hive, they've really learned, they've really progressed their cybersecurity. But what really irritated me is that someone with the scale and funding of Fisher-Price and Mattel made the same mistake as My Friend Cayla, and then didn't respond well when we reported it to them. So fundamentally, this is just the same bug, it's just a different format. But anyway, that's another story. So things do progress. Another crazy example: you might remember my smart kettle.

Joseph Carson:

I do remember your smart kettle. Yes.

Ken Munro:

I'm a Brit, so I like my cup of tea in the morning, of course. So you've got to have a kettle to boil your water. This was a fun one. It's actually a Wi-Fi connected kettle. Now, all the smarts aren't in here, they're actually in the base. There's a Wi-Fi module that connects to your home Wi-Fi network. But there were a bunch of vulnerabilities that meant you could do a deauthentication attack, you could then connect to the access point in here, and you could recover the customer's Wi-Fi key, the Wi-Fi key that gets you onto your home network. Now, that's less of a problem now on a home network, because much more of the traffic is encrypted by default, but when this first came to market, most home Wi-Fi network traffic would be in the clear, plain text. So you'd have someone's Wi-Fi key stolen from their smart kettle. And again, similar problems: they couldn't update it remotely. But we talked about learning and improving. Actually, the manufacturer behind this, a company called Smarter, this was their version one. They did a version two, which was a bit better. It still had problems. But for version three they actually worked with a specialist cybersecurity platform, and the latest version, the 3.0 version of this, is actually a really good way of boiling your water remotely and securely. So things do progress, which is good.
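The core failure here is easy to model: a configuration service that answers anyone who connects, with no authentication step before it hands over secrets. The sketch below is a toy model of that pattern, not Smarter's real protocol; the class, command names, and key are all invented.

```python
# A toy model of the kettle flaw: the configuration port answered any
# connected client, with no login step, so one command returned the
# stored Wi-Fi key in plain text. Names here are invented.

class ToyKettle:
    def __init__(self, ssid: str, psk: str):
        self._ssid = ssid
        self._psk = psk          # stored in the clear on the device

    def handle(self, command: str) -> str:
        """Answer a config command from ANY connected client."""
        if command == "AT+SSID?":
            return self._ssid
        if command == "AT+KEY?":          # hypothetical command name
            return self._psk              # no auth check before this
        return "ERROR"


kettle = ToyKettle(ssid="HomeNet", psk="hunter2-wifi-key")
# An attacker who deauths the kettle, stands up a rogue access point,
# and connects to the config port can then just ask for the key:
print(kettle.handle("AT+KEY?"))   # prints: hunter2-wifi-key
```

The fix is equally simple to state: require authentication before any command that discloses or changes configuration, which is essentially what later revisions of such products did.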

Joseph Carson:

It's great to hear the progression of that, because one of the things, when we talked about the updating side, is that sometimes there wasn't even an external port in many of these devices to plug into. And that creates the problem of how you actually connect, modify and update it without having to dismantle it and take it apart. And it's great, I mean, there are some things, of course: you want to boil your kettle on a timer, you want to do it remotely. There are great things about making our lives much better. But what's the concern? I mean, there's a big difference here between the consumer side of things, where of course it seems to be all about getting access to somebody's home network, maybe their data, maybe the devices, so pretty much a big privacy issue there, versus a company. So what's the difference between the risk from a consumer perspective with these devices in the home, versus a company? I could see a company even having the kettle in their office as well.

Ken Munro:

That's interesting, actually. So we're starting to see a lot of movement towards connected buildings, because everyone's being pushed, and rightly so, towards being more environmentally sustainable. Our buildings and offices are big consumers of resources. So to have a connected building that intelligently works out when it's going to be warm during the day, makes sure that the windows are closed if the air conditioning's on, makes sure you're not heating it up too much in the morning, that's a great idea. It saves a huge amount of money and reduces carbon dioxide emissions. So things like connected thermostats you'll see in offices, things like elevator controllers to make them more efficient, things like HVAC controllers, so we're not wasting energy heating or cooling our buildings. And we're seeing very, very similar security flaws in those. I've actually brought along, this is a smart thermostat. It's for both commercial and residential use, and we found vulnerabilities in this before we'd even bought it. So one of the great things about smart devices is they have to have radio frequency accreditation, with the FCC in the US or CE in the EU, which means that as part of the accreditation process, you can pull the circuit diagrams or the PCB photographs for free from the FCC's website, which is great, because it means you can start finding, like you said, the various input and output ports, which might get the firmware off the chips for you. This one was quite fun, actually.

Joseph Carson:

You might find some ports in there, or...

Ken Munro:

Yeah, absolutely right. So actually, you can probably just see there, there's actually a JTAG connection there. So we were in before we'd even bought the device. So that's quite fun. And this one we bought, and we did a talk at DEF CON, I think in 2017. We'd seen an episode of Mr. Robot, and also some wags on Twitter suggesting that ransomware was happening, "oh, my thermostat's been ransomed", but actually it was just someone joking. We thought, hang on, that's a challenge. So my colleague Andrew Tierney, or Cybergibbons, worth a follow on Twitter if you want a laugh, Cybergibbons is great.

I challenged him to write the first ever ransomware for an embedded system. And do you know what? He did it in three days. He successfully took control of an embedded device. So this isn't off-the-shelf ransomware on Windows. This is ransomware running on an embedded system, which is a completely different challenge. And we did that as a proof of concept and presented it at DEF CON in '17. And yeah, I think it really got organizations thinking: it's one thing losing access, but losing control of my office, and now someone, I dunno, unlocking all my doors or setting my fire alarms off or setting the sprinklers off? That's a very different ballpark, losing control of the office.
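The essence of that proof of concept can be sketched harmlessly as a state machine: once compromised, the device ignores its owner and holds a setting hostage until a code is supplied. Everything here (the class, the unlock code, the setpoints) is invented for illustration; it shows the control-loss idea, not the actual DEF CON payload.

```python
# A toy model of ransomware on an embedded controller: after "infection"
# the control loop ignores user input and pins the setpoint until an
# attacker-held unlock code is entered. All names/values are invented.

class ToyThermostat:
    def __init__(self):
        self.setpoint_c = 21.0
        self.locked = False
        self._unlock_code = "pay-up-1234"   # attacker-held secret

    def user_set(self, temp_c: float) -> bool:
        """User input is honored only while the device is not held."""
        if self.locked:
            return False
        self.setpoint_c = temp_c
        return True

    def infect(self):
        self.locked = True
        self.setpoint_c = 35.0   # cranked up to pressure the victim

    def unlock(self, code: str) -> bool:
        if code == self._unlock_code:
            self.locked = False
            return True
        return False


t = ToyThermostat()
t.infect()
print(t.user_set(20.0))   # False: the owner has lost control
print(t.setpoint_c)       # 35.0
t.unlock("pay-up-1234")
print(t.user_set(20.0))   # True: control restored
```

The scary part isn't the code, it's the asymmetry: on a PC you can reimage the disk, but on a sealed embedded device with no update mechanism, the attacker's state machine may be the only one running.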

Joseph Carson:

Yeah, there was the case a few years ago in a hotel, I can't remember, it was in Austria or something like that, where they had a ransomware case and the ransomware actually disabled the hotel's door key system, so they could not open or unlock any of the doors. So to your point, that's a scary thought when you get into it: this is controlling your access to even the service, and once you can no longer control that, well, there's one thing about the service being unavailable, but when somebody else is making the decisions over what happens, that's quite scary. I mean, would somebody be able to disable, for example in the kettle, the safety features around certain temperatures and stuff like that? Or HVAC, because I know in a data center the servers are usually set to start shutting down at 36 degrees or 40 degrees to prevent overheating and the hard disks from failing. What if you were able to turn up that thermostat to increase the temperature?

Ken Munro:

I'll tell you a story, actually, an embarrassing one as well. The firm of accountants we were using at the time invited us in and said, look, we've got a really early building management system. And we thought, oh, let's have a look at that. And we started looking at it, and it just fell over the minute we started looking at it, at which point all the fans went off in the server rooms, and all of a sudden there were lots of messages from the servers going, I'm getting really hot, I'm getting really hot. And we were like, okay, I think we need to step away. We just reset it and everything came back on. But it makes you realize that the IT infrastructures of organizations rely incredibly heavily on air cooling to keep the servers happy.

Joseph Carson:

I've heard of that. That's one thing: we talk about these air-gapped systems, we talk about these separations and segmented networks, but really, when you think about it, in many cases there is some type of sensor or data being transferred between one and the other. And you get into really thinking about the IT side of things, where of course it's very, very connected and very available, while OT tends to be not so connected, but of course those connections are increasing over and over again, to the point where it is really IT that's now protecting the OT environment. And I even remember cases, you've done a lot on the aviation side, I remember there were cases where, well, the flight control is separated from the entertainment network, but on the entertainment network, I believe, the oxygen masks and the cabin pressure controls sit. So if you were able to fake-deploy the oxygen masks, it would actually force the plane to start descending because of an assumed cabin pressure failure. So all of those things shouldn't get connected. So have you any experience, what are your fun stories from the aviation side?

I really enjoy, you do the Aviation Village every year, and you also set up the flight simulation side of things, which is always fun. But any interesting experiences from that side of things?

Ken Munro:

So it's insanely difficult to hack an airplane. And the reason for that is, we talk about physical separation: the flight control systems are on what's called the aircraft control domain, as in the yokes and the engine controls, and they're all completely separate. So they sit on a separate domain, they're behind physical security, which is great. And actually, it's incredibly difficult to compromise those systems. You'd have to compromise typically two to three networks concurrently with millisecond timing. And there have been some stories in the press about hacking airplanes; they don't really line up with reality. However, of course there is always a however, and that is that airplanes are getting increasingly connected, but it's more the information systems that go to the pilot. So when a pilot goes to take off, they don't really use full power. They don't use full thrust, because it uses a lot of fuel and wears the engine.

So if you've got a nice long runway and a nice headwind, you use half power, for example, if you're not particularly heavily loaded. But the calculations that work that out for you are typically done on a tablet computer. And we found all sorts of problems with some of the apps, some of the tablets, some of the configurations, some of the lack of lockdown, that meant you could feed the wrong information to the pilot so they didn't put enough power on. And what usually happens then is you have a tail strike, where you drag the backside of the plane, and that causes millions of pounds' worth of damage. So you can't hack planes, but you sort of can, in sort of convoluted ways.
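To see why tampered inputs matter, here is a deliberately crude toy of a takeoff-performance calculation. The formula and constants are invented and bear no relation to any real EFB app; they only show the failure mode: lie to the calculator about the weight and it recommends less thrust than the airplane actually needs.

```python
# A toy performance model, NOT a real calculator: thrust needed scales
# with weight and shrinks with runway length and headwind. Constants
# are made up purely to illustrate the failure mode.

def thrust_setting(weight_t: float, runway_m: float, headwind_kt: float) -> float:
    """Return a fraction of full thrust, clamped to [0.5, 1.0]."""
    effective_runway = runway_m + 30 * headwind_kt   # invented model
    needed = weight_t / 70 * 2500 / effective_runway
    return max(0.5, min(1.0, needed))


honest = thrust_setting(weight_t=70, runway_m=2500, headwind_kt=10)
# An attacker feeding the tablet app a lighter weight makes the tool
# recommend less thrust than the real airplane needs:
tampered = thrust_setting(weight_t=55, runway_m=2500, headwind_kt=10)
print(tampered < honest)   # True: under-powered takeoff, tail-strike risk
```

The attack never touches the flight controls; it only corrupts the information the pilot trusts, which is exactly the class of problem described above.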

Joseph Carson:

Yeah, I mean, it's definitely great to know that they have those safety systems. When I think about them, of course the main focus was to isolate them and to make sure there are multiple systems actually providing the data and the controls. So it makes it very difficult.

Ken Munro:

And they do go wrong. They do go wrong from time to time, very, very rarely, but they do go wrong. But what's fascinating is that those sorts of stories make the press precisely because they're so insanely rare. And I think this is something we can all take from it; it's one thing I love about aviation. I'm a pilot, by the way, not a very good one, but I'm a pilot.

Joseph Carson:

Well, you did help me land the... so I would say you're a very good pilot.

Ken Munro:

Practice, practice. What I love, and I think we could all learn from this in general cyber as well, is that in the aviation industry, if anything goes wrong, everyone shares. Everyone talks about it, and we all learn from it without attributing blame. And that is such a powerful thing that I think we could all take away on the ground: the more we share, carefully redacted and anonymized, about attacker activity, the more likely we are to be able to help others defend. So that's the thing I love about aviation. If there's a plane crash, or even a minor incident, we talk about it, we write it up, it's investigated, blame's not attributed, and if more training's required, it's given. We could take a lot from that on the ground.

Joseph Carson:

I completely agree, because the industry is very segmented: one half seems to jump in to help, and the other half tends to point fingers of blame, saying, we told you so. I'd like that to change. I would like to see us all come together. And to that point, share responsibly, making sure the information's there and not pointing blame, because in a lot of cases we don't know the fine details. You're only finding out what's publicly available, which isn't all the information.

Ken Munro:

There are some really interesting communities in the USA as well, actually. There are what are called the ISACs, the Information Sharing and Analysis Centers, set up during, I think, the Bush administration, the first one, I think. And they're actually partly government funded, and they encourage and enable industry to share intel in a safe way. So effectively you've got competitors sharing cyber intel with each other. That's wonderful. And I think there's a lot we could take from that in Europe too.

Joseph Carson:

I agree. So on the government side of things, I know that there's been a lot of regulation discussion. I know in the EU there was the IoT discussion, I know in the UK they talked about getting rid of default credentials and passwords and all of that, and different regulations. And there are also laws in the US looking into a lot of these changes as well. What do you think: will regulation and government oversight and policies and compliance make a difference? Will it change how we do things?

Ken Munro:

So I've never been a huge fan of regulation per se, because it's always too little, too late. But I think IoT is a bit different, because you're asking the consumer to make an informed decision about purchasing a product based on cybersecurity that they understand. And that's, I think, where regulation can and will help in this particular space. Regulation took a long while to come into effect. So one of the first we had was the IoT Cybersecurity Improvement Act in the US a couple of years back, but that only affected federal agencies. It basically said that if a federal agency was buying something smart, it had to be secure. I'm not sure that had quite the effect that everyone hoped it would. In the UK, the Department for Digital, Culture, Media and Sport went and consulted and came up with some clauses in the Product Security and Telecommunications Infrastructure Bill, which is a mouthful, and that's just come onto the statute book, which is lovely.

But like you said, it talks about default passwords, but most importantly, it talks about length of support for a product. And I think that's a really important point for a consumer. How long are you going to support my product for? How long am I going to get security updates for? How long is it going to last? Because there are all sorts of products. I mean, Cayla doesn't work anymore. Actually, I've got, what is it, one of the connected teddy bears up there that was just end-of-lifed when the manufacturer went bust. So now you've just got an expensive teddy bear.

Joseph Carson:

But that's a great point, because years ago, and I've been in this industry for a long time like yourself, you expected devices to be around for quite a long time. They usually had a five-plus-year lifecycle, and they were independent of anything else. You'd have games consoles, telephones, laptops, computers that all had this longevity. And even on the industrial side, we've seen devices in maritime which have been around for 20-plus years. When you start seeing the Windows 95 logo, you start going, well, okay, what can you do? So some devices are running for quite a long time, but I think it's really important, to your point: what's the lifecycle? What's the life expectancy? That can also allow you to make more informed decisions. If you were planning something for longer, you might decide to go to another vendor to get what you really need, to make sure that you're getting the updates and the ongoing security and privacy and all of those things.

Ken Munro:

To your point, though, there was a really interesting press discussion. I think it was Sonos, the connected speaker manufacturer. Do you remember, they openly said that they were going to end-of-life a product, and I think they gave three years' notice or something, and the world lost its shit. It was incredible. But actually, kudos to them, because they were honest and open about it. So many other vendors have just gone, no, it's gone, right? It's been deprecated, we've got 2.0 now. And I think it's really important that people are open and say, look, we are going to accept the cost of supporting and updating this product for the next three years, that's how long it's going to be. Okay, I'm now informed. Unlike, I think Google acquired a firm called Revolv, it was a hub they bought, and they end-of-lifed it almost immediately, and in the end, I think Consumer Reports in the US made a complaint, and then a US agency actually enforced it and forced Google to compensate customers.

Joseph Carson:

It's great to be informed about what the life of this is, because a lot of times you're paying for the hardware and you're paying for the software, and then all of a sudden you find that it becomes useless. And it also gets into the sustainability side of things, because we are creating a lot of technology waste as a result of this. If it can't be reused, what's the recycling capability after that end of life? Are you actually taking the devices back and repurposing and reusing them? So you also get into the whole green and sustainability discussion, which is also a massive area, especially around IoT devices.

Ken Munro:

We are the worst offenders there, I might add, the worst offenders of anybody I know, because in order to do this research, we have to buy loads and loads of IoT gear, and most of it gets taken apart, and during that process it'll often get damaged beyond repair as we're trying to extract memory and firmware and things. So we had several dumpsters of IoT waste that we had to send off to get recycled in the last week alone. It was enormous. But yeah, don't take us as a good example.

Joseph Carson:

But still, it's part of that research side of things that's really identifying this, which in the long term can actually ultimately make things better. Where do you see the future of this going, in connected devices and vulnerability disclosure and regulation? What direction have you seen organizations in this space taking?

Ken Munro:

So an interesting thing happened a couple of weeks back. The Biden administration, I think it was an executive order, announced smart product cyber labeling in the US, which I think was a huge step forward. My only concern about it is how you correctly or usefully label a product to convey its degree of cybersecurity. How do you assess it? How do you communicate it in a way that the average Joe can walk into Best Buy or Target and go, yeah, oh yeah, I'll buy that one because it's got an A-B-C rating, or it's green? I dunno. We've battled with the concept of consumer labeling for a long time, trying to think of a way to easily convey that it's okay, and it's not easy. I believe NIST is heavily involved too, and I wish the team that's doing it the best of luck solving that thorny problem.

Joseph Carson:

I mean, we struggle with that in security in general around risk assessment, and we have lots of different labels and acronyms for all of that. But the problem is, I guess it really comes down to this: if the price difference is so big, do you really think consumers, given that choice at the price point, are really going to choose the more private, secure, longer-lasting device? I think that sustainability piece, how long will this be supported for, might be a bigger impact than the security and privacy portion.

Ken Munro:

What I'd like to know is, is this going

Joseph Carson:

To be a five-year thing, or is it going to be like a six-month thing?

Ken Munro:

I wish I hadn't, but we've just thrown away two identical CCTV cameras. The form factor was identical, one was half the price of the other, and one was hackable. How do you as a consumer tell? And that's why I actually like the idea of some regulation to slightly increase the barriers to entry to a certain market, so that the manufacturers who genuinely take cybersecurity seriously and really handle their responsibilities with our data properly aren't undermined and undercut by an X, Y, Z product that does the same things but really, really badly and will be unsupported after six months.

Joseph Carson:

I think that unsupported side of things might be the big driver. I think that's where people might make the decision, because if I have to buy it again every six months, that's going to drive up the cost. If I can buy this and it's going to last for five years, it's more attractive; you will pay more for that, and you get that security side of things as well. But you're absolutely right, you need some visibility: whether it's got privacy built in, whether it's sustainable from a security perspective if I'm going to plug it in. I think that's really where regulation comes in, not only on the labeling side of things, but also in making sure that these devices have a baseline of security built into them, that they meet a minimum standard and best practices. We don't want the whole world to become cybersecurity experts, because that's not going to happen, but we want to make sure that whatever people are buying has gone through a really good set of basic security practices, so it's something they can just buy with confidence: if it's on the shelf, it's already been through a certain level of certification, checking, and validation. That can make a big difference, I think.

Ken Munro:

Hell yeah. I think you were also asking about vulnerability disclosure. What's been really challenging for us is that every time we find a bug, we do what we believe is the right thing: we try to get hold of the manufacturer, tell them nicely, give them some tips about how to fix it, and give them a timeline. Just talk to us, we'll give you some guidance, it's not going to cost anything, we'll point you at some third parties who can help you. But most of the time we get stonewalled.

Joseph Carson:

Really? I mean, we've had discussions about this before. We had Casey Ellis on a few times talking about vulnerability disclosure, and we had Katie Moussouris, and that was always fun. From the software perspective, I think they are making a lot of inroads and a lot of improvements there. Even Google's Project Zero and its vulnerability disclosure process seem to be working, and of course we'd like to see that adopted more broadly. But with IoT devices, I guess your point is that it's not quite there yet.

Ken Munro:

We still have challenges even in general vulnerability disclosure. I know Casey and Katie very well; we usually end up bumping into each other at DEF CON. One of the areas where I think even bug bounty really struggles is that the motivation for researchers isn't always money. And whilst I think it's fantastic that bug bounties are in place, and it's a wonderful thing to be able to reward researchers, there's a gap to our mind, and that is that not everyone is out there for the cash. A lot of organizations don't have a way to receive a vulnerability report that doesn't involve non-disclosure agreements and cash. And that actually caused us a lot of problems over the years, with vendors saying, well, if we're going to give you money, then we're buying your silence. And we're saying, well, that's not why we're doing this. We're doing it because we'd like to see you improve it.

Joseph Carson:

Yeah, absolutely. I think what I find as well is that a lot of the researchers who do this do it because it's their passion. It's what they enjoy doing; their motivation isn't the money, it's to make the world a safer place. They want to make sure they're adding value to the industry and using their skills for good. There are many out there doing this, and sometimes they are walking a fine line on the legal side of things, because when you look at a lot of these devices, the EULAs and the terms of use really are telling you you're not allowed to find these flaws. And that becomes very complicated. In cases where organizations don't have vulnerability disclosure processes or bug bounty programs, I think it really makes it difficult for researchers to do this properly.

Ken Munro:

It can be really enabling, actually. When a bug bounty is in place and they have defined terms of reference, that can be really helpful in enabling research, giving the researcher safe harbor so they can do the right things and explore. And it works for everybody. It's just that there is a bit of a gap, particularly when you're working around IoT, particularly when you're working with slightly different motivations, maybe slightly more altruistic motivations. It can cause frustration sometimes, should we say.

Joseph Carson:

Yes. I've been in that place a few times, where your head's against the table and it's just, here we go again. Sometimes it feels repetitive, the same process over again. I even remember an incident response case that I worked on. I was doing incident response, looking at digital forensics, and I found an organization that had become a victim of the same ransomware, because the criminals were simply copying and pasting the scripts, and they had copied and pasted the previous logs of the scripts. In those logs it had captured all of this organization's data, including the usernames, passwords, server names, IPs, the whole thing was in there. So I got permission to go and contact them proactively and say, hey, I'm working on this case, and I found that you've become a victim as well. And their response was, no, we have not.

Ken Munro:

Nothing to see here.

Joseph Carson:

It was just the brick wall coming up, the door closed, slam, don't come back. And I went back again, saying, maybe you haven't realized you've become a victim, maybe they are still in your network, and this is an opportunity to stop it. And the door closed. That was it. About two and a half months later, when I had to close everything up, archive everything, handle the chain of custody, and hand over all the files to the legal team and law enforcement, I said, I'm just going to do the right thing and let them know that I'm passing on the data that includes the information I had. So I told them, hey, just to let you know, I'm passing on the files, which suggest you're a victim, to law enforcement and the legal team of the company. And immediately afterwards they responded.

Once they knew that I was passing it over to another agency and law enforcement, they responded. They said, yep, we were a victim, we tried to clean it up, we hid it, we didn't want it known. And that's when we started realizing that in a lot of these cases, and it's very similar in vulnerability disclosure, even though here it's about disclosing incidents, most want to stay silent. They don't want the PR, the public backlash. Because again, to your point, the difference between our industry and the airline industry is that there's a lot of victim blaming, and that carries a lot of negativity. If we want people to disclose without fear, even anonymized, we have to make sure we address the victim blaming side of things as much as we can and work together on that.

Ken Munro:

There's an interesting flavor we've had on that with disclosure, particularly around IoT: someone's put their blood, sweat, tears, and finances into building something and they've got it to market, and then some sodding security researcher comes to them and goes, there's a problem. And it's like being told there's a fault with your baby. That can be very, very hard for someone to hear. Anybody would struggle to hear that their project, their love, everything they've done for the last few years, has got problems. And I think being sensitive to that is really, really important. It can be very easy for a security researcher to come in gung-ho and go, you've got a problem, and we're going to tell the world about it if you don't behave. And actually the first natural human response is probably aggression. So one needs to take real care with the way one does that.

Joseph Carson:

Yeah, you bring up a really important point, and this is probably where we can do some training around this type of thing: how to disclose responsibly. It's not pointing fingers; it's, let's help make your baby better together, let's help address the issues. And I think there's a way of dealing with that. I even just saw somebody doing a bug bounty for another organization recently, and I saw the message; there is a way of doing it ethically, with, what's the right word, with empathy. We're not pointing fingers and saying, you did a bad job. What we're saying is, you did a great job, but let's make it safer, let's make it better.

Ken Munro:

A new term: empathetic disclosure, as opposed to ethical disclosure. I like that.

Joseph Carson:

It's a good topic for a future talk. But Ken, as always, it's been fantastic having you on, and your knowledge, your insight, and what you do for the industry are definitely making the world a safer place. I'm always excited about the future revelations and details you'll be sharing, so I'll be looking forward to catching up with you at DEF CON this year, and we should definitely grab a drink together. It's been fantastic having you on the show. Any final words of wisdom for the audience? Are there any resources you'd recommend people go to that can help them find more information about this?

Ken Munro:

Wow, okay. If you're looking for advice around IoT, we wrote some development guides a little while back, just the questions to ask early on, more than anything. In fact, for anything to do with cyber, particularly IoT, get a bit of guidance on day one. It's so much cheaper and faster to build cybersecurity in at the beginning: the right choice of microcontroller, the right choice of storage and memory, the right PCB manufacturer. It's so much cheaper to build cybersecurity in at the beginning than it is to try and retrofit it after you've already been to the fab and had your PCBs designed. So starting early is cheaper.

Joseph Carson:

I completely agree; changing the Bluetooth module much later is a very costly thing to do. And that really gets to the point that we have this whole shift-left concept and we have security by design. Ultimately, I believe the next step after that is getting to security by default; that becomes the ultimate goal, making sure that it's not only secure by design but that people can actually use it easily as well. And that's an important factor. Ken, it's been fantastic, as ever. I always really enjoy talking to you. I'm looking forward to flying the next simulation as well at some point. I did miss InfoSec this year, so unfortunately I wasn't there. But

Ken Munro:

We'll be there, at the Aerospace Village at DEF CON this year. So if anyone's listening, come along for a little fly. It'll be great fun, and it'll be good to see you there too.

Joseph Carson:

So for everyone, tune in every two weeks for the 401 Access Denied podcast. I hope this has been very educational and insightful into all things smart, connected, IoT, and embedded systems, and has given you a bit of insight into that world, because it's definitely something that's going to impact our lives on a daily basis and make our lives better. But we also want to make sure that it stays safe, that it's secure, that it's private, and that we have the knowledge and the visibility to make the right decisions. So Ken, it's been fantastic having you on, and I look forward to catching up soon. For the audience, see you again. Take care and stay safe. Thank you.