Episode 68

Know Your Hackers' Rights with Chloé Messdaghi

EPISODE SUMMARY

Ethical hackers often walk the line when it comes to legalities. How do they tell which side of the law they’re on? Join hosts and fellow ethical hackers Joseph Carson and Chloé Messdaghi as they get into hacker safety tips including copyright laws, licensing agreements, and vulnerability disclosure policies.

 

Subscribe or listen now:  Apple Podcasts   Spotify   iHeartRadio

Hello from Cybrary and Delinea, and welcome to the show. If you've been enjoying the Cybrary Podcast or 401 Access Denied, make sure to like, follow, and subscribe so that you don't miss any future episodes. We'd love to hear from you. Join the discussion by leaving us a comment or a review on your platform of choice, or emailing us at Podcast@Cybrary.it. From all of us at Cybrary and Delinea, thank you and enjoy the show.

Chloe Messdaghi:

You're not ever going to have an attacker knock on your door and be like, "Hey, I found a vulnerability here." It's like, when they find that vulnerability, they're going to go running with that vulnerability. That is what they do. So I mean, if you have any hackers that reach out like, "Hey, I think I found something here," that's the one who's actually doing you a favor, not coming after you. Because if they're coming after you, you would know.

Joseph Carson:

Hello, everyone. Welcome to another episode of the 401 Access Denied podcast. I'm Joe Carson, co-host of today's episode, and I'm the Chief Security Scientist and Advisory CISO at Delinea. And I'm pleased to be joined by the awesome Chloe. So Chloe is my co-host for today's episode. Just give us an introduction about yourself.

Chloe Messdaghi:

Sure.

Joseph Carson:

You can tell us a little bit about today's theme. What's the topic for today that we have?

Chloe Messdaghi:

All right. Well, hello, everyone. My name is Chloe Messdaghi, and I'm the Chief Impact Officer at Cybrary. Before I joined Cybrary, I was deep in the world of hacker rights. And if you're wondering, "What's hacker rights?" Well, let me tell you, it's about how we don't have rights most of the time. And that's what we're going to talk about today: how do you go about protecting yourself when there isn't any legislation out there that really protects you fully?

Joseph Carson:

Absolutely. And that's always a challenge. It's always important to know. One thing I find is that I always prefer calling myself an ethical hacker, because that makes the difference in my motives clear. My motives are always about making the world safer. It's about making it safer for people to do the things they enjoy doing in society, having fun and doing their jobs. So I'm always looking at it from an ethical perspective.

And when you're doing hacking, it's so important to understand what your rights are and what you can do legally. I also travel quite a lot, so I have to know the laws in different countries and societies, what rights I have, and what I can and can't bring with me. So it's really important to understand how far you can go, where that line is, and how you stay protected. And not only the rights, but also how I make sure I'm staying safe, because even when you're doing ethical hacking, you become a target as well. So Chloe, I'm interested in anything that comes to your mind in these areas, especially when it comes to the rights side.

Chloe Messdaghi:

Yeah. So I mean, pretty much every country has anti-hacking laws, and then you have copyright laws, and you might be like, "Well, what do those have to do with each other?" There's a chilling effect, basically. Anti-hacking laws are used if you use an application differently than you're supposed to, if you didn't follow the guidelines. But a lot of times, the anti-hacking laws are actually applied to employees at tech companies. We see it all the time, where they have access to a certain application, but they use it for a different purpose, and thus the company goes after them for using it for a different purpose.

The thing to note is that you also have the copyright laws, and those basically say that you cannot change what you purchase. So say, for example, you buy a car and you want to fix your car yourself, but copyright laws prohibit you from doing that. Even though you own that vehicle, you don't technically own the rights to change or modify it. And because of that, hackers have a hard time staying safe. So that's one of the things I always try to tell people.

And then if you're going into a bug bounty, for example, or a vulnerability disclosure policy, it's to know what is in scope and what's not in scope, and then, even if you are staying within scope, to keep communication in writing and keep track of all of that in case they were to go after you. Most of the time in the US itself, things have changed and gotten much better when it comes to anti-hacking laws. So we don't really see cases of federal prosecutors going after hackers in the community who were trying to share a vulnerability they found and didn't exploit or anything like that. They did it in an ethical way, we say.

Joseph Carson:

Yeah. Yeah.

Chloe Messdaghi:

But we do see it happening quite a bit at the local level. So your county, your city, and companies use those laws most of the time. That's why we're where we are today, where we don't really have protections as security researchers at all, because it's very hard to change that law. It's used so much by major tech companies that have lobbied hard to keep it in place to this day.

Joseph Carson:

Yeah, absolutely. I think the most recent famous one, the one I'll always remember trending on social media, was the famous F12 scenario. And I think that's a scenario where people sometimes don't understand that it's built-in functionality to look at the source code of a website. If the developer who created that website made data available within the HTML source code, it's publicly available. F12 in a browser is typically there for troubleshooting and looking at the source code, which is publicly available. So sometimes there's a misunderstanding between hacking and built-in functionality, and I think you get a lot of cases where companies don't understand what it is.

Chloe Messdaghi:

Yes.

Joseph Carson:

Or they don't understand, basically, what they've misconfigured. I think it even goes back to the original Weev case. Weev was a well-known internet troll who, I can't remember which telco company it was, simply found that changing the number at the end of a URL on their website unveiled another person's account. And he went through, and of course, the problem is when you enumerate that... When you're creating a new set of data, that's something you always have to be careful with. What Weev ended up doing was he enumerated the data, copied it, and then made it available to a journalist. And what you've done is that you've created a new data set.
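As an aside, the Weev-style flaw described above is what's now usually called an insecure direct object reference (IDOR). Here's a minimal sketch, with hypothetical names and URLs, of why sequential account IDs are trivially enumerable while random tokens are not (real mitigation also requires a server-side authorization check on every request):

```python
import secrets

def sequential_account_urls(base_url, start_id, count):
    """With predictable IDs, one loop enumerates every account page."""
    return [f"{base_url}?acct_id={i}" for i in range(start_id, start_id + count)]

def opaque_account_token():
    """A 128-bit random token cannot be enumerated by counting."""
    return secrets.token_urlsafe(16)

# Hypothetical example: three guessable URLs versus unguessable tokens.
urls = sequential_account_urls("https://example.com/account", 1000, 3)
tokens = [opaque_account_token() for _ in range(3)]
```

The point isn't that random tokens alone are a fix; it's that exposing sequential identifiers turns "view my account" into "view every account" with one loop, which is exactly the new data set being described above.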

And I've also been in this scenario. I got my hand slapped one time by Europol when I took two breached databases and created a link between the two, an index. That's where you start to go over the line of legal boundaries, because in Europol's eyes, I was creating a new breach by combining two data breaches together. That's something you always have to be careful with. You have to really understand where your limits are.

And that's where, when you're talking about going into bug bounties, understanding the scope is so critical. I've even seen situations where you're given a set of IP addresses for a pen test, and it's so important that you enumerate those IP address ranges: who owns them? And I can't say this enough: don't just double check, triple check. Because I've seen cases where you might be given a set of IP addresses, and a few of those IP addresses may not be owned by the company; they might be cloud services. And that raises a whole new rights issue if, from doing something that was scoped on premise, they accidentally give you IP addresses which are cloud-based, whether that's AWS, Azure, or Google Cloud. All of a sudden, you might be targeting shared services, where the company that gave you permission has no accountability and no authority to authorize a pen test against those IP ranges.
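One mechanical way to apply that double check, sketched here with hypothetical documentation ranges rather than real targets, is to verify every target IP against the CIDR blocks the client actually authorized before anything gets scanned (confirming who owns a range still requires a separate WHOIS/RDAP lookup on top of this):

```python
import ipaddress

def in_scope(ip, authorized_cidrs):
    """Return True only if ip falls inside one of the authorized CIDR blocks."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in authorized_cidrs)

# Hypothetical engagement scope (RFC 5737 documentation ranges, not real targets).
scope = ["203.0.113.0/24", "198.51.100.0/25"]

targets = ["203.0.113.42", "198.51.100.200"]
cleared = [t for t in targets if in_scope(t, scope)]
# 198.51.100.200 is outside 198.51.100.0/25 (which covers .0-.127),
# so it gets flagged back to the client instead of scanned.
```

A typo in a single octet fails this check loudly instead of silently widening the engagement, which is the kind of "don't assume" verification being described.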

Chloe Messdaghi:

Yeah.

Joseph Carson:

So I can't emphasize enough the importance of double checking and triple checking everything to make sure that when you're in scope, you really do understand that you are in scope, especially when it gets into those things. The good thing is there was a law that was recently changed. I can't recall the name of the law, but it was a federal law about computer misuse, and it gets down to-

Chloe Messdaghi:

CFAA.

Joseph Carson:

Yeah. It gets down to about the motive behind your intentions. But again, that's very up for debate.

Chloe Messdaghi:

Yeah.

Joseph Carson:

It's not... It adds a little bit of flexibility in that your intended motives were for good reasons, but it's still not a catch-all for ethical hackers. So whenever you're given something via a bug bounty, a pen test, or a red teaming activity, it's always important to go down to the fine detail, understand what scope you have, and verify that those are the intended targets and that there are no typos. I've seen typos also causing lots of issues, cloud services, shared services. So it's really important to double check and verify that what you're given is correct and accurate.

Chloe Messdaghi:

And keep a record.

Joseph Carson:

Don't assume.

Chloe Messdaghi:

You want to keep a record. The case I always bring up is the DJI case.

Joseph Carson:

Yeah.

Chloe Messdaghi:

So basically, researchers came across DJI. They had launched their own bug bounty program, and they stated what's in scope and what's out of scope. But the researchers sent an email to confirm, keeping the communication in writing, and in that written communication, DJI shared, "Yes, these are the following things that are in scope." So they submitted a bug. And then, when it came to a payout of 30k, one of them walked away from it, because the contract itself did not offer any support and protection for him. So he walked away, not taking the 30k.

And what happened was that DJI freaked out. They thought this was going to become a PR situation. So internally, they were having communications in the same email chain they were using to communicate with him. He was able to see internal communication saying, "This person is a threat. We need to go after him." And he just kept silent. And then he got a notice saying that he violated the CFAA, the Computer Fraud and Abuse Act, even though he stayed within scope and double checked and everything like that.

So what he did was publish a blog post with all the written communication, plus screenshots of the internal communications that they forgot they were sending to him directly at the same time. And he got out of it. But not everyone is so lucky in those cases.

Joseph Carson:

Yeah.

Chloe Messdaghi:

So it's one of those things: even if it says it's within scope, double check before you submit any bugs. And if they don't have a vulnerability disclosure policy or a bug bounty program, I highly recommend not submitting bugs to them directly if you can avoid it. Instead, go to someone who may work on that security team to find out what the right way of processing this information would be. Because what I've seen sometimes is hackers in the community don't know the right way of communicating that they found something.

Joseph Carson:

Right.

Chloe Messdaghi:

So they'll publicly tweet something, or they'll publicly post something on Facebook, or LinkedIn, tagging the company, saying, "I found this vulnerability. Please contact me," as a way to get them to get in touch with them, because they don't have vulnerability disclosure policies.

Joseph Carson:

Yeah.

Chloe Messdaghi:

And if they don't have a VDP and you don't have the email address of who to contact, yeah, this is how people will do it. It's not the best way of going about it.

The other one is that usually people will DM these companies. But who owns those Twitter accounts, that Facebook account, that LinkedIn account for that company? Marketing. You think marketing is going to do anything with that? No.

Joseph Carson:

Probably a PR company as well.

Chloe Messdaghi:

Yeah.

Joseph Carson:

That might be third party. It might not even be the same people.

Chloe Messdaghi:

Right. And it could be.

Joseph Carson:

Yeah.

Chloe Messdaghi:

So then they're like, "Oh, we have a situation here." They escalate it to a legal team or PR team. And what do you think they're going to do? They're going to find any way to squash you. So I mean, it's such a maze, and I honestly believe that every single company at this time should have a vulnerability disclosure policy that is very easy to understand, not one where you're signing away your life and your rights because it's 50 pages long. Just straightforward: "This is what's in scope. Everything that's not here is out of scope. For third parties, contact the third party, don't contact us about it."

Joseph Carson:

And the thing... So one of the challenges I've got is all the NDAs I've had to sign in my past. I mean, I've signed away 10 years of my life in NDAs, and there are also lifetime NDAs. I think the lifetime NDAs are usually 20- to 25-year terms. But when you sign NDAs, you really have to understand what you're getting yourself into and the restrictions, especially when you're getting into vulnerability disclosure, or you're revealing a vulnerability or CVEs and so forth.

When you're getting into NDAs, you want to make sure what the boundaries are, what the limitations are, what the lengths and terms are. Because when you're getting into a lot of legal talk, it's always important to understand what it means in plain English, in real language, because a lot of lawyers will have odd interpretations when they put things in legal frameworks. So when you're getting into bug bounties, doing vulnerability disclosures, and doing ethical hacking, those NDAs have to be simplified as much as possible. And make sure you always have some additional oversight into what you can do going forward.

I've seen NDAs in bug bounties which mean that you cannot go and work for, or do penetration testing against, similar companies, for competitive reasons. Why are those restrictions in place? So I think it's also really important to understand what you're getting into with NDAs, and to simplify them as much as possible. You don't want to get into the situation I was in with 10-year NDAs.

Chloe Messdaghi:

Yeah.

Joseph Carson:

Ideally, they should last until the vulnerability is disclosed, and then the NDA is done. I've also had some longer restrictions, because even though the organization I discovered a vulnerability for has resolved it, there are other companies using the same technology and the same hardware that have yet to resolve it. So you get into situations where the terms go longer than the contract you're working with directly, just because other major organizations may not have fixed the vulnerability themselves. They may take much longer. So it's also really important to understand the terms. I probably wish I knew more when I started off doing this and had gotten better NDAs in place. Just because of the length of time, I wish they were shorter, to be honest.

Chloe Messdaghi:

Yeah, I think most of the time, it's about 90 days. You cannot-

Joseph Carson:

Correct. From a vulnerability disclosure, yeah.

Chloe Messdaghi:

Yeah.

Joseph Carson:

Once you discover to making it public. But depending on the severity-

Chloe Messdaghi:

But there are cases where they don't.

Joseph Carson:

Yeah. Depending on the severity... For me, my background was in backup and recovery and patch management. That's where I spent a lot of my early days in my career: patching systems, things like the Stuxnet vulnerabilities and the SMB EternalBlue patches. All of that was my background in understanding what the vulnerabilities are, the patch management processes, updating and securing systems, and then the backup and recovery process. So I do understand that not every vulnerability is equal. I've worked with maritime companies and critical infrastructure, from power stations to water supply, and you just can't patch those systems easily.

So depending on the severity of the vulnerability, and also the time it takes to update and patch those environments, you have to take those into account. 90 days is the ballpark figure. That's where ideally you want to be. But depending on the environment and infrastructure and how hard it is to patch those systems, you do have to take that into account. I've seen variations from 90 days to even several years, just because it's not just a software problem, it's also a hardware problem. Replacing the hardware in a power station can take some time, and taking a ship out of operations has very significant financial costs. So I do understand that, from my side. I'm not just working in software; sometimes there's a lot of critical infrastructure. So you do have to be flexible on the criticality, the impact, and the difficulty of patching in those scenarios as well.

Chloe Messdaghi:

Yeah, exactly. Well, we first just need to convince most of the Fortune 200 and 500 companies at this time, because the majority of them still don't have vulnerability disclosure policies. So until then, we have a problem.

Joseph Carson:

Yeah.

Chloe Messdaghi:

Because we have to get that fixed, and then we can at least have some sort of protection. I mean, there's a reason why, I think the statistic is, one out of four security researchers will not report a vulnerability they find if there's no vulnerability disclosure policy. So if you think about it that way, just think of all the vulnerabilities out there being used at this time by attackers, because companies are scared to have vulnerability disclosure policies. They're like, "Well, we're opening a door for hackers to hack on us," or something like that.

Joseph Carson:

Yeah.

Chloe Messdaghi:

Which is so ridiculous, because what we're doing in the first place is when we go to your website, we're already looking at the vulnerabilities that you already have. I mean, you can either see it, or you don't see it. If anything though, how I see it is that it's better to have something set up where you set the game, the rules of, "You want to participate-

Joseph Carson:

Absolutely.

Chloe Messdaghi:

... here you go, this is how you do it." Because it's going to help you when it comes to risk, security and PR purposes, but also because everyone's on the same page of what's okay, what's not okay. We don't have that. It's going to be a nightmare for you.

Joseph Carson:

Yeah.

Chloe Messdaghi:

Yeah.

Joseph Carson:

You also open it up to the right people, the ones you want doing it. You also set the regions, the boundaries, and the financial cost and impact. So you're absolutely right, Chloe. I think that's one of the most important things here: for organizations to have their own disclosure program and their own pen testing rules, to be able to select and define the scope, and to make sure they're protecting themselves at the same time. And it's also a willingness to work with the hacking community, which I think is always important, because I always see the media portray hacking in a negative sense.

Chloe Messdaghi:

Yeah.

Joseph Carson:

And I think that's something where we don't do ourselves any justice either. When we go to these big expos, we use fear to promote security solutions. I always think we have to change that into showing how we enable, how we empower, and how we are the safety. We're the seatbelt. We save lives. That's what we should be promoting, the positive side of things, versus, "If you don't wear a seatbelt, you're going to die."

Chloe Messdaghi:

Yeah.

Joseph Carson:

Versus, "If you wear a seatbelt, we're going to save your life." I think the narrative there is so important, and we have to get better at that. But this really is important to understand: it gives companies the opportunity to work with the community in a positive way.

Chloe Messdaghi:

Yeah.

Joseph Carson:

Most hackers are good hackers. They're using their skills for good. They're here in a positive way. They want to make a positive impact. At the same time, they want to make money. They have to put food on the table, they have to pay their bills, and they want to live life. So of course, at the end of the day, you want to be paid.

I remember, I'll never forget, it was years ago, maybe about eight years ago, before all the bug bounties and pen testing became something you got paid for or could get as a service. I was at a big event, and we had the hackers on one side and law enforcement on the other side. And I always remember, after we had the discussion, chatting in the speaker room with law enforcement and so forth, I was like, "I wish we had big corporations at the same table on this panel, because that would've been so much fun. I would love to see a panel that has law enforcement, hackers, and corporations all around the one panel, having a discussion on how they can work better together."

Because ultimately, what I remember from the panel session, we were talking about when you find something: how do you disclose it without all of a sudden turning a target on yourself? And law enforcement was like, "Well, do it anonymously." And you're like, "What? You spent six months researching, pulling things apart, understanding how it works. You find something that has a big impact, and now you don't want to be rewarded for it?" All hackers want is to get acknowledged for their knowledge, intelligence, hard work, and the effort they put in, and it's a lot of personal cost they put into it. You want to be rewarded for that, whether it's financial or just recognition in the industry. And law enforcement was like, "Oh, but you're in this gray area. The corporations could come back and target you, because you're abusing their user license agreement. You're abusing the way the software's intended to be used. You're abusing the acceptable use policy they put on their website."

So it's really important to make sure that the intentions and motives are there, the right motives. Sometimes the methods and techniques used may not comply with the policies of the organization that's the target, but it's important to make sure there's a level playing field.

And I wish I could do that panel again today. I would love to come back.

Chloe Messdaghi:

Oh, they're-

Joseph Carson:

It was so much fun to discuss. For the audience in the room, being a fly on the wall, I think that was probably one of the funnest sessions, seeing the communication and the challenges you face when you're doing this, and how important it is to make sure there are rules in place, that you understand the rules, and that you adhere to the rules. So you're absolutely right. Bug bounties and disclosure, it's having a front door. I would say having a front door to the hacking community is better than having no door at all.

Chloe Messdaghi:

Yeah.

Joseph Carson:

Because it allows you to have at least a discussion.

Chloe Messdaghi:

Yeah.

Joseph Carson:

And have a way of moving forward together.

Chloe Messdaghi:

I just think, you're never going to have an attacker knock on your door and be like, "Hey, I found a vulnerability here." When they find that vulnerability, they're going to go running with it. That is what they do. So if you have any hackers that reach out like, "Hey, I think I found something here," that's someone who's actually doing you a favor, not coming after you. Because if they were coming after you, you would know.

Joseph Carson:

Yeah.

Chloe Messdaghi:

There's a huge difference on how they communicate. But-

Joseph Carson:

Absolutely. The motive is very different, and intentional. When a malicious attacker finds a vulnerability, they're going to look to gain access, steal your data, and financially benefit from it, whether through business email compromise, invoice fraud, stealing data, malware, blackmail.

Chloe Messdaghi:

Blackmail, IP stealing.

Joseph Carson:

Yeah.

Chloe Messdaghi:

All that stuff. So it's everything. So the only message you're going to get from them is like, "Hey, pay me."

Joseph Carson:

Yes.

Chloe Messdaghi:

That's when you know you have an attacker.

Joseph Carson:

And it's not 90 days.

Chloe Messdaghi:

No, it's not 90 days. And they will publicly state whatever they want. They can do whatever they want, because there's a reason why they're doing it. It's a different intention. It's an intention of getting money for whatever reason, or it's literally a blackmail situation. Or it can be that they're a nation-state actor trying to get into your records for intel. There are so many different reasons why people come after you.

Joseph Carson:

Absolutely.

Chloe Messdaghi:

But at the end of the day, that's why we need to build those bridges.

Joseph Carson:

Yeah. So one of my methods, when I got into engagements in the early years, was that I had my own NDAs that I would prefer to get people to sign, so at least I had a framework for negotiation. I always made sure that everything I used was sanitized and cleaned. Reusing equipment is always a danger; you might carry data over incorrectly. So make sure your lab environment is very sanitized.

I have a set of virtual environments that I use as a base. But every single time, I set up a new baseline, a new set of sanitized disks. I take multiple backups of my starting point, and I hash those into a blockchain, so no one can come back later and say, "You changed or modified it." So that's just my starting point. Then I set up a whole new password manager, a new vault just for that engagement, for all the credentials you're going to use and all the accounts you're going to set up.
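The baseline-hashing step described above can be sketched like this (file names are hypothetical, and anchoring the digests in a blockchain or timestamping service would happen after this point):

```python
import hashlib
import json

def sha256_file(path, chunk_size=65536):
    """Stream the file in chunks so large disk images don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

def baseline_manifest(paths):
    """Map each baseline artifact to its digest; this manifest is what gets anchored."""
    return {p: sha256_file(p) for p in paths}

# e.g. record json.dumps(baseline_manifest(["disk0.img"]), indent=2) before the
# engagement, then re-hash afterwards to prove the starting point never changed.
```

Re-running the same manifest after the engagement and comparing digests is what lets you show a third party that the baseline images were never modified.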

It's so important to keep very distinct instances for each engagement you're working on. A lot of pen testers work with multiple companies at a time, and a lot of hackers are doing multiple research projects at a time. So always make sure you keep those segmented and clean, on different networks and so forth.

My backup process is ridiculous. I take multiple backups, and I store them in different locations, just in case there's ever any physical damage. So my backup process is just ridiculous, to make sure that if anything does happen, at least I've got everything intact, with integrity. Then there's the password process. This is really important for ethical hackers: what you don't want to do is take your pen test and create a vulnerability by creating a persistent backdoor with a weak credential. You want to make sure that, as you're going through this process, you're maintaining at least a high level of security, especially when you're getting access, and also wherever you're working across.

I remember situations where I was doing a penetration test, and I was overseeing the red team, so I was the recon. A lot of my role in recent times has been doing reconnaissance. I do the planning. I look at the organization and gather everything about the organizational structure, what technology they're using, what software, looking at the job descriptions to see what they're hiring for, what operating systems they use, what their security team looks like. The security team job descriptions are great, because they show you what defenses are in place, so that gives you an idea of what you need to bypass. So for me, it's building that reconnaissance, and then the red team goes and does the actual pen test. I remember situations where I had to stop an engagement because of cross-border rules. I was in a different country at the time, and all of a sudden, I had to disengage. I could communicate with the team, but we weren't allowed to do data sharing, because we would then be working across a border.

I remember a situation as well with a large shipping company. This was a fun one. I was doing the review of the pen test, after the pen test was done. Also, it's good to have somebody overseeing your work, another third party providing oversight. The real purpose is to call bullshit, to make sure that everything's been done properly.

And I remember this situation with a very large shipping company, where the location of the ship where they were going to do the penetration test was not where they thought it was going to be. It was originally meant to be the Mediterranean Sea. The team was all prepared, they had all their stuff, and they were just waiting for the name of the ship and the go-ahead. And all of a sudden, the location changed. It was actually in the South China Sea, which meant that in order to get onto the vessel, they had to go through China. And they looked at all of their equipment and were just like, "We can't take half of this stuff with us because of export control compliance." They could not get through customs in China with half the stuff they were planning to bring.

So it's so important to understand the laws and the rules, especially when you're traveling to different countries. You might be working with a company that's headquartered in the US, but all of a sudden, the scope is in a different region, and you have to understand the rules there. So make sure you also have somebody who can give you oversight, a second set of eyes on everything you're doing, who can call it out when you might be getting into a gray area.

Because that's the difference between us and the malicious attackers: we try our best to adhere to the law. We make sure that we always stay on the right side. But even when we make mistakes, it's important to understand, and I think this goes back to it, it's the motives. Our motive is good intentions. To your point, when we're contacting organizations, it's to help them and their customers, not to profit from their mistakes or misjudgments.

Chloe Messdaghi:

Yeah. I mean, one of the cases that always comes to mind also is the Coalfire one.

Joseph Carson:

Oh.

Chloe Messdaghi:

Because with Coalfire, you need to know the state-

Joseph Carson:

Absolutely. I loved-

Chloe Messdaghi:

... the county, the city laws. These are things you're going to have to know too. So depending on who is the person who's enforcing these things, you've got to know.

Joseph Carson:

That's one of my favorite sessions of all time. I never get tired of listening to them in that talk about how they got arrested and everything. And you have to be careful as well.

Chloe Messdaghi:

Yeah.

Joseph Carson:

Because law enforcement are not the... Let's say-

Chloe Messdaghi:

They're just following orders, or they're doing their own orders.

Joseph Carson:

It can be scary.

Chloe Messdaghi:

It's really scary.

Joseph Carson:

Yeah.

Chloe Messdaghi:

And these are people that... So it was basically two employees at Coalfire who participated in an engagement that Coalfire was hired to do-

Joseph Carson:

Correct.

Chloe Messdaghi:

... which was to do penetration testing at a facility that happened to be a courthouse.

Joseph Carson:

Correct.

Chloe Messdaghi:

So the thing is, even though they had the paperwork showing they were doing something legal, that they were hired to do this, it did not give them full protection, because it depends on who's coming to enforce that law.

Joseph Carson:

Yeah.

Chloe Messdaghi:

So if you have someone enforcing the law at the state, county, or city level, but your paperwork is for a different level, and they don't match, you're going to have a hard time.

Joseph Carson:

Yeah.

Chloe Messdaghi:

So even when you're hired by a company to be a security researcher, you still have to worry about your own safety and your own security concerns. So I always say to people, "You're not going to know all the answers, but make sure the paperwork from the company that hires you to do these things has protection for you, and make sure you also have an attorney for yourself in case one of these cases does end up becoming a situation."

Joseph Carson:

Absolutely. And with the Coalfire one specifically, it was the courthouse, because the building itself was leased; it wasn't owned by the courthouse, it was owned by the local county, and it was the county police, I think, that turned up and arrested them. So they actually went to jail. Of course, after going back and forth, eventually they did get out. But in those moments, it can be scary.

I even remember John Strand's talks about going into facilities, because the physical security side can be very dangerous, especially when you're coming up against third-party hired armed guards at facilities as well. So you can get into situations that are very, very tricky and high risk to life if you just make the wrong mistake. So I think it's really important to understand, absolutely, what your liabilities are and where you're protected.

And that's why, as I mentioned earlier when we talked about the scope of IP address ranges on-premise and in the cloud, the same thing applies when you get into physical pen testing: you might be dealing with an organization that's leasing the building but doesn't own it. And therefore, the legal boundaries do change, and it comes down to the contracts.

I remember a similar thing going back; one of the fun pen tests was with a ship management company. It was one of the first ones where, on the red team I had, there was a person doing their Master's course, and their thesis was on RFID and Wi-Fi hacking. I was actually helping mentor the person, and one of the things they were doing was drone hacking, looking at the risks of commercial drones. And for the ship management company engagement, he built this prototype out of a bunch of Raspberry Pis with SDR antennas, and it was just a radio catcher. You ever see the guy, similar to the guy you see at DEF CON, the Cactus?

Chloe Messdaghi:

Yeah.

Joseph Carson:

Mr. Cactus. I can't remember his... John? I always forget his name. But the Mr. Cactus guy, who has all the pineapples and catches all the radio frequencies. It was very similar, only with Raspberry Pis, and it caught all the frequencies. At that point in time, we found a vulnerability in a smart LED light bulb that they were using. But we also got into a challenge during that pen test. The problem was, again, that they didn't own the building. It was actually another company's; they were leasing it. And those challenges are always unique. Luckily, there were no legal issues or laws broken, because the contracts were properly done. But you have to be careful, because you might get into situations where there are multiple parties and you've only got a contract with one. Always make sure you've got some type of protection there.

Chloe Messdaghi:

Exactly.

Joseph Carson:

So it's so important, I think... I mean, I can't emphasize it enough. I don't know about the US in detail; I do know that each state differs, and I know there are the federal laws, the Computer Fraud and Abuse Act.

Chloe Messdaghi:

Yeah.

Joseph Carson:

But some states might have different interpretations of those.

Chloe Messdaghi:

Yes.

Joseph Carson:

So all of a sudden, you have to understand, when you're crossing state lines, what you're covered for. I think the problem is, we really need to make sure, from a government perspective, that there's some type of protection in place, especially when the motive is good. I always look at some of the great ethical hackers out there who just stepped over the line, the likes of Chris Roberts and the old United Airlines thing.

Chloe Messdaghi:

Yes.

Joseph Carson:

You get into Marcus Hutchins, where, okay, he did a bad thing, but also a good thing. So does the good outweigh the bad thing that you did?

Chloe Messdaghi:

Yeah. Aaron Schwartz.

Joseph Carson:

Aaron Swartz.

Chloe Messdaghi:

That was... I mean, that's the whole thing, making the case for good intention.

Joseph Carson:

Yeah.

Chloe Messdaghi:

I mean, I think that's always the hardest thing, because you can never be in someone's head. So you never really know if they're putting on a show, or if they were actually doing something for malicious or good reasons. And until we have some way to figure that out, it's always kind of a legal game of, "Is this person masking, or are they being sincere?"

Joseph Carson:

Yeah.

Chloe Messdaghi:

So I think that's how it is at the end of the day: knowing that it's a human situation, a human problem, but it is something we can do better on if we start working together.

Joseph Carson:

Absolutely. And it's getting more difficult now as well. I remember a couple of them a few years ago; the one at the power station was tricky. So the issue was, and this goes for any type of pen test today, you have to really understand the contracts they have. Because when you're buying things today, even if you buy a TV, you bought the hardware. You install it in your office, but you might have agreed to a service, and that TV is now under a different service contract. So you own the physical hardware, but the software and the data being generated are under a completely different contract.

And this gets into a situation, I remember doing the pen test at the power station, where the actual engines themselves were owned by the power station, but the data being generated was not. So doing a penetration test on the physical engine itself, you would actually get into a situation where you're now having to deal with a third party as well. And this applies to cars today, even smart homes, security systems, TVs, projectors, any sensors. You might think you own it because you own the physical hardware itself, but the contract you signed when you got it actually says you're handing that over, and it's part of a service agreement, not part of the actual hardware purchase.

And it's so important to make sure, when you're getting into these situations today, that you actually read the difference between whether you're looking at just the hardware or there's an existing service agreement. Because with TVs, you've got EULAs. If you're using a TV as initial access, because it's got DNC enabled on the TV, you have to check and make sure what service agreement you actually have in place for that TV.

So it gets so tricky today, especially with hardware versus the service level agreements, especially with manufacturers. So I think it's a tricky, challenging area, especially with that data going across countries. So yeah, it's always important. That's one thing I've seen: the contracts are getting much more complicated, and there's no very clear line on who owns what, especially when you get into pen testing.

Chloe Messdaghi:

Yeah. That whole terms and conditions thing, like, "Okay, let's go read that 50-page Apple thing where basically they own my life or my soul."

Joseph Carson:

That's... Yeah.

Chloe Messdaghi:

I mean, that's the reality of it. We don't really look at it, we just sign and move on, because we don't have time to read 50 pages.

Joseph Carson:

Yeah.

Chloe Messdaghi:

I mean, that's the reality.

Joseph Carson:

I love the EULA one where they actually had, at the end of the EULA, that if you called this number, you'd get, I think it was like a thousand bucks or something.

Chloe Messdaghi:

Oh, wow.

Joseph Carson:

I thought that was brilliant, because no one ever called the number, because no one ever read the EULA right to the end.

Chloe Messdaghi:

And that's where we are in society. It's like, "Oh, if everyone else is signing it, okay, we're good." But the reality is that none of us doing this episode are actually attorneys, by the way, everyone, so please don't take-

Joseph Carson:

I'm not.

Chloe Messdaghi:

... ourselves as legal advice.

Joseph Carson:

Yeah.

Chloe Messdaghi:

I mean-

Joseph Carson:

This is not legal advice. This is just to make sure that you know what to look for.

Chloe Messdaghi:

What you should do is read these things before you sign, and also make sure you know someone who can be a contract lawyer who can review your stuff for you.

Joseph Carson:

Absolutely.

Chloe Messdaghi:

I mean, contracts are written for other attorneys, not for the average person. So just note that. It might be good to know someone like that. And then if you're looking into, "Well, how do I even go about creating vulnerability disclosure policies for my company?" I do recommend checking out disclose.io. It's a great place to get policies that you can copy, paste, and adapt as your own, but also have your legal team review ahead of time. So that's a good way to get started.

I mean, honestly, at the end of the day, we have to have bridges in this community. Otherwise, things are going to stay insecure.

Joseph Carson:

Absolutely.

Chloe Messdaghi:

And why not just use people from all different walks of life to find the vulnerabilities that your security team can't. And that's okay, that's the reality. Diverse eyes are going to find something different.

Joseph Carson:

Absolutely.

Chloe Messdaghi:

That's why we need to have bridges, to keep each other safe.

Joseph Carson:

Absolutely. Because you'll never be an expert in everything. In your organization, what your security team is meant to understand is your business and how your business functions. They're not going to be security experts in everything. And that's why it's important to surround yourself... That's what I do: I surround myself with really intelligent people with expertise in areas I don't know, and I go to them and ask for help. And that's what the bug bounty community is all about. That's what disclosure programs are all about: making sure you get the people with specialized skills in those areas who can help you do the right thing and secure things as best you can. So it's all about working together.

And I think that's one of the important things: we have to build those bridges. We have to have the community working together, and we have to show that the motive is there. And organizations need to make sure they have a front door. Maybe you don't know how, and you don't want to do it alone; there are other organizations you've seen create these bug bounty programs, so reach out to them and go see how it works for them.

And if you're a hacker and you also want to learn, there are a lot of bug bounty platforms out there that can give you mentoring on how to do it right. And there are great mentors out there as well. I think Stok has some amazing videos, which we've mentioned in an episode before, about doing bug bounty programs and communities and how he got into it. So definitely look for the people who've done it, and get advice from them if you're a hacker.

If you're an organization and you want to make sure you have a good front door and set the rules yourself, then look for other organizations that have good programs in place.

I think that's how we bridge the gaps. That's how we all make the world a safer place: by making sure we have a good way to do this. And I think what's important here, Chloe, foremost, is that it's fundamentally a way to communicate and get on the same page. It's all about communication. That's ultimately what the bug bounty program is there for: to create a means of communication.

Chloe Messdaghi:

And boundaries.

Joseph Carson:

Yes, absolutely. What's-

Chloe Messdaghi:

Boundaries.

Joseph Carson:

... okay and what's not okay. And definitely, that's how we control it, and that's how we make sure it becomes regulated.

Chloe Messdaghi:

Yeah.

Joseph Carson:

And I would definitely love a law to provide some protection. I think the EFF is also great... They're there to provide some protection, so it's always important. I think they're doing a great job out there as well, so I always make sure we give them a shout out.

Chloe Messdaghi:

Yeah, exactly. And for anyone who is on a marketing team or PR team, if you are contacted by a friendly hacker, I highly recommend not bringing it to the legal team and saying, "We've got to shut this down." Instead, bring it to the security team for them to follow up and see whether this is actually a potential vulnerability or not. I mean, that's-

Joseph Carson:

Absolutely.

Chloe Messdaghi:

... the only way we're going to learn is by making sure everyone is aware of security and good security practices. And one of those is for marketing, whoever owns your social media, to be able to follow through. And same with sales emails: if you get anything from a hacker or security researcher reaching out saying, "Hey, I found this vulnerability," forward that to the security team. Don't forward it to PR or marketing. Just forward it to security, and they'll take care of it.

Joseph Carson:

Yeah.

Chloe Messdaghi:

If it's going to be a legal issue, they will give it to legal. Trust me, they know this very well, when they need to get legal involved.

Joseph Carson:

Absolutely. I mean, this is so important. I can't emphasize it enough: make sure you understand what you're getting into on the legal side. Make sure you have a mentor who can help you understand what's okay and what's not. Because even when I started off in my career, I was lucky that I surrounded myself with people who were doing this, and I didn't try it alone. And they definitely made sure that I stayed on the right side of it. They made sure I wasn't making mistakes. Because making one mistake can be devastating.

I mean, I think the difference here... I just watched a great documentary today which probably gives you a realization. I watched Wingmen, a documentary about people who fly wingsuits through the sky. It's about three friends. And it was all about how, when you make one mistake in such an extreme sport, you die. And that's the difference about-

Chloe Messdaghi:

Oh, geez.

Joseph Carson:

If you make one mistake, you die.

Chloe Messdaghi:

Game over.

Joseph Carson:

It's game over. So they have no room for mistakes. In our industry as well, if you make one mistake, you could get yourself into a legal situation where you could go to jail.

Chloe Messdaghi:

Yeah.

Joseph Carson:

And that's the difference: it could be life-changing, not just for your career, but for your life as well. So always make sure, when you go into these situations, that you understand the scope, and you triple check it. You triple check everything.

Chloe Messdaghi:

And assume you can get caught at all times.

Joseph Carson:

Yes. Assume.

Chloe Messdaghi:

Always assume-

Joseph Carson:

Yes.

Chloe Messdaghi:

... you can get caught at all times.

Joseph Carson:

Even-

Chloe Messdaghi:

You're not going to outsmart people, you're going to get caught. Just think that way, and you'll do good in the end.

Joseph Carson:

Yeah. I've got some interesting stories from some of my relatives who do the physical side. When I hear their stories... We usually have a glass of wine and just talk about it. It's really funny, but scary at the same time. When I listen to the Coalfire and John Strand talks about how close they came, it's scary. I'm glad I'm not doing the physical security side. But always assume you're going to get caught, and always make sure you've got the get-out-of-jail-free card.

Chloe Messdaghi:

Yeah.

Joseph Carson:

And that's your protection and your rights, that's you fundamentally... And your get-out-of-jail-free card is how much preparation and homework you do before you actually start the work. So make sure you do a thorough, detailed check on everything. I always say, don't just double check, triple check everything.

Chloe Messdaghi:

Yeah.

Joseph Carson:

That will definitely-

Chloe Messdaghi:

And be kind.

Joseph Carson:

Yes.

Chloe Messdaghi:

Be kind when you communicate with them.

Joseph Carson:

And humble.

Chloe Messdaghi:

Don't set it up as, "You're going to pay me now." Submitting a bug doesn't mean you're going to get paid out. You might get kudos, you might get an offer.

Joseph Carson:

You might get a high five.

Chloe Messdaghi:

Might get a high five, but don't assume you're going to get paid for finding a vulnerability. So just be kind, be nice, and you'll be okay. The thing is, if your communication comes off as if you're demanding something, or you're too aggressive, that's going to set you up for a bad situation down the road. So be overly sweet and kind, and you'll be okay. But always keep a backup of everything.

Joseph Carson:

A backup of everything. Absolutely. And I think those are very wise words to end on, because, yeah, how you communicate is so critical. So Chloe, it's been awesome. I think for all the hackers out there, and the organizations, and law enforcement: we need to work together more-

Chloe Messdaghi:

Yes.

Joseph Carson:

... as a community. Because we all have the same motives, we all have the same intention to make the world a safer place. So let's make sure that hackers are able to use their skills for good, help your organizations, help the world, and have a good understanding of hacker rights. We probably need to get good visibility into... We've got a lot of good codes of conduct, but I think we need to communicate this more and make sure that we help.

Chloe Messdaghi:

The last thing you want is all the security researchers in the world to go on strike for rights.

Joseph Carson:

Oh my goodness.

Chloe Messdaghi:

Because then we're going to have a very scary place during that strike. So why not just make it a good safe place for everyone and just start building those bridges and building those bridges even stronger so they last for a lifetime.

Joseph Carson:

Absolutely. Chloe, it's been awesome talking with you on this topic.

Chloe Messdaghi:

Likewise. Thanks.

Joseph Carson:

I really enjoyed it. It brings back a lot of memories of the different things. Some things I can talk about, and some things, like the NDA stuff, you always have to be careful with. But it's been awesome, and I think the audience is going to get a lot of value from this session and the podcast episode today. So thank you again. It's been a pleasure.

And for the world out there, this is another episode of the 401 Access Denied podcast with both your co-hosts today, Chloe and Joe. It's been awesome. Stay safe, hack legally, have fun, work together, build bridges, all important messages, and tune in every two weeks for the 401 Access Denied podcast. Subscribe wherever you find the subscription button to make sure you get all the latest episodes and stay up to date. Take care, stay safe. And again, thank you.