Insider threat indicators that reduce your risk
“Not every insider risk becomes an insider threat, but every insider threat started as an insider risk.” -- Gartner
Gartner’s take on the difference between insider risks and insider threats highlights an important nuance that drives the cybersecurity strategies and solutions you should apply to safeguard your organization.
Unlike “insider threat,” the term “insider risk” doesn’t imply malicious intent or blame the user. It incorporates the common scenario of a well-intentioned person making a mistake. In fact, the Ponemon Institute reports that more than 50% of insider incidents are attributed to errors and carelessness, such as system misconfigurations and unauthorized or accidental data disclosure.
When you consider that insider incidents have risen 44% over the past two years, with costs per incident up more than a third to $15.38 million, understanding indicators for both insider threats and insider risks is essential to reducing the potential impact for your business.
Whether you’re scoping an enterprise risk management program or reinforcing employee awareness training, now is a great time for a deep dive into insider behavior. Read on to learn which insider behaviors increase your risk and which common insider threat indicators to watch for. With this information, you can immediately implement effective strategies to reduce risk and act on those threat indicators.
What are insider threat indicators?
Insider threat indicators are suspicious behavioral patterns or unauthorized activities that identify a person or other entity within an organization as a potential security risk. These risks typically increase when people such as employees, interns, contractors, suppliers, partners, and vendors have privileged access to an organization's data and systems.
What are common insider threat indicators to look for?
A recent Delinea survey provides insights into suspicious credential activity that could indicate insider threats. We asked IT and business decision-makers at 300 companies, “What, if anything, are the best ways to help you detect suspicious credential activity?”
Their top answers were:
| Rank | Indicator | % of respondents |
|---|---|---|
| 1 | Unusual access times outside of normal business hours | 53.62% |
| 2 | Unusual logon activity, such as multiple sessions accessing credentials | 50.99% |
| 3 | Unknown locations accessing resources | 43.09% |
| 4 | Failed logon attempts with passwords or MFA | 40.79% |
| 5 | Unknown devices accessing resources | 37.50% |
| 6 | Impossible travel indicators | 25.99% |
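Two of the indicators above, unusual access times and impossible travel, lend themselves to simple rule-based checks. The sketch below is a minimal illustration, not a production detector: the business-hours window and the 900 km/h travel-speed cutoff are assumed thresholds you would tune to your own environment.

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

BUSINESS_HOURS = range(8, 18)  # assumed policy window: 08:00-17:59

def outside_business_hours(ts: datetime) -> bool:
    """Flag a logon that occurs outside the assumed business-hours window."""
    return ts.hour not in BUSINESS_HOURS

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def impossible_travel(prev, curr, max_speed_kmh=900):
    """Two logons imply 'impossible travel' when the speed needed to move
    between them exceeds roughly what a commercial flight covers."""
    hours = (curr["time"] - prev["time"]).total_seconds() / 3600
    if hours <= 0:
        return True  # simultaneous logons from two places are suspicious
    distance = haversine_km(prev["lat"], prev["lon"], curr["lat"], curr["lon"])
    return distance / hours > max_speed_kmh
```

In practice these rules feed an alerting pipeline rather than blocking logons outright, since shift workers and VPN exit nodes generate legitimate exceptions.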
What are the differences between malicious and unintentional insider behavior?
Understanding the differences between malicious insiders and those who make unintentional mistakes is crucial for recognizing indicators of insider threats and developing effective security measures to prevent and contain them.
Here are the key differences to consider:
- Malicious insiders are typically motivated by personal or financial gain. For example, disgruntled employees may seek revenge on an organization they feel harmed by. Some are motivated to expose wrongdoings.
- Unintentional insiders, by contrast, are primarily driven by pressure to do more with less, especially in fast-paced industries with budget constraints. They simply want to get their job done.
- Malicious insiders actively seek to shut down or slow down systems through actions like injecting malware. They also engage in actions such as financial fraud or selling sensitive corporate information and intellectual property on the black market.
- In contrast, unintentional insiders often access or share information that they don’t realize should be off-limits and protected by compliance and privacy rules.
- Malicious insiders can cause significant damage to an organization, especially if their intent is to prevent the business from operating.
- Unintentional insider actions may not result in downtime but do commonly expose data, which may result in compliance fines and insurance costs.
Example 1: Indicator of malicious insider threat
- Phineas is in his last round of interviews at another company.
- In his current role, he has access to confidential information that will come in handy at his next job.
- Phineas downloads data late at night and exports it to a USB drive.
- This type of data theft caused by departing employees happens in 56% of organizations, according to research from DTEX.
Q: What insider threat indicator would have alerted Phineas' current employer?
A: Unusual access times and large data downloads.
Example 2: Indicator of unintentional insider threat
- Skye is a third-party vendor with privileged access to her client’s servers.
- She gets a new computer and inadvertently downloads a virus while setting it up.
- Then, Skye logs on to her client’s system—and the virus transfers to them.
Q: What insider threat indicator would have alerted her client?
A: Unknown devices accessing privileged resources.
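The unknown-device indicator from Skye’s example can be approximated with a per-user registry of enrolled device fingerprints. This is a minimal sketch with illustrative names; real deployments source fingerprints from an MDM platform or a certificate store rather than a hard-coded dictionary.

```python
# Known-device registry keyed by user; the fingerprints here are
# placeholders, not a real fingerprinting scheme.
known_devices = {
    "skye": {"laptop-2021-fingerprint"},
}

def flag_unknown_device(user: str, fingerprint: str) -> bool:
    """Return True when a logon arrives from a device the user has
    never enrolled, which should trigger step-up authentication."""
    return fingerprint not in known_devices.get(user, set())
```

Had Skye’s client required enrollment before granting privileged access, her new, infected computer would have been challenged at logon.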
Many unintentional insider behaviors put organizations at risk
Common workplace behaviors and processes expose your organization to insider risk, such as:
- Sending information to personal email or cloud storage accounts to work remotely, which increases the risk of accidental data leakage.
- Clicking suspicious links in phishing emails or downloading unapproved applications.
- Poor password management, leading to stolen credentials.
- Poor access management, such as broad, standing access, which exposes sensitive, protected data to people who don’t need to see it.
- Misconfigured IT systems, such as S3 buckets left open to unauthorized users.
- Trust and assumptions. We want to assume people are doing the right things, so we don’t investigate when something seems fishy.
- Over-dependency on all-powerful domain admins. These technical experts have inside knowledge of IT systems and the opportunity to cause serious damage to your organization. They know the back doors that provide entry. While most admins are trustworthy, upstanding people, those who are malicious are incredibly dangerous. Admins with ill intent know how to cover their tracks thanks to their in-depth knowledge of common insider threat detection techniques.
How do you lower the risk of insider threats?
Now that you know the indicators of insider threats, how can you lower your risk?
Traditional threat hunting and eradication methods designed for external actors aren’t appropriate to prevent, identify, and contain insider threats and lower insider risk. Rather, effective detection of insider threats requires a combination of technical tools, behavioral analysis, and employee awareness.
Lowering the risk of insider threat is a three-pronged approach.
1. Deter people from becoming an insider threat
Empower people to get their work done productively and securely, without putting the onus for following complex security processes on their shoulders.
Remove broad, standing access, including local administrative access on workstations. That way, automated, policy-based controls step in to prevent a user from seeing or exposing sensitive data, changing system settings, or introducing malware into your IT environment.
Remember those all-powerful admins? You can reduce the risk of accidental as well as intentional insider abuse by limiting their powers as much as possible. Ensure admins operate with limited privileges unless they need to elevate them, and then only on a limited basis.
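The limited, on-demand elevation described above is often implemented as a time-boxed grant. The following is a minimal sketch of that idea under simplified assumptions; real privileged access management tools add approval workflows, vaulting, and session recording on top.

```python
from datetime import datetime, timedelta

# Active elevation grants: user -> expiry time (illustrative in-memory store).
grants = {}

def elevate(user: str, minutes: int = 30, now: datetime = None):
    """Grant temporary elevated privileges that expire automatically."""
    now = now or datetime.utcnow()
    grants[user] = now + timedelta(minutes=minutes)

def is_elevated(user: str, now: datetime = None) -> bool:
    """Check whether a user's elevation grant is still within its window."""
    now = now or datetime.utcnow()
    expiry = grants.get(user)
    return expiry is not None and now < expiry
```

The key property is that privilege decays on its own: an admin who forgets to de-elevate, or an attacker who hijacks the account later, finds the grant already expired.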
2. Detect insider threat indicators with automated tools
UEBA solutions use machine learning to detect anomalies and outliers in user and entity behavior, helping identify potential insider threats. Data Loss Prevention (DLP) systems can alert administrators to unauthorized data downloads and transfers. Privileged behavior monitoring can add more intensive surveillance mechanisms for high-risk environments and users.
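At their core, UEBA anomaly detectors compare current behavior against each user’s own baseline. The toy heuristic below flags a value that sits more than a few standard deviations from a user’s history; it is a simplified stand-in for the machine-learning models commercial UEBA products actually use, and the threshold is an assumption.

```python
from statistics import mean, stdev

def zscore_outlier(history, current, threshold=3.0):
    """Flag `current` (e.g., today's MB downloaded) when it deviates more
    than `threshold` standard deviations from the user's own baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold
```

Baselining per user matters: 2 GB of downloads may be routine for a data engineer and a glaring outlier for someone in HR.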
Know the common times when insider risk increases, for example, when the company has a large layoff, a new partnership is formed, or new workplace tools are integrated. At those times, it’s a good idea to increase your monitoring of insider threat indicators and add requirements for authentication and access.
3. Disrupt insider threats
Consider software that implements active blocking technology, preventing specific data types, fields, or files from leaving the organization. Few companies rely on fully automated responses such as immediately revoking employee access to critical systems. Rather, they prefer to investigate and confirm the reason for unusual insider behavior, provide warnings, and increase security controls.
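The active-blocking idea can be illustrated with content inspection at the egress point. The patterns below are deliberately naive stand-ins for the validated detectors a real DLP engine uses; the field names are illustrative.

```python
import re

# Illustrative patterns for two sensitive field types. Production DLP
# engines use validated detectors (checksums, context), not bare regexes.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def block_outbound(payload: str):
    """Return the sensitive field types found in an outbound payload.
    An empty list means the transfer may proceed; otherwise it is held
    for review rather than silently dropped."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(payload)]
```

Holding flagged transfers for review, rather than deleting them, matches the investigate-and-confirm posture described above.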
Who is responsible for identifying potential insider threat indicators?
Simply put: Everyone is responsible!
Knowing insider threat indicators and knowing how to report them are important aspects of cyber awareness.
At the very least, you should have monitoring and alerting mechanisms in place, based on the common indicators of insider threats and a behavior-based risk model.
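A behavior-based risk model can be as simple as a weighted score over the indicators from the survey earlier in this post. The weights and threshold below are illustrative assumptions, not a model from Delinea or any vendor; the point is that several weak signals together should trigger an alert even when no single one would.

```python
# Illustrative weights per indicator; tune these to your environment.
INDICATOR_WEIGHTS = {
    "unusual_access_time": 3,
    "unusual_logon_activity": 3,
    "unknown_location": 2,
    "failed_logons": 2,
    "unknown_device": 2,
    "impossible_travel": 4,
}

def risk_score(observed):
    """Sum the weights of all indicators observed for a user/session."""
    return sum(INDICATOR_WEIGHTS[i] for i in observed)

def should_alert(observed, threshold=5):
    """Alert when the combined score crosses an assumed threshold."""
    return risk_score(observed) >= threshold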
In addition, your security operations team might establish an insider threat-hunting program to actively seek out suspicious activities.
Good cybersecurity awareness training allows all staff members to identify and report suspicious behavior. Encourage a culture of trust and reporting among employees so they feel safe to do so. Anonymous reporting mechanisms can provide valuable insights into potential indicators of insider threats.
Insider risk shows no sign of slowing. Remote work, systems integration, and dependence on service partners and other third parties in extended supply chains are bound to increase the potential for both accidental insider behavior and malicious insider threats.
Organizations spent an average of $15.38 million in 2021 to deal with insider threats—up 34% from the $11.45 million in 2020. Cyber leaders need to do all they can to understand and stay on top of the indicators outlined here as cybersecurity budgets get tighter.
This is the time to implement the strategies discussed in this blog so your organization is prepared.