How to Combat Insider Threat Using Behavioral Science
Countering cyber threats is not just a technical challenge; it is also a human challenge. A goal for organizations is to deter insider threats before they happen, and understanding human behavior is essential to doing that. Organizations have to deter employees from making bad decisions in the first place, first by letting them know they are being watched and are likely to be caught. That's the detection part. And then by letting employees know the organization cares if they're having a rough time and has programs in place to assist them. That's the mitigation part.
When I was a special agent for several government agencies, I learned a lot about human behavior and how I could use that understanding to solve crimes. Now that I work in cybersecurity, those same skills come in handy when combating insider threats. Cybersecurity professionals who have not been trained in human behavior can take a few classes on the subject, or they can engage behavioral scientists to advise them on defining normal, baseline user behavior, identifying abnormal changes to that baseline, and interpreting what those changes might mean.
One of the main things I was taught as an agent was how to look for what are known as fraud indicators, that is, suspicious behavior. I was trained to recognize deviations from normal behavior, which are called "triggers." Common triggers for insider threat might include an employee's use of removable media, use of printers or copiers far from their office, or logging onto the computer system during hours when they are not assigned to work.
Cybersecurity engineers rely on triggers such as these when designing automated monitoring systems. Unfortunately, many monitoring systems are built around single triggers, which often leads to false alarms. Automated systems need to base their decisions on multiple triggers, which is easier said than done.
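To make the multiple-trigger idea concrete, here is a minimal sketch of a weighted scoring approach, one common way to combine triggers so that no single event fires an alert on its own. The trigger names, weights, and threshold are all hypothetical illustrations, not values from any real monitoring product; a production system would tune them against a measured baseline of normal user behavior.

```python
# Hypothetical trigger weights; a real system would calibrate these
# against each employee's established baseline of normal activity.
TRIGGER_WEIGHTS = {
    "removable_media_use": 2.0,
    "remote_printer_use": 1.0,
    "off_hours_login": 1.5,
}

# Require combined evidence from more than one trigger before alerting.
ALERT_THRESHOLD = 3.0

def score_triggers(observed_triggers):
    """Sum the weights of triggers observed for one employee in a time window."""
    return sum(TRIGGER_WEIGHTS.get(t, 0.0) for t in observed_triggers)

def should_alert(observed_triggers):
    """Alert only when the combined score crosses the threshold, so a single
    event (e.g., one off-hours login) does not generate a false alarm."""
    return score_triggers(observed_triggers) >= ALERT_THRESHOLD

# A lone off-hours login stays below the threshold, while the same login
# combined with removable-media use crosses it.
print(should_alert(["off_hours_login"]))
print(should_alert(["off_hours_login", "removable_media_use"]))
```

The design choice illustrated here is simply that corroborating triggers reinforce each other, which is the behavior the paragraph above calls for; real analytics platforms use far richer models, but the principle is the same.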
For cybersecurity professionals, mitigating insider threats often means repairing the damage done by an insider attack. For behavioral scientists, however, it also means examining the factors within an organization that can increase or decrease the chance of an insider threat occurring in the first place, or of it happening again.
The known triggers of disgruntlement and ego often play a role in motivating insider attacks. For example, to mitigate disgruntlement, organizations can provide employees with avenues to vent concerns and frustrations. To mitigate ego, organizations can implement employee recognition programs that offer more public praise.
Greed, another trigger, which leads employees to such things as selling organizational secrets, can be a further motivation for insider attacks. It is difficult for cybersecurity professionals to deter an employee who carries out an insider attack out of greed, but organizations can certainly address grievances over perceived inequities in compensation, which may have contributed to the incident.
Insider threat detection programs need to include consultation with behavioral scientists when it comes to deterring insider threats. Technology always involves human beings in some way, so you can't tackle a technological challenge without taking human nature into account. And the experts in human nature are behavioral scientists.
Read about using Behavioral Analytics to help identify warning signs in employee behavior at http://www.nationalcybersecurityinstitute.org/hactivism-terrorism-crime-and-espionage/uncover-insider-threats-through-user-behavior-analytics/
Source: MITRE (2012, October). The Human Factor: Using Behavioral Science to Counter Insider Threats. http://www.mitre.org/publications/project-stories/the-human-factor-using-behavioral-science-to-counter-insider-threats