Even as attacks on corporate networks become more prevalent, insider threats continue to pose the biggest data breach risk for companies in all industries and across all geographies. To better manage and prevent this risk exposure, corporate leaders still have much to learn both from corporate data breaches of the past and from those that have developed best-in-class insider-threat programs.

Findings from several recent surveys highlight the extent of the insider-threat landscape. Kroll’s Global Fraud and Risk Report, for example, found that the biggest internal threats are current, former, or temporary employees.

In that report, 79 percent of 555 senior executives worldwide across multiple industries and geographies identified perpetrators as being:

Internal senior or middle management employees;

Internal junior employees;

Former employees; or

Temporary employees.

“Insider threats can take many forms, from unhappy employees with malicious intent to careless workers who inadvertently install malware, or even third parties that don’t follow security policies,” Kevin Jacobsen, executive director of EY’s Fraud Investigation & Dispute Services (FIDS) practice, said during a recent webcast on insider threats.

EY’s latest “Global Information Security Survey” further highlighted the type of insider-threat vulnerabilities that have most increased companies’ risk exposure over the last year. The top vulnerabilities, cited by the 1,735 executives polled, are careless employees; unauthorized access; and outdated information security controls or architecture.

Kroll’s report cited similar vulnerabilities. Specifically, the most frequently reported attack vector, cited by 26 percent of respondents, was a software vulnerability. The second and third most commonly cited causes of cyber-incidents were “employee error” and “attacks on the corporate website,” each cited by 22 percent of respondents.

The theft of physical assets was the most common type of fraud experienced over the past year, cited by 29 percent of respondents. The next two most common types of fraud cited were vendor, supplier, or procurement fraud (26 percent) and information theft, loss, or attack (24 percent).

In a third survey, the Ponemon Institute’s latest “Cost of a Data Breach Study” found that most data breaches were caused by hackers and criminal insiders, including employees, contractors, or other third parties. Among the 383 participating companies, 48 percent of reported breaches were caused by criminal or malicious attacks, such as malware infections and phishing schemes.


“It is hard to underestimate or understate the level of threat that companies are subject to from outside penetration in the form of hacking and phishing schemes,” says Daniel Karson, chair of Kroll’s Investigations and Disputes practice. “The bad guys only have to be right once.” Companies most vulnerable to an attack are those whose information security systems are not state-of-the-art, or are not up to industry standards, he says.

In the Kroll report, the most frequent type of cyber-incident, cited by 33 percent of executives, was a virus or worm infestation, whereas 26 percent of respondents cited e-mail-based phishing attacks as the second most frequent type of cyber-incident.

It is also important to note that malicious or criminal attacks vary significantly by country, according to the Ponemon Institute data breach study. For example, 60 percent of all breaches in the Arabian region (United Arab Emirates and Saudi Arabia) and 54 percent of all breaches in Canada were due to hackers and criminal insiders.

Among South African companies, only 37 percent of all data breaches were due to malicious attacks, with the highest percentage due to human error. Indian companies, in comparison, were most likely to experience a data breach caused by a system glitch or business process failure (37 percent and 35 percent, respectively).

Case study: Lockheed Martin

All of these insider-threat characteristics combined—the type of perpetrator, the form of attack, and the region of a data breach—should be taken into consideration when implementing a state-of-the-art insider-threat program, given that each demands different response tactics.

During the EY webcast, Doug Thomas, director of counter-intelligence at aerospace giant Lockheed Martin, shared the arduous journey that Lockheed took to implement its state-of-the-art insider-threat program. That started with getting buy-in from the senior leadership team, including the chief executive officer, chief operating officer, the executive vice president, and senior vice presidents—a process that was easier said than done.

Developing an insider-threat program that was sound from a legal and regulatory standpoint was the easy part; what was difficult was aligning it with Lockheed’s corporate values, Thomas explained. “Just because you can do something doesn’t mean it’s the right thing to do,” he said. “You have to tailor your program to your culture.”

Although Lockheed’s senior leaders were more than willing to embrace counter-intelligence tools to spot and mitigate external threats to the company, some were not as comfortable with the idea of monitoring human behavior, which was the biggest sticking point.

KEY STEPS FOR BUILDING AN INSIDER THREAT PROGRAM

Below are some key steps for building an insider-threat program from EY’s “Managing insider threat through the lens of a seasoned investigator” webcast.
Gain senior leadership endorsement, develop policies that have buy-in from key stakeholders and take into account organizational culture;
Develop repeatable processes to achieve consistency in how insider threats are monitored and mitigated;
Use analytics to strengthen the program backbone, but remember implementing an analytical platform does not create an insider threat detection program in and of itself;
Coordinate with legal counsel early and often to address privacy, data protection and cross-border data transfer concerns;
Screen employees and vendors regularly, especially personnel who hold high-risk positions or have access to critical assets;
Implement clearly defined consequence management processes so that all incidents are handled following consistent standards, involving the right stakeholders;
Create a training curriculum to generate awareness about insider threats and their related risks;
Leverage information security and corporate security programs, coupled with information governance, to identify and understand critical assets.
Source: EY FIDS.

Many companies employ technology that monitors only anomalous online activity or behavior, such as downloading sensitive company information at a higher volume than other employees. “If you have a data loss prevention tool, and you think that’s your insider-threat tool, you’re mistaken,” Thomas said. “That’s only half the solution to the problem.”

If you’re truly going to have a robust insider-threat program, Thomas explained, you have to also understand the human behavior element. For Lockheed, the idea was to monitor every aspect of employee behavior.

To overcome any doubts, Lockheed created an insider-threat advisory review committee, made up of human resources, compliance, legal, privacy, ethics, and information security. This committee was tasked with writing a “Concept of Operations,” describing what the insider-threat program is and is not, Thomas explained. “This is a team sport,” he added. “Where you house this doesn’t really matter.”

“While we were building this Concept of Operations, I can’t tell you the amount of conversations that went into privacy and the importance of the communication campaign,” he said. Absolute transparency in the purpose and objective of the program is paramount.

“We don’t profile people; we profile behavior,” Thomas said. “We have a human behavior and digital behavior baseline of everybody in the company. You’re looking for anomalous behavior.”
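The baselining approach Thomas describes — comparing current activity against each employee’s own historical norm and flagging sharp deviations — can be illustrated with a minimal statistical sketch. This is a hypothetical example, not Lockheed’s implementation; the function name, the choice of download volume as the signal, and the z-score threshold are all assumptions for illustration.

```python
from statistics import mean, stdev

def anomalous_downloads(history_mb, today_mb, z_threshold=3.0):
    """Flag today's download volume if it deviates sharply from this
    employee's own historical baseline (illustrative sketch only)."""
    if len(history_mb) < 2:
        return False  # not enough history to form a baseline
    mu = mean(history_mb)
    sigma = stdev(history_mb)
    if sigma == 0:
        return today_mb > mu  # any increase over a perfectly flat baseline
    # Flag only large deviations above the employee's own norm
    return (today_mb - mu) / sigma > z_threshold

# An employee who normally downloads ~50 MB/day suddenly pulls 900 MB
baseline = [48, 52, 55, 47, 50, 49, 51]
print(anomalous_downloads(baseline, 900))  # flags the spike
print(anomalous_downloads(baseline, 55))   # normal day-to-day variation
```

In a real program this kind of digital signal would be only one input alongside the human-behavior indicators Thomas describes, such as policy violations or changes in work quality.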

For example, if somebody intentionally violates policies, especially IT policies, that could raise a red flag. Personal financial stressors or behaviors in the workplace, including the quality and quantity of the employee’s work, are also signs of human behavior to keep an eye on.

“One person you need to get in front of is first-line supervisors, because they are the ones who are going to know their employees the best to see if there have been any changes or concerning behavior,” Thomas said.

If a supervisor or other employee identifies anomalous activity or behavior, they should have the ability to confidentially or anonymously report the issue to an appropriate stakeholder—ideally, a senior-level executive with the authority to investigate the potential insider threat. To help foster a speak-up culture and encourage people to come forward, however, “we don’t use the word ‘report,’ ” Thomas said. “We’re not encouraging our employees to ‘report,’ because we don’t want to create a culture of snitches.” Instead, he said, “we want employees to be ‘engaged.’ ”

Additionally, Lockheed has in place a “very robust governance structure,” Thomas explained. At the vice-president level is a steering committee that has to approve any changes or enhancements made to the Concept of Operations, and every three months the steering committee is briefed on the program.

Furthermore, because espionage and information theft are high risks for the company, Lockheed’s risk and compliance committee is also briefed every six months. Internal audit is invited each year to audit the program, and the board of directors is briefed every nine months on the program itself.

Another important aspect of establishing a robust insider-threat program is to have clear policies and procedures from the get-go. What does the company consider to be confidential and proprietary information? What are its crown jewels?

In those policies and procedures, “you want to reserve the right to monitor and inspect company systems and devices that you’ve provided to the employee,” Luke Dembosky, a cyber-security partner at law firm Debevoise & Plimpton, said during the webcast. If the company allows employees to use their own devices to access the corporate network, make sure you delineate exactly what inspection and monitoring rights the company maintains, he said.

Companies must also limit employee access to certain systems through network segmentation. “This is where your insider policies and procedures marry up with broader cyber-security defenses,” Dembosky said.
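Dembosky’s point about network segmentation amounts to least privilege: each role is granted access only to the network segments its job function requires, and everything else is denied by default. A minimal sketch of that policy check follows; the role names and segment names are invented for illustration.

```python
# Hypothetical segment map: each role may reach only the network
# segments its job function requires (least privilege).
SEGMENT_ACCESS = {
    "engineering": {"dev-net", "build-servers"},
    "finance":     {"erp-net"},
    "hr":          {"hr-records"},
    "it-admin":    {"dev-net", "build-servers", "erp-net", "hr-records"},
}

def may_access(role: str, segment: str) -> bool:
    """Deny by default; allow only explicitly granted segments."""
    return segment in SEGMENT_ACCESS.get(role, set())

print(may_access("finance", "erp-net"))     # True: within the role's grant
print(may_access("finance", "hr-records"))  # False: outside the role's segments
```

In practice this logic lives in firewalls, VLAN configuration, and identity-and-access-management systems rather than application code, but the deny-by-default principle is the same.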

Finally, training employees is also an important element of an insider-threat program not just to educate employees on how to spot insider threats, but also to remind them about the company’s policies and procedures, Dembosky added. It’s also important to log who showed up, so that if a legal matter were to arise, the company has that documented evidence to show that the employee was aware of the policies and aware of the company’s inspection and monitoring rights, he said.

Looking ahead

In many respects, insider threats and outsider threats are one and the same. Increasingly, malicious outsiders are using internal “spotters” to identify specific targets, server information, and individuals to be hacked — and, more troubling, these people can stay active in the company for a long time without being discovered, Dembosky warned.

As companies and government agencies fortify their networks, “you’re going to see more human-enabled cyber-attacks—and that is your insider,” said Louis Bladel, executive director of EY and former special agent in charge of the Counterintelligence Division of the FBI’s New York Field Office.

All of this is to say that companies must continue to do everything in their power to enhance their cyber-security defenses. With time, the risk will grow only more complex, and the repercussions, more severe, making a resilient insider-threat program more critical than ever before.