Cyber-Breaches and Other Threats Involving Conscious Opponents

Jose Tabuena | June 30, 2015

Since my last column we’ve learned about another string of high-profile cyber-attacks. Breaches have happened at the Internal Revenue Service, the Office of Personnel Management, and even in American baseball. Any organization that uses technology and has people (and it’s hard to imagine one that does not) faces increased risk.

So what should boards and their organizations do?

The starting point remains the same: Assess cyber-security as you would any other risk that becomes part of the risk appetite discussion. Cyber-security is still a business issue, rather than a technology one. As with other significant risks, eradicating cyber-threats completely is not possible. But your organization needs to understand the risk and decide how much of it you are willing to accept and how much you will try to mitigate.

In most respects the main differences with cyber-security risk are (1) the specific expertise that is needed, and (2) that assessment and action plans can involve different types of technology. But you need to consider another factor that makes cyber-security risk different: You have a conscious opponent you are trying to thwart.

You cannot say that about all risks you face. For example, some risks are catastrophic in nature: global pandemics, a meltdown at a nuclear power plant, earthquakes, tsunamis. They happen rarely, but they bring extremely serious consequences when they do. Managing catastrophic risks requires a mix of continued prevention and response planning should catastrophe strike.

Conscious Opponents

As Malcolm Sparrow of the Kennedy School of Government has labeled them, some risks involve "conscious opponents": a brain lies behind the harm. Cyber-risks involve a conscious opponent and lead to a continuous duel with those seeking to circumvent your controls. The conscious opponent makes deliberate adaptations purposefully designed to defeat the effectiveness of preventive controls.

The key to effective controls for conscious opponents is to understand their thinking and motivation. The existence of a conscious opponent should influence the tools and methods you select to control your risk. The task of getting into an opponent’s head shifts the emphasis to active intelligence gathering, an exercise that goes beyond the range of basic process controls.

How do you figure out your opponent’s thinking? A few ideas:

  • Establish networks of contacts among peer organizations (including competitors) to facilitate the exchange of information about new (and often rapidly emerging) patterns of behavior, techniques, and players on the scene.
  • Try data mining that employs an improving range of pattern recognition and anomaly detection on large datasets with the timely examination and resolution of red flags and false alarms.
  • Use focus groups in key operational areas to solicit input from the ground (staff, customers, business partners) on vulnerabilities and exposures that may not be readily apparent.
  • Form tiger teams within the company to test the adequacy of existing controls—by putting themselves in the shoes of opponents, can the team devise novel approaches and breach the security in place?
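To make the data-mining bullet concrete, here is a minimal sketch of the anomaly-detection idea: flag days whose activity deviates sharply from the norm and route them for human review. The login counts and the two-standard-deviation threshold are illustrative assumptions, not a recommended tuning.

```python
import statistics

def flag_anomalies(daily_logins, threshold=2.0):
    """Return indices of days whose login counts deviate more than
    `threshold` standard deviations from the mean -- candidates for
    timely examination and resolution as red flags or false alarms."""
    mean = statistics.mean(daily_logins)
    stdev = statistics.stdev(daily_logins)
    if stdev == 0:
        return []
    return [i for i, count in enumerate(daily_logins)
            if abs(count - mean) / stdev > threshold]

# Hypothetical daily login counts; day 6 spikes well above the norm.
counts = [102, 98, 110, 105, 99, 101, 450, 103]
print(flag_anomalies(counts))  # -> [6]
```

In practice the same pattern applies to any large dataset (network flows, badge swipes, transaction volumes); the value comes from resolving each flag quickly rather than from the statistics themselves.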

Depending on the company’s circumstances and exposure to the risk, you can even borrow techniques from law enforcement and competitive intelligence in developing methods to thwart a conscious opponent. These can range from the less intrusive such as monitoring market conditions to those that involve direct contact with perpetrators like surveillance and undercover operations.

Application to Cyber-security

Cyber-security risk can be broken down into three categories: human error, system vulnerabilities, and direct attacks by a conscious opponent. The first two apply to all businesses and can be addressed by traditional administrative, physical, and technical controls, including policy changes, training, and sophisticated technology.

The allegations that St. Louis Cardinals personnel hacked into the Houston Astros’ computer network illustrate that the basics are still important. Investigators believe that Cardinals employees examined a master list of passwords used by a former colleague who now works for the Astros, and then used those passwords to gain access to the Astros’ network. Clearly a lesson learned is that passwords need to be strong, periodically changed, and never reused across organizations.
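Strength and rotation rules like these can be enforced in code as a basic hygiene control. The following is an illustrative sketch only; the 12-character minimum and 90-day rotation window are assumptions, not a cited policy.

```python
import re
from datetime import date, timedelta

MAX_AGE = timedelta(days=90)  # illustrative rotation period, not a standard

def password_ok(password: str) -> bool:
    """Basic strength check: minimum length plus a mix of character classes."""
    checks = [
        len(password) >= 12,                  # assumed minimum length
        re.search(r"[a-z]", password),        # lowercase letter
        re.search(r"[A-Z]", password),        # uppercase letter
        re.search(r"\d", password),           # digit
        re.search(r"[^\w\s]", password),      # punctuation/symbol
    ]
    return all(checks)

def password_expired(last_changed: date, today: date) -> bool:
    """Flag passwords older than the rotation window for forced change."""
    return today - last_changed > MAX_AGE

print(password_ok("Tr0ub4dor&3!"))   # True
print(password_ok("password123"))    # False
```

Rules like these would not by themselves have stopped the reuse described above; that requires policy and awareness, not just a validator.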

Sharing Is Good

With conscious opponents, organizations are realizing the need for enhanced focus on detection. Sharing threat information before, during, and after an attack with a trusted group of peers is one way to instill readiness. Information sharing within business sectors is particularly advantageous, because the organizations often face similar threats. By sharing cyber-threat information, organizations can gain valuable insights about their adversaries, such as the types of systems and information being targeted, the techniques used to gain access, and indicators of compromise. This information can be used to prioritize defensive strategies including patching vulnerabilities, configuration changes, and focused monitoring capabilities.
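As an illustration of how shared indicators of compromise might be consumed, here is a minimal sketch that checks log entries against a hypothetical peer-shared blocklist. The IP addresses are reserved documentation addresses (TEST-NET ranges), not real indicators.

```python
# Hypothetical indicators shared by a trusted peer group.
shared_iocs = {"203.0.113.7", "198.51.100.23"}  # known-bad source IPs

def match_indicators(log_entries, bad_ips):
    """Return log entries whose source IP appears in the shared
    indicator set -- candidates for focused monitoring and response."""
    return [entry for entry in log_entries if entry["src_ip"] in bad_ips]

logs = [
    {"src_ip": "10.0.0.5", "action": "login"},
    {"src_ip": "203.0.113.7", "action": "file_download"},
]
print(match_indicators(logs, shared_iocs))
```

Real sharing programs exchange far richer data (hashes, domains, attacker techniques) in structured formats, but the principle is the same: one organization’s incident becomes another’s detection rule.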

Organizations rarely speak publicly about the mechanics of how they were breached, or what they were (or weren’t) doing to protect themselves. While laws require disclosure of certain information about breaches—how many records were breached, or what types of data were stolen—much of this does little to help us understand how breaches happen or how to defend against them.

Sharing operational information outside an organization does present concerns about protecting proprietary information from competitors. The National Institute of Standards and Technology (NIST) prepared a “Guide to Cyber-Threat Information Sharing” that provides key practices to consider when planning, implementing, and maintaining information-sharing relationships. The guidance references the NIST “Framework for Improving Critical Infrastructure Cyber-security,” which was developed in response to President Obama’s executive order on reducing cyber-security risks.

The guide examines the benefits and challenges of coordinating and sharing, presents the strengths and weaknesses of a variety of information-sharing models, explores the importance of trust, and addresses specific data handling considerations. It provides a collection of scenarios that demonstrate the value of information sharing by describing real-world applications of threat intelligence sharing and coordinated incident response.

A good example of the value of information sharing is the recent IRS breach. At a U.S. Senate hearing, how the breach happened was disclosed and reviewed. The thieves were able to download taxpayer information from the IRS “Get Transcript” application (intended for use by taxpayers to access their own records) by submitting personal information about taxpayers, including their Social Security numbers, birth dates, filing status, and addresses. At the Senate hearing, it was observed that these pieces of information were “obtained from sources outside the IRS.” So one lesson is that data breaches can spawn further breaches. Poorly protected pieces of information may be stepping stones for accessing more sensitive data.

Yet the hearing seemed more focused on why the IRS had failed to prevent the breach. It was noted that the IRS had not implemented 44 recommendations made in previous security audits. Keep in mind that these security weaknesses, while important, were not the specific cause of the actual breach. The breach didn’t occur because the IRS systems weren’t patched or because servers weren’t being monitored. Instead, the hack’s success appears to have hinged on allowing users to authenticate their identities with just a few pieces of personal information—pieces of information not too difficult for bad guys to obtain.

One lesson learned is that companies should do a better job with authentication. Research on Big Data has shown that scattered bits of personal information can be pieced together into a fairly compelling facsimile of a person’s identity. A noteworthy item: As a result of the personnel files hack at the Office of Personnel Management, the government’s chief information officer is requiring agencies to start using multifactor authentication, a common security feature that has been around for years.
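The multifactor step mentioned above is commonly implemented with time-based one-time passwords (TOTP, standardized in RFC 6238). As a sketch of how a second factor works in principle (not a description of any agency’s implementation), using only the Python standard library:

```python
import hmac, hashlib, struct

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant).
    Server and authenticator app share `secret` and independently
    compute the same short-lived code."""
    counter = struct.pack(">Q", timestamp // step)      # 30-second windows
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret and test time (T = 59 seconds).
secret = b"12345678901234567890"
print(totp(secret, 59))  # -> "287082"
```

Because the code changes every 30 seconds, a stolen password (or a dossier of personal details) alone is no longer enough to log in; the attacker also needs the shared secret or the physical device holding it.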

Moreover, the bigger lesson is that the risk conversation needs to incorporate more strategies and methods to address conscious opponents. Individual incidents need to be examined and shared to provide opportunities to learn specific, tangible lessons about security so that opponents’ next move can be anticipated.

Another consideration is how open organizations (and especially governments) should be about their purposes and methods. Public constituents and company shareholders do need to know what risks the organization is addressing and how. But with conscious opponents there is a need to be selective about what is shared openly; the analytic techniques used for fraud detection, for example, if widely known could enable the opposition to determine the parameters and fly under the radar. With conscious opponents, control effectiveness may require less transparency, artful unpredictability, and an air of mystery.

Ultimately the board of directors and auditors need to recognize that this new environment requires a change in focus. Different skills and advisers may be needed depending on your environment, and you may have to decide if you need to build internal resources or tap outside expertise. For the distinct risk of fraud, professional auditing standards already require consideration of the use of specialists due to the unique characteristics and challenges of the risk. Audit teams and compliance units now need to evaluate if their organization’s available cyber-security expertise is sufficient, or whether it’s time to add a cyber-security specialist.