Let's be honest: risk and compliance can sometimes feel like an uphill struggle. Many of you have experienced commercial colleagues ignoring your advice, operations staff trying to work around controls, or senior managers secretly wishing regulatory demands and ethical considerations didn’t exist.
The International Compliance Association (ICA) is a professional membership and awarding body. ICA is the leading global provider of professional, certificated qualifications in anti-money laundering; governance, risk, and compliance; and financial crime prevention. ICA members are recognized globally for their commitment to best compliance practice and an enhanced professional reputation. To find out more, visit the ICA website.

As part of a series on culture change for the International Compliance Association, my aim is to demonstrate how studying human behavior can help alleviate some of the challenges of the compliance profession.

However, I must admit this series is also born of frustration. Let me explain.

Behavioral insights

The tools traditionally deployed by organizations to achieve cultural change are blunt instruments. Performance evaluations, bonuses, change management training, staff surveys, and intranet sites crammed full of policies form the core of most firms’ armories.

These tools have been used for many years, and their operational track record isn’t great. Indeed, they can seem like medieval forms of medicine: well-intentioned but without any grounding in science.

Real cultural change requires an understanding of the drivers of human behavior. And the most effective means of grasping these drivers is through behavioral science.

This is where my frustration comes in: Few organizations consult behavioral science when seeking to shape their internal culture. Why are these well-founded techniques not more widely used?

The British government has employed a “Behavioural Insights Team” for more than a decade. This unit works on how best to implement government policies using insights from behavioral science and has applied its findings in a range of interventions, from improving vaccination take-up and general practitioner cancer referrals to boosting exam results and encouraging green investment. The science offers simple and cost-effective interventions that can dramatically improve outcomes and even save lives.

A practical example—policies

Behavioral insights can help design systems that work with human beings rather than against them.

Consider your firm’s policies: do they say things like “documents containing sensitive personal information must not be saved to a shared drive on our network”? Such a policy sets colleagues up to break the rule. After all, they may not know certain files contain personal data, nor what constitutes “sensitive” personal data. Expecting them also to check the security settings on shared drives is a big ask.

If we do not expect the rule to be complied with, then why do we write it?

What if we designed our systems and processes to mitigate these human risks? If we know our colleagues tend to store personal data in openly shared areas, why not employ automated controls to counteract that behavior? File scans for zip codes, automated document mark-up, email data loss prevention: all these controls exist and have been used successfully. Yet for many of you reading this, I’d bet the only controls in place are an aspirational policy and a training program for the pesky IT users who keep doing this.
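To make the idea concrete, here is a minimal sketch of the first control mentioned above: a scan of a shared drive for patterns that look like personal data. The patterns, function names, and file types are illustrative assumptions on my part; a real data loss prevention tool uses far richer detection (document classification, context, checksums) than two regular expressions.

```python
import re
from pathlib import Path

# Illustrative patterns only -- real DLP tooling is far more sophisticated.
PATTERNS = {
    "us_zip": re.compile(r"\b\d{5}(?:-\d{4})?\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_text(text: str) -> dict:
    """Count matches for each personal-data pattern in a string."""
    return {name: len(p.findall(text)) for name, p in PATTERNS.items()}

def scan_share(root: str) -> list:
    """Flag text files on a shared drive that appear to contain personal data."""
    flagged = []
    for path in Path(root).rglob("*.txt"):  # assumed file type for the sketch
        hits = scan_text(path.read_text(errors="ignore"))
        if any(hits.values()):
            flagged.append((str(path), hits))
    return flagged
```

The point of such a control is that it works with human behavior rather than against it: instead of expecting colleagues to recognize personal data themselves, the system flags likely cases for review automatically.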

Applying the science

Very few risk and compliance teams use behavioral science techniques to influence culture. In the same way compliance with government policy can be improved through informed intervention, so can embedding risk and compliance goals within an organization.

In this series, I aim to challenge assumptions about business that I believe lack a solid scientific foundation. These misconceptions include:

  • Staff behave in a way that is consistent with their expressed attitudes;
  • Staff go through a “change curve” when we redesign our strategy or organization;
  • Mistakes, failures, noncompliance, rule breaches, and process slip-ups are mostly caused by “human error”;
  • Employees are rational adults and will align with our policies and culture as long as we clearly communicate them;
  • Ethical business is a concept we can all buy into and strive toward; and
  • Artificial intelligence (AI) can replicate and replace increasingly complex roles in our business.

Now, you might consider all or some of the above undeniable truisms. But I will argue they have no basis in the psychological literature. If anything, the evidence suggests the opposite:

  • Colleagues will say one thing and do another, especially if you ask them moral questions;
  • Staff often don’t move through a “change curve” from anger to acceptance;
  • Attributing a problem to “human error” is diagnostically lazy;
  • Staff do not behave in a rational manner and “treating them like adults” won’t help;
  • Ethical values are irrational and “communication” won’t solve that; and
  • AI absolutely should not replicate what we do; it must be better than us in important ways.

Listening to psychology

Most organizational goals are noble in intention. Helping our people through traumatic change, embedding ethical values, designing work environments with “human factors,” and modernizing organizations with powerful technology are, without doubt, laudable aims. And it is certainly the case that building a compliant, risk-managed, ethical culture is the right thing to strive toward.

We are, however, pursuing these goals in the wrong way. I aim to demonstrate that, by listening to what psychology tells us about being human, we can create human-focused organizations that achieve those goals. To do that, we must abandon axioms and follow the evidence instead.

The International Compliance Association is a sister company to Compliance Week. Both organizations are under the umbrella of Wilmington plc.