You roll out an online training program focused on anti-money laundering regulations. The data collected by the program indicates that everyone who needed to take the course did so and passed the brief exam at the end.
So, does that mean the training was effective? As online compliance training has become more popular, a growing number of compliance chiefs are asking this very question.
According to NAVEX Global’s 2014 Ethics and Compliance Training Benchmark Report, 71 percent of compliance training programs use online tools. They’re used more frequently than any other method, including live training and print resources.
Given the popularity of online training, compliance and other executives need to know these tools are effective—that the employees who participate gain a solid understanding of the relevant regulations and then apply their knowledge. After all, training of all types can account for a significant portion of a compliance budget, and the courses often take non-compliance employees away from their official responsibilities. “Executives are asking if the training is valuable,” says Ingrid Fredeen, vice president of advisory services with NAVEX Global.
More is at stake than just the dollars, although the costs can add up. An effective compliance training program can earn companies favorable treatment in the event of a compliance lapse. The Federal Sentencing Guidelines state: “The organization shall take reasonable steps (a) to ensure that the organization's compliance and ethics program is followed, including monitoring and auditing to detect criminal conduct; (b) to evaluate periodically the effectiveness of the organization's compliance and ethics program.”
“Just as organizations monitor business operations and make adjustments to boost performance, they need to monitor and adjust their compliance training programs.”
Joan Meyer, Chair, North America Compliance & Investigations Practice, Baker McKenzie
Just as organizations monitor business operations and make adjustments to boost performance, they need to monitor and adjust their compliance training programs, says Joan Meyer, chair of the North American compliance and investigations practice at Baker McKenzie. The goal here, however, is to reduce risk.
More companies are going beyond the old approach of putting out compliance training programs and simply hoping they work. When report participants were asked about training trends they’re currently applying or plan to apply, “measuring training effectiveness” came in second, just below “adding more course titles.” Fredeen says she’s noticed this shift in her conversations with companies. “Over the last 12 to 18 months, I’m hearing more clients talk about effectiveness.”
At the same time, accurately determining effectiveness is tough. Indeed, 96 percent of participants in the report ranked it as a moderate or significant challenge, putting it ahead of even budgetary challenges. “It’s a hard, practical problem,” says David Guralnick, president of the International eLearning Association.
Ideally, compliance professionals would be able to measure how well employees understand the material and how their behavior changes as a result of the training. But in contrast to, say, answers on multiple choice quizzes, behavior changes can’t be easily quantified. Moreover, it’s always risky trying to connect a change in behavior to a specific course or class, since numerous factors influence how individuals act.
As a result, few organizations appear to have mastered this. “I’ve not talked to one person who has it totally figured out,” Fredeen says.
Many organizations start with statistics that are relatively easy to assemble, such as the percentage of employees who completed a training course; nearly three-quarters of survey respondents measure this. Course quality—that is, whether the material is relevant and presented in an engaging way—is another area in which compliance professionals are interested.
Again, this information can be important. Rebecca Herold, an information privacy, security, and compliance consultant and co-owner of HIPAA Compliance Tools, says she’s come across purported courses that were nothing more than hundreds of PowerPoint slides, each containing text excerpted directly from regulations such as HIPAA. “It’s not even training. It’s just dumping information that’s boring to most.” Not surprisingly, few employees ever actually read it, she says.
Organizations need to look deeper than these measures to evaluate participants’ true understanding and application of the material, Herold says.
While more companies are interested in gauging the effects of training, the discipline isn’t entirely new. For decades, some companies have employed what is known as the Kirkpatrick Model. Donald Kirkpatrick developed the model in 1954 for his Ph.D. dissertation, “Evaluating Human Relations Programs for Industrial Foremen and Supervisors.”
Kirkpatrick’s model comprises four levels. As an organization progresses through them, it gains increasingly valuable insight.
Level 1—Reaction: To what degree do participants react favorably to the training?
Level 2—Learning: To what degree do participants acquire the intended knowledge, skills, attitudes, confidence, and commitment based on their participation in a training event?
Level 3—Behavior: To what degree do participants apply what they learned during training when they are back on the job?
Level 4—Results: To what degree do targeted outcomes occur as a result of the training event and subsequent reinforcement?
Here’s how these levels could come into play with online compliance training:
Reaction: Assessing this could start with a survey that asks employees how they felt about the course. Did they like it? Did they find it engaging?
To be sure, any number of variables can influence the responses to such questions. An employee who is unhappy in his or her job may not like the training, no matter how good it is. Still, this information can be useful; participants who can’t wait until the course is over are unlikely to learn much.
An important part of this step is establishing up front a number that indicates some level of success, Fredeen says. For example, if 75 percent of participants say they found the training worthwhile, is that a good number? The idea is to set a baseline and then improve, she adds.
Learning: A combination of quizzes, interviews, or follow-up surveys at several intervals—for example, immediately after the training and then again a few months later—can help evaluate participants’ understanding and retention of the material, as well as highlight areas that require additional explanation. “If 80 percent miss the same question, you need to provide more training on that topic,” Herold says.
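Herold’s rule of thumb can be automated once quiz results are captured per participant. The sketch below is a minimal illustration, not any vendor’s actual reporting tool; the data layout, function name, and 80 percent threshold are all assumptions chosen to mirror her example.

```python
from collections import defaultdict

def flag_weak_topics(responses, threshold=0.8):
    """Return quiz questions that a large share of participants missed.

    responses: one dict per participant, mapping question id -> True if
               answered correctly (a hypothetical export format).
    threshold: flag a question when at least this fraction missed it.
    """
    misses = defaultdict(int)   # question id -> count of wrong answers
    totals = defaultdict(int)   # question id -> count of all answers
    for answers in responses:
        for question, correct in answers.items():
            totals[question] += 1
            if not correct:
                misses[question] += 1
    return sorted(q for q in totals if misses[q] / totals[q] >= threshold)

# Illustrative data: "q3" is missed by 4 of 5 participants (80 percent),
# so it is flagged as a topic needing more training.
results = [
    {"q1": True,  "q2": True,  "q3": False},
    {"q1": True,  "q2": False, "q3": False},
    {"q1": True,  "q2": True,  "q3": False},
    {"q1": False, "q2": True,  "q3": False},
    {"q1": True,  "q2": True,  "q3": True},
]
print(flag_weak_topics(results))  # ['q3']
```

Running the same report after each training interval also supports the retention comparison described above: a question that reappears on the flagged list months later points to material that didn’t stick.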
Again, it’s important to define success ahead of time. What do participants need to learn for training to be deemed effective?
Fredeen also warns that capturing this data comes with responsibility, particularly if participants’ names are included with the results. If it becomes clear that an employee is likely to violate a rule because he or she doesn’t understand it, the organization needs to take action. “Be careful what you collect, because you own it once you have it.”
Behavior: While the goal of effective compliance training is to change behavior, determining if it actually did is a soft science. Still, compliance professionals can take steps that will offer clues. To start, they can ask managers if they notice appropriate changes in behavior, Fredeen says. Are more employees speaking up when they notice something that looks suspicious? Are their determinations of suspicious activities generally accurate? Compliance officers also can look at hotlines and case management data to see if levels of misconduct are declining.
[Graph: Communication Formats Used in Ethics and Compliance Training & Awareness. Companies were asked what form of training they most frequently use. Source: NAVEX Global.]
Observation can also be a powerful tool. Three to nine months after she’s conducted training, Herold often performs walk-through audits of the organization, checking for missteps. For instance, is confidential information left unsecured on employees’ desks after hours?
Scenario-based assessments—asking course participants, “In the situation described here, what would you do?”—can provide an idea of how employees might apply the information they learned, Guralnick says.
The questions should be tailored to the company’s business model. “Discuss real-world risks that occur in the business,” Meyer says. Companies with extensive distribution channels need to cover the risks in working with third parties, while those engaged in government contracting will want to incorporate anti-bribery regulations.
Results: Credibly assessing the results of compliance training typically requires experts who can appropriately control for variables, then identify links between training and the outcomes desired, such as a more ethical corporate culture. Most organizations will need to use a mix of data, as no single statistic will provide all the information needed. Given the challenges, few have truly been able to master this, Fredeen says.
Even so, trying to capture this insight is worthwhile. “If a company takes the time to properly train employees,” Meyer says, “it minimizes risk to a great degree.”