Making Compliance Analytics Work

Compliance officers spend more and more time these days thinking about data — how to collect it, how to study it, how to report it. So while I attended the Global Ethics Summit in New York this week (excellent event), I dropped into a panel discussion about how compliance officers can build more effective, data-driven programs. Here’s the wisdom they shared.

First let’s understand the challenge, because the pressures bearing down on compliance officers are coming from several different directions at once. Boards want more assurance about compliance risks, so they’re asking more questions that are more sophisticated. Regulators are paying more attention to the effectiveness of compliance programs, and want more evidence that your program actually works. And business technology is diversifying, as more operating units use their own cloud-based service providers rather than some big honking ERP business software system doing all things for all departments.

To my thinking, that last pressure about technology is the most significant, because it means how compliance departments consume data depends in great part upon how other parts of the business create data. Once upon a time, those big honking ERP systems generated data in specific formats that you could use to fulfill specific reporting duties.

That’s fading. Now each business function can use its preferred cloud-based software from a third party: HR using Workday, marketing using Salesforce, web developers using JIRA, and so forth. Even compliance functions use their own cloud-based software now, from any number of vendors.

That’s good for the individual business operating unit, but it changes how compliance functions can harness data. You may need to spend more time culling data from other business functions that mostly meets your analytics needs, but doesn’t entirely meet them — so you’ll need to spend more time either harmonizing the data, or convincing employees to change their workflow habits to give you better data from the start.

We talked about that challenge at length earlier this year in a podcast with the head of compliance at AB InBev. I won’t belabor the point here, but compliance officers should understand that it’s an important one if you want to get your analytics right.  

Anyway, that’s the abstract theory of it. Here are more specific examples from the panel at Ethisphere.

From Analytics to Insight

One compliance officer talked about analyzing internal reporting and case management data, and discovering that complaints from female employees were more likely to be unsubstantiated than complaints from male employees.

That was an important insight, because it prodded her to ask new questions about why that might be. Did female employees complain about different issues that are harder to verify? Did managers take their complaints less seriously?
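The substantiation-rate analysis the panelist described could be as simple as grouping case outcomes by reporter demographic. Here's a minimal sketch, with hypothetical field names and made-up sample records, of what that first-pass query might look like:

```python
from collections import defaultdict

# Hypothetical case management records; real data would come from your
# internal reporting system, with whatever fields it actually exposes.
cases = [
    {"reporter_gender": "F", "substantiated": False},
    {"reporter_gender": "F", "substantiated": True},
    {"reporter_gender": "F", "substantiated": False},
    {"reporter_gender": "M", "substantiated": True},
    {"reporter_gender": "M", "substantiated": False},
]

# gender -> [substantiated count, total count]
totals = defaultdict(lambda: [0, 0])
for case in cases:
    tally = totals[case["reporter_gender"]]
    tally[0] += int(case["substantiated"])
    tally[1] += 1

for gender, (hits, n) in sorted(totals.items()):
    print(f"{gender}: {hits}/{n} substantiated ({hits / n:.0%})")
```

A gap between the groups' rates isn't an answer in itself; as the panelist noted, it's the prompt for the qualitative follow-up questions.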

The answers to those questions, the compliance officer said, guided her thoughts about possible new training materials or executive communications that might be necessary. In theory, the answers might even lead a CCO to begin disciplinary action against managers who ignore female employees’ complaints (that’s my example, not the panelist’s).

So this demonstrates how quantitative analysis of data can inform qualitative changes to corporate culture or policy. Bulk analysis driven by technology can identify trends we mere humans might not otherwise see, and prompt us to ask questions we hadn’t previously considered. The answers can pave the way to a better compliance program that’s more responsive to employees’ needs.

Bulk analysis driven by technology can identify trends we mere humans might not otherwise see, and prompt us to ask questions we hadn’t previously considered.

Another speaker gave the example of interactive Codes of Conduct, where employees can poke around and explore issues relevant to them, rather than read a static PDF that serves up the complete Code like a Russian novel.

The benefit here is that a compliance officer can track that interaction. You can see whether employees spend more time on sexual harassment guidance or anti-bribery guidance. You might even build some sort of query function where employees can submit questions to the Code, to see what’s on their mind — including, especially, material that isn’t in the Code.

Then you’re generating usage data about the Code. Collect enough of it, and you can start to assess whether the Code is comprehensive and user-friendly, or ignores risk areas that employees have on their minds. Quantitative analysis leading to qualitative improvement.

One caveat: You should tell employees that their interactions with the Code are being monitored. You may even want to build anonymization safeguards into the Code, so that compliance managers can only see the interaction data in bulk rather than by specific individual. Transparency about the whole process is paramount; if employees ever discover the monitoring some other way, their trust in the compliance function is sunk.
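Both ideas — tracking which sections draw attention, and reporting only bulk figures — can be combined in the aggregation step itself. Here's a minimal sketch, using a hypothetical event format and a made-up minimum cohort size, of rolling raw interaction logs up into section counts and suppressing any count too small to be meaningfully anonymous:

```python
from collections import Counter

# Hypothetical interaction log; a real interactive Code would emit its own
# event schema.
events = [
    {"user_id": "u1", "section": "anti-bribery"},
    {"user_id": "u2", "section": "harassment"},
    {"user_id": "u3", "section": "harassment"},
    {"user_id": "u1", "section": "conflicts-of-interest"},
]

MIN_COHORT = 2  # assumed threshold: drop counts too small to stay anonymous

# Count views per Code section, then keep only sections whose totals are
# large enough that no individual can be singled out.
section_views = Counter(event["section"] for event in events)
report = {section: n for section, n in section_views.items() if n >= MIN_COHORT}
print(report)
```

The point of the threshold is the caveat above: the compliance team sees which topics employees gravitate toward, without ever seeing who.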

Interactive Compliance

One other great idea came from a panelist who voiced a common frustration: other companies calling you to evaluate your compliance program, as they mull whether to work with your company. We are all somebody else’s third party, after all.

This panelist created a database of standard questions other companies ask about his program, along with standard answers that his sales department could provide back to those companies.

Great idea. So how does this CCO measure whether it works? He measures how often the sales prospect returns to ask for more information, which tells him whether his standard answer may be incomplete. Over enough time, with enough follow-up questions and feedback from customers, that might even illuminate areas of the program that aren’t complete — say, a policy it doesn’t have or a training issue it doesn’t typically address.
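The measurement the CCO described boils down to a follow-up rate per standard answer. Here's a minimal sketch, with hypothetical question IDs and a made-up review threshold, of flagging the answers that most often leave prospects asking for more:

```python
from collections import defaultdict

# Hypothetical log kept by the sales team: (question_id, had_follow_up)
interactions = [
    ("data-retention", True),
    ("data-retention", True),
    ("data-retention", False),
    ("training-program", False),
    ("training-program", False),
]

REVIEW_THRESHOLD = 0.5  # assumed cutoff for "this answer may be incomplete"

# question -> [follow-up count, total count]
stats = defaultdict(lambda: [0, 0])
for question, followed_up in interactions:
    stats[question][0] += int(followed_up)
    stats[question][1] += 1

# Rank answers by follow-up rate; high rates suggest the standard answer
# isn't satisfying prospects and deserves a rewrite.
for question, (f, n) in sorted(stats.items(), key=lambda kv: -kv[1][0] / kv[1][1]):
    flag = " <- review this answer" if f / n > REVIEW_THRESHOLD else ""
    print(f"{question}: follow-up rate {f / n:.0%}{flag}")
```

The ranking, not any single data point, is what tells you where the program's gaps are — which is exactly the feedback loop the panelist was after.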

That’s how your compliance program can improve with age. Even better, other employees in the enterprise are the ones telling you where the program should improve, rather than you guessing where the program should improve. That’s an ideal everyone should strive for.