TEC 2017: Importance of Reliable Data
Workiva’s TEC 2017 conference opened on Tuesday with a group of CFOs fretting about data—and, when you think about it, who can blame them?
Companies are drowning in data. CFOs, CEOs, and other senior leaders spend far more time than they should simply confirming the authenticity and accuracy of it. The morning session even featured a video of a mock CFO panicking just before an audit committee meeting, asking her staff, “Where did this data come from?” (Spoiler alert: Wdesk saved the day.)
All that time spent confirming or correcting data is time not spent analyzing it. And the volume of data companies generate will only keep growing. No wonder taming it is CFOs’ top priority.
The amount of data in a modern enterprise “has changed dramatically,” said Steve Klei, Audit Committee Chair of Demandbase and former CFO of Kabam, during the morning panel discussion. “It’s one of my biggest concerns.”
That’s why having a single source of truth for data is so critical, according to Ian Charles, CFO of Host Analytics. Even amid all the data complexity, at least senior executives can focus their attention on what the numbers mean—not why the deferred revenue number on Slide 6 of the PowerPoint deck isn’t the same as the deferred revenue number on Slide 17. Discrepancies like that only build mistrust in financial reporting processes and the financial executives who oversee them.
“Alleviating that risk from the start, by having a single source of truth, is the most important thing a CFO can do,” Charles said.
Charles is not exaggerating. Without a single source of truth, the explosion of data in the modern corporate enterprise will only trigger another explosion of manual data checking. That undermines the whole premise of self-service reporting. If you invest in better reporting and analytics without first ensuring the data is sound, you’re just squandering resources on compliance and audit IT projects that won’t deliver.
SOX compliance professionals should already grasp this point—because, really, it’s all about completeness and accuracy of reporting. Auditors grill their clients about completeness and accuracy because they’re paid not to trust the data handed to them; once you can demonstrate completeness and accuracy, they can move on.
“Single source of truth” is just another way of saying “completeness and accuracy” for internal reporting. We have too much data to know where it all comes from, and we don’t trust what it tells us—usually thanks to a history of flawed, manual processes. If you establish that single source of truth, then you can work on building versatile reports. That, in turn, leads to better analysis. Which is why companies are making all this investment in the first place.
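To make the idea concrete, here is a minimal, hypothetical sketch (not any Wdesk or Workiva API) of what a single-source-of-truth check amounts to: every downstream artifact is reconciled against one canonical figure, and any mismatch, like the deferred revenue discrepancy above, is flagged automatically instead of being caught by an executive mid-meeting. The metric names and numbers are invented for illustration.

```python
# Hypothetical reconciliation check against a single source of truth.
# Every downstream report should match the canonical figure exactly.
canonical = {"deferred_revenue": 1_250_000}

reports = {
    "slide_6": {"deferred_revenue": 1_250_000},
    "slide_17": {"deferred_revenue": 1_245_000},  # stale copy-paste
}

def find_discrepancies(canonical, reports):
    """Return (report, metric, reported, canonical) for every mismatch."""
    issues = []
    for name, figures in reports.items():
        for metric, value in figures.items():
            if metric in canonical and value != canonical[metric]:
                issues.append((name, metric, value, canonical[metric]))
    return issues

for report, metric, got, expected in find_discrepancies(canonical, reports):
    print(f"{report}: {metric} is {got:,}, source of truth says {expected:,}")
```

The point of the sketch is simply that once one canonical value exists, verifying completeness and accuracy becomes a mechanical comparison rather than a manual hunt.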
I can’t help but think of an especially timely example: the CEO Pay Ratio Rule. That rule (part of the Dodd-Frank Act) requires companies to calculate the ratio of the median annual total compensation of all employees to the annual total compensation of the CEO. Corporations hate that idea and had long hoped the Securities and Exchange Commission would never actually implement it.
Well, too bad. Just last week, a top SEC official confirmed that the Pay Ratio Rule will go into effect with 2018 proxy statements.
Calculating that ratio will require an unholy mess of data from payroll, HR, and operating units. The data itself will shift constantly, as total number of employees and total compensation change. Accuracy and completeness of the data will be crucial.
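The arithmetic itself is trivial; the hard part is assembling complete, accurate compensation data in the first place. A sketch of the calculation, with made-up numbers (not from any real filing, and ignoring the rule’s detailed methodology for identifying the median employee):

```python
from statistics import median

def ceo_pay_ratio(employee_comp, ceo_comp):
    """CEO-to-median-employee pay ratio.

    employee_comp: annual total compensation for each employee
                   (excluding the CEO), however the company sources it.
    ceo_comp: the CEO's annual total compensation.
    """
    return ceo_comp / median(employee_comp)

# Illustrative numbers only: median employee comp is 55,000 here.
comp = [42_000, 55_000, 61_000, 48_000, 75_000]
print(ceo_pay_ratio(comp, 5_500_000))  # 100.0
```

One line of code, in other words, and months of data work behind it: if payroll, HR, and the operating units each hold a different version of `employee_comp`, the ratio is wrong before the division ever happens.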
And that’s just one telling example of the reporting challenges to come. Corporations will have many more (indeed, they already do). The technology to extract that data swiftly and precisely does exist. But as computer programmers have said for years: garbage in, garbage out.
(Today’s item is cross-posted from the Workiva blog. You can view the original post there. Look for more dispatches from the 2017 Wdesk user conference all week!)