A Better Understanding of Estimates

Last month the PCAOB published fresh guidance on how to audit accounting estimates. Today let’s take a deeper look at what that guidance says, because the principles it recommends can be quite useful for compliance and risk officers too.

To begin with the obvious: Yes, this guidance is written for audit firms that audit corporate financial statements. More specifically, it’s meant to help those firms implement a new standard for auditing estimates that won’t go into effect until 2021.

So what? The point of the guidance isn’t to help an auditor understand whether an estimate is correct. It’s to help audit firms understand the processes a company uses to develop its accounting estimates — and those processes also turn up in lots of other places in the modern enterprise, including plenty that are relevant to compliance officers and risk managers. 

That is, from today into the future, “management” will really be about relying on IT systems that monitor large volumes of corporate transactions. Those systems crunch all that data into the concise summaries executives actually see: the estimates, projections, forecasts, and so forth that guide the decisions senior leaders make.

Before we get to those details, let’s consider two recent examples of accounting estimates gone wrong. In July the SEC hit rent-to-own retailer Conn’s with a $1.1 million penalty for low-balling its allowances for bad debts. Last December the agency hit Hertz with a $16 million penalty for sloppy accounting, where management had fudged estimates on allowances for bad debts and the value of its vehicle fleet.

In both cases, management injected its own judgment into an important accounting estimate rather than relying on data that would have given a more accurate number. That was the misconduct.

So a better understanding of where such numbers come from, the reliability of the systems that generate them, and when management should or should not deviate from what the numbers suggest — that goes a long way toward better governance. That’s why understanding the issues in this PCAOB guidance is worth your time, even if you’re not an auditor.

Principles for Evaluating Estimates

So what does the PCAOB guidance actually say? A few key points.

First, evaluate the data used to generate an estimate. An estimate can come from the company’s own data, data provided by an external source, or a combination of both. Each type raises certain questions. 

For example, if you’re evaluating internally generated data, you want to understand the completeness and accuracy of the data, and to test the controls over that completeness and accuracy. You also want to understand whether the data is sufficiently precise and detailed to give you an estimate that’s useful.

That’s why using Excel spreadsheets for compliance or financial reporting is such a bad idea: ensuring completeness and accuracy across spreadsheets is a mess. The ideal answer is a single repository of data, where a change to any one piece of information automatically flows through to every report that uses it.

This is also why data taxonomies and data validation — that is, devising the correct labels for your data, and then confirming each piece of data is labeled correctly — are so important. Taxonomies and validations get to the point about data being “sufficiently precise and detailed” for the purpose at hand.
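
To make that concrete, here’s a minimal sketch of what a taxonomy plus a validation check might look like. The labels, required fields, and records below are hypothetical, purely for illustration; a real implementation would work against your own data model.

```python
# A minimal sketch of taxonomy-based data validation. The labels, required
# fields, and records below are hypothetical, for illustration only.
ALLOWED_LABELS = {"receivable", "write_off", "recovery"}       # the taxonomy
REQUIRED_FIELDS = {"id", "label", "amount", "recorded_on"}     # completeness check

def validate(record):
    """Return a list of problems found in one data record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if record.get("label") not in ALLOWED_LABELS:
        problems.append(f"unknown label: {record.get('label')!r}")
    if not isinstance(record.get("amount"), (int, float)):
        problems.append("amount is not numeric")
    return problems

records = [
    {"id": 1, "label": "receivable", "amount": 1200.0, "recorded_on": "2019-07-31"},
    {"id": 2, "label": "misc", "amount": "n/a", "recorded_on": "2019-07-31"},
]
for r in records:
    for problem in validate(r):
        print(f"record {r['id']}: {problem}")
```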

When an estimate is derived from external data, auditors spend more time evaluating the relevance and reliability of the data. Auditors have a whole separate standard for evaluating evidence, which we won’t explore here. Compliance officers might want to ask questions such as: If we asked a different external vendor for the same type of information, would that vendor give us different data? (For example, if you’re collecting historical economic data, you should get the same GDP numbers no matter who supplies them.)

The fundamental point about external data is that you need to be able to trust the supplier’s competence, so don’t be squeamish about testing that supplier’s data and models the same way an audit firm might test yours.
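
Here’s a rough sketch of that cross-vendor check, with invented vendor figures standing in for whatever external data you actually buy. The only point is to flag any year where two suppliers of the same metric disagree by more than you’d expect.

```python
# A hedged sketch of a cross-vendor consistency check. The vendors and figures
# are invented for illustration, not official statistics.
vendor_a = {"2016": 18.71, "2017": 19.49, "2018": 20.58}   # e.g. GDP in $ trillions
vendor_b = {"2016": 18.72, "2017": 19.49, "2018": 20.61}

TOLERANCE = 0.05   # how much disagreement we are willing to accept

for year in sorted(vendor_a.keys() & vendor_b.keys()):
    gap = abs(vendor_a[year] - vendor_b[year])
    flag = "investigate before relying on either" if gap > TOLERANCE else "ok"
    print(f"{year}: difference of {gap:.2f} -> {flag}")
```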

And regardless of whether the data you’re using is internal, external, or a combination of both, several other questions are important to ask as you evaluate whether you have the right data for what you want to do. 

First, is the data relevant to what you’re trying to measure? Could you find data that’s more recent, or more precise? For example, if you’re studying completion rates for third-party due diligence, are you relying on data from the last calendar year, or data current through last month?

And second, if the company has changed the source of data for an important metric, do you understand why that’s happened? Sometimes changing to a new source of data might give you a more accurate result. It might also give a more desired result — say, to hit a certain performance goal. Compliance officers should be able to sniff out those ulterior motives and call them into question.

Study Assumptions and Models

The second part of the PCAOB guidance dwells on the models and formulas a company might use to process all that data and arrive at an estimate, and the assumptions behind those models and formulas.

First, identify the significant assumptions a company uses when developing an estimate. Then evaluate whether those assumptions are reasonable. That’s the process here. 

What’s a “significant” assumption? The guidance gives a few examples. Significant assumptions are those that:

  • Are sensitive to variation, so that minor changes in the assumption can cause significant changes in the estimate; 
  • Are susceptible to manipulation and bias; 
  • Involve unobservable data or company adjustments of observable data; or 
  • Depend on the company’s intent and ability to carry out specific courses of action.

None of those criteria automatically mean an estimate is wrong. They only mean an estimate could go wrong in important ways, so watch the estimate closely.
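
One simple way to spot that first criterion, sensitivity to variation, is to nudge an assumption a little and see how far the estimate moves. Here’s a toy sketch using a hypothetical allowance model; all of the numbers are made up.

```python
# A minimal sensitivity check on a toy estimate. The model (allowance =
# receivables x expected loss rate) and all figures are hypothetical.
def estimated_allowance(receivables, loss_rate):
    """Toy model: allowance for bad debts = outstanding receivables * expected loss rate."""
    return receivables * loss_rate

receivables = 50_000_000    # $50 million outstanding, illustrative
base_rate = 0.08            # assumed 8 percent loss rate

base = estimated_allowance(receivables, base_rate)
for shift in (-0.01, 0.01):    # nudge the assumption by one percentage point
    alt = estimated_allowance(receivables, base_rate + shift)
    print(f"loss rate {base_rate + shift:.0%}: allowance of ${alt:,.0f} (vs. ${base:,.0f} at the base rate)")
```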

For example, one infraction in the Hertz enforcement action was management arbitrarily extending the expected lifetime of its vehicle fleet from 12 to 20 months. That change allowed Hertz to cut $15 million from its depreciation costs. 

Well, that assumption was certainly sensitive to variation, since it saved $15 million. It was susceptible to manipulation. It involved company adjustments to difficult-to-observe data. And the change depended on the company’s intent to carry out a specific course of action: earnings manipulation.

None of that means you can’t change the value of long-lived assets if that’s warranted. It only means people should put such changes under the microscope.
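
If you want to see the arithmetic behind that, here’s a back-of-the-envelope sketch of straight-line depreciation using a hypothetical fleet value (not Hertz’s actual figures). The only point is how much the monthly expense falls when the assumed life stretches from 12 to 20 months.

```python
# Back-of-the-envelope straight-line depreciation, showing why a longer assumed
# life lowers the periodic expense. The fleet cost and residual value here are
# invented numbers, not figures from the Hertz case.
fleet_cost = 120_000_000     # hypothetical depreciable fleet value
residual_value = 0

def monthly_depreciation(cost, residual, life_months):
    """Straight-line depreciation expense per month."""
    return (cost - residual) / life_months

for life_months in (12, 20):
    expense = monthly_depreciation(fleet_cost, residual_value, life_months)
    print(f"{life_months}-month life: ${expense:,.0f} of depreciation per month")
# Stretching the assumed life from 12 to 20 months cuts the monthly expense by 40 percent.
```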

And how does one determine whether the assumptions involved in a model or estimate are reasonable? The key point is whether those assumptions are consistent with the company’s risks, business objectives, and regulatory requirements.

As the PCAOB guidance suggests, try to determine whether significant assumptions are consistent with — 

  • The company’s objectives, strategies, and related business risks; 
  • Existing market information; 
  • Historical or recent experience, taking into account changes in conditions and events affecting the company.

Here we could look to the Conn’s enforcement action as an example. In that case, the company was low-balling its allowances for doubtful accounts even as it expanded to new customers whose credit history wasn’t known. Rather than use a “roll rate” to estimate losses based on historical data, Conn’s management used its own judgment.

So that’s an estimate based on assumptions not in line with historical data (about past customer losses) even as business conditions were changing (reaching new, less trustworthy customers). Therefore, Conn’s should have paid more attention to its estimate for allowances. 
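
For readers who haven’t met a roll rate before, here’s a bare-bones sketch of the idea, with hypothetical delinquency buckets and rates rather than anything from the Conn’s case: apply the historical share of each bucket that eventually goes bad to today’s balances, and the allowance falls out of the data.

```python
# A hedged sketch of a simple roll-rate estimate: apply historical loss rates to
# each delinquency bucket. All buckets, balances, and rates are hypothetical,
# not Conn's actual data.
balances = {          # outstanding receivables by delinquency bucket
    "current": 40_000_000,
    "30_days": 5_000_000,
    "60_days": 2_000_000,
    "90_days": 1_000_000,
}
roll_to_loss = {      # historical share of each bucket that eventually charges off
    "current": 0.02,
    "30_days": 0.15,
    "60_days": 0.40,
    "90_days": 0.75,
}

allowance = sum(balances[bucket] * roll_to_loss[bucket] for bucket in balances)
print(f"estimated allowance for doubtful accounts: ${allowance:,.0f}")
# The estimate flows from historical data, not from whatever number management
# would prefer to book.
```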

It didn’t — and we thank the company today for showing us what not to do.
