A member of the Public Company Accounting Oversight Board gave a speech last week about the role of audit firms in cybersecurity — and raised a point people seldom say aloud: the audit firm’s role is a relatively small one.
Kathleen Hamm, a member of the PCAOB since last year, gave her remarks at a financial reporting conference in New York City. Those remarks are worth considering because they remind us that while everyone likes to worry about cybersecurity risk, our current regulatory regime leaves many parties several steps removed from actually reducing it.
Hamm, for example, said four times that audit firms play a “limited” role in cybersecurity. Yes, she said, audit firms should consider cybersecurity issues when conducting a risk assessment — but only to the extent that a company’s cybersecurity weakness might somehow contribute to the risk of material misstatement of financial results.
“The auditor does not broadly evaluate the company’s overall cybersecurity risk or the design and effectiveness of operational and other non-financial controls adopted by the company to mitigate that risk,” Hamm said.
Instead, the auditor’s role in cybersecurity is narrow. For example, if the company suffers a huge data breach and establishes a contingency account to pay for litigation and other costs, the audit firm would inspect controls for that account and related disclosures that the company might make to investors.
But then, an audit firm would do that with any account that’s material to the financial statements. That a cybersecurity breach caused the company to create the account in the first place is (no pun intended) immaterial.
Likewise, the audit firm also examines IT controls related to financial reporting, including controls for the reliability of financial reports’ underlying data. If those controls are somehow compromised, the audit firm will certainly flag it.
But again, the auditor is more concerned about reliable financial reporting than data security itself. Plenty of cybersecurity controls are crucial to business operations and regulatory compliance, but not to financial reporting — say, controls to keep personal health data private. An auditor would not look at those.
Should Audit Firms Do More?
Hamm was cagey on that question. Yes, she said, audit firms should think more expansively in their risk assessments about the many new ways a cybersecurity event might happen. That’s good.
“As companies become more and more digitally linked with their vendors, customers, and employees, the potential entry points and attack surfaces multiply,” she said. An auditor “should be clear-eyed about the risk that attackers can operate under the guise of legitimate users, ultimately accessing a company’s systems or subsystems that support the financial reporting process.”
Time and again, however, Hamm also brought all those potential avenues of risk back to the narrow path of what audit firms are paid to worry about: things that could have a material effect on the financial statements. Most breaches don’t.
Hamm did say auditors should assess the potential costs of a breach: lost revenue from disrupted operations, as well as the expense of forensic investigations, lawsuits, compensation to harmed consumers, and so forth. Those numbers can add up, but I wonder how often they add up to a material amount of money that can be associated with a breach; plus, companies do have insurance policies to cover those losses.
For example, if you want to set a materiality threshold at 3 percent of operating income, then among the S&P 500, that would imply a cost of $95 million. (Average operating income for the S&P 500 last year was $3.16 billion.) Some large breaches do ultimately cost that much money, but those are truly large breaches like we saw at Equifax in 2017, Sony in 2015, or Target in 2014.
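That arithmetic is simple enough to sketch in a few lines of code. The 3 percent threshold and the $3.16 billion average operating income are the figures cited above; the function itself is purely illustrative, not any official materiality formula.

```python
# Back-of-the-envelope materiality check, using the figures cited above.
# The 3 percent threshold and $3.16B average operating income come from
# the text; auditors set actual thresholds case by case.

def materiality_threshold(operating_income: float, pct: float = 0.03) -> float:
    """Return the dollar cost a breach would need to reach to be 'material'."""
    return operating_income * pct

avg_sp500_operating_income = 3.16e9  # $3.16 billion, per the text
threshold = materiality_threshold(avg_sp500_operating_income)
print(f"${threshold / 1e6:.0f} million")  # roughly $95 million
```

At that threshold, only the largest breaches clear the bar, which is the point: most incidents never come close.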
A ransomware attack that blocks access to your data for a day until you cough up $25,000 in bitcoin to some outfit in Ukraine — probably not material. Stolen consumer data, where insurance policies cover the costs of credit monitoring services — probably not material.
Caring About Cybersecurity Controls
In a roundabout way, Hamm’s speech underlines an important point. Our regulatory regime has a blind spot about cybersecurity. It doesn’t require any outside evaluation of the design and effectiveness of data security controls, to see whether those controls adequately address the company’s cybersecurity risks. Companies are left to do this themselves, with mixed success at best.
Yes, compliance, audit, and IT security executives work hard to achieve strong data security; in no way do I mean to say that companies don’t take cybersecurity seriously. They face a blizzard of regulations, from PCI compliance for credit card data to GDPR compliance for personal data, with many more rules in between. Mishandling those compliance obligations can lead to enforcement actions, monetary penalties, and all sorts of bad publicity. Everyone knows this.
Still, for a significant swath of publicly traded companies in the United States, our regulatory regime requires an independent audit of financial controls every year by default. Those audits are meant to help investors decide how trustworthy a company’s financial statements are — and by extension, how trustworthy the company itself is as an investment opportunity.
The audit system is by no means perfect, and even this week the Securities and Exchange Commission is moving to exempt more firms from that investor protection. Still, investors are better off with that system than without it.
We have nothing similar for cybersecurity controls, even though cybersecurity is a far more dangerous and pervasive risk.
Granted, some firms do need to certify the effectiveness of their cybersecurity program under certain laws (financial firms operating in the State of New York, for example). Sometimes an acquisition might lead one firm to audit the cybersecurity of another. That’s not the same as an annual outside audit of cybersecurity controls.
How would such a system work? I don’t even know, because it implies the need for some set of standards for data protection, in the same way financial data follows Generally Accepted Accounting Principles. That would be onerous, and hackers would probably welcome formal standards for data protection because those standards would be a fixed target to exploit.
Still, let’s be clear on the fundamental point. We have a clear requirement for audit firms to assess financial reporting risk, to protect investors. We have no requirement at all to assess cybersecurity risk, to protect consumers.
The result: lots of finger-pointing and lots of hand-wringing, but no real reduction in cybersecurity risk. I wonder if any of that will go away any time soon.