Today I want to circle back to last week’s collapse of cryptocurrency exchange FTX. One allegation is that FTX’s now-former CEO, Sam Bankman-Fried, engineered a “back door” into the company’s financial systems so that he could execute transactions without review. My question: would an audit of internal controls over financial reporting catch something like that?
I ask this question because numerous audit professionals have told me the answer is no, it wouldn’t; weaknesses in the design of IT applications are generally too subtle and difficult for an auditor to find. And while I don’t dispute that these people are describing auditing as it is actually practiced today, that answer is in no way satisfactory. On the contrary, if risk assurance professionals can’t address a problem as big as what seems to have happened at FTX (and that has happened at other companies), that is an even bigger problem unto itself.
First let’s review the facts and rumors surrounding FTX. The exchange collapsed into bankruptcy last week. The story seems to be that over the summer, a hedge fund also run by Bankman-Fried, Alameda Research, ran into financial distress when several high-risk bets went bad. So Bankman-Fried transferred as much as $10 billion in FTX assets, including customer deposits, onto Alameda’s balance sheet. Then customers launched a run on FTX’s assets, which the exchange no longer had, and bankruptcy followed.
What caught my eye was a Reuters article tracing FTX’s downfall, which included an astonishing passage describing the alleged back door.
We should be clear that, as the Reuters article notes, Bankman-Fried says no back door exists. But if such a back door were to exist, shouldn’t an audit of internal control over financial reporting (ICFR) find it? Or if not, why not?
Audits and IT General Controls
My original thesis had been that a back door would be uncovered during an audit of the company’s IT general controls — controls that, in theory, should prevent something like a CEO transferring billions in assets to another entity without anyone else knowing about it.
Effective IT general controls are guided by Principle 11 of the COSO internal control framework. It states: “The organization selects and develops general control activities over technology to support the achievement of objectives.”
In practice, those controls would govern change management (that is, assuring that people update IT applications only with approval and documentation) and the software development life cycle (how the company writes, tests, and deploys its software). The audit team, when auditing your IT general controls, is supposed to examine how those processes work and flag any weaknesses they find.
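As a rough illustration, a traditional change-management control test often boils down to checking the paper trail around each production change: was the change approved, and did someone other than the developer approve it? Here is a minimal sketch of that logic; the ticket fields and data are hypothetical.

```python
# Toy sketch of a traditional change-management control test: every
# production change should have an approved ticket, and the approver
# should not be the person who made the change (segregation of duties).
# Ticket fields and records here are hypothetical examples.

def find_control_exceptions(change_tickets):
    """Return (ticket_id, issue) pairs for tickets failing basic checks."""
    exceptions = []
    for ticket in change_tickets:
        if ticket.get("status") != "approved":
            exceptions.append((ticket["id"], "change deployed without approval"))
        elif ticket.get("approver") == ticket.get("developer"):
            exceptions.append((ticket["id"], "approver also made the change"))
    return exceptions

tickets = [
    {"id": "CHG-101", "developer": "alice", "approver": "bob", "status": "approved"},
    {"id": "CHG-102", "developer": "carol", "approver": "carol", "status": "approved"},
    {"id": "CHG-103", "developer": "dave", "approver": None, "status": "deployed"},
]

for ticket_id, issue in find_control_exceptions(tickets):
    print(f"{ticket_id}: {issue}")
```

Note that a test like this examines only the approval workflow, not what the change actually did to the system.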
So how come those audits would miss a weakness as glaring as the back door described above? My friends in the audit world gave several answers.
One person said few IT auditors have the skills to examine software code and system logs directly. “As a result auditors are far too focused on static policy and procedure documents which provide barely any live information,” this person said. “If they knew how to interrogate the system logs they could have found that a non-standard part of the IT system had direct access to the finance data.”
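To make that concrete, here is a minimal sketch of what “interrogating the system logs” could mean in practice: scanning database access records for client applications, outside an approved list, that touch finance tables. The log format, table names, and application names are all hypothetical; real audit work would parse actual database audit logs.

```python
# Hypothetical sketch: flag log entries where a client application not
# on the approved list reads or writes finance tables. The approved
# list, table names, and log records below are invented for illustration.

APPROVED_CLIENTS = {"ledger-service", "reporting-service"}
FINANCE_TABLES = {"accounts", "customer_deposits", "transfers"}

def flag_suspicious_access(log_entries):
    """Return entries where a non-approved client touched finance data."""
    return [
        entry for entry in log_entries
        if entry["table"] in FINANCE_TABLES
        and entry["client"] not in APPROVED_CLIENTS
    ]

logs = [
    {"client": "ledger-service", "table": "accounts", "action": "read"},
    {"client": "admin-console-2", "table": "transfers", "action": "write"},
]

for entry in flag_suspicious_access(logs):
    print(f"suspicious: {entry['client']} -> {entry['table']} ({entry['action']})")
```

A scan like this would surface exactly the scenario described above: a non-standard part of the IT system with direct access to the finance data.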
Another said IT auditors typically follow generally accepted IT audit principles, which steer auditors toward a checklist mentality: “Are they running a firewall? Do people have to change their passwords? How is user access determined? Are there internal controls? Backup systems? Et cetera.” Plus, this friend said, IT auditors start by reviewing the documentation. “Do you think SBF put his secret backdoor in the documentation? Probably not.”
And one more, who said, “This is the problem when assurance activities such as audits do not align with priority value creation & preservation objectives.” (Preservation objectives like, say, keeping $10 billion in customer assets where they belong, since quietly transferring them elsewhere risks bankruptcy.) Unless auditors understand what’s most important to protect, this person said, “the assurance provided by traditional audits is of little value.”
We Need to Do Better
Those answers are astute observations about the current state of affairs, but they’re frustrating. Weaknesses in software code — whether deliberately introduced by an insider to commit fraud, or accidentally introduced through a patch from a vendor or a misconfiguration of some sort — are a tremendous risk to financial reporting, data privacy, and mission-critical operations. The risk is in the code, whether traditional IT general controls and audit procedures address that or not.
My friend the Cybersecurity Auditor has ranted to me about this issue many times, complaining that current IT controls and audits miss the point. For example, he says, “What are you providing assurance over for a change management audit today? You’re providing assurance that the right people approved the change. That’s not auditing [expletive]. That’s auditing who looks at an IT help ticket. Where has the actual change to the system been audited in that procedure?” Spoiler alert: It hasn’t.
Instead, the Cybersecurity Auditor says, companies must inspect the software code that runs their business: the SAP or Oracle configurations they run, the third-party Software-as-a-Service apps they rent, the homegrown software they develop in-house, the patches and upgrades that follow — all of it.
Think of it this way: cybersecurity is your paramount risk because so many other risks flow from it. Weak cybersecurity allows financial reporting, privacy, and compliance risks to take root, because weak cybersecurity allows the applications guarding against those risks to be tampered with. It is the blind spot in our risk assurance field of view.
Inspecting the software code might sound challenging, but I’m not sure that fear is well-founded. For example, you could rely on vendors like Veracode, Checkmarx, Onapsis, or AppScan to automate code analysis and spit out a list of weaknesses that need fixes. (Disclosure: my friend the Cybersecurity Auditor works at one such vendor.) That would alleviate at least some, and perhaps a lot, of the skills deficit that IT auditors might have, and you’d be doing what truly needs to be done: auditing the actual change being made to the system.
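Commercial tools do far more sophisticated analysis, but even a toy example conveys the basic idea: scan the source itself for red flags such as hardcoded credentials or approval-bypass switches. The patterns and sample code below are purely illustrative, not a real rule set or any vendor’s actual method.

```python
import re

# Toy static-analysis pass: scan source text for patterns a code audit
# might flag. Real tools (Veracode, Checkmarx, etc.) go far beyond
# regexes; these two patterns are invented for illustration only.
RED_FLAG_PATTERNS = {
    "hardcoded secret": re.compile(r"(password|api_key)\s*=\s*['\"][^'\"]+['\"]", re.I),
    "approval bypass": re.compile(r"skip_approval|bypass_review", re.I),
}

def scan_source(source_text):
    """Return (line_number, finding) pairs for each matched pattern."""
    findings = []
    for lineno, line in enumerate(source_text.splitlines(), start=1):
        for label, pattern in RED_FLAG_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, label))
    return findings

sample = '''db_password = "hunter2"
if user.is_ceo:
    transfer(amount, skip_approval=True)
'''
for lineno, label in scan_source(sample):
    print(f"line {lineno}: {label}")
```

The point is not the specific patterns; it is that the audit evidence comes from the code itself rather than from a policy document.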
I keep coming back to the larger, unavoidable point: that we need to do better at finding deep flaws in the IT systems, because those deep flaws are exactly what allows the massive frauds, security breaches, and other misconduct that leave the public infuriated.
Whether it’s billions missing from FTX, or government agencies hacked thanks to SolarWinds, or millions of people’s personal data exposed in the Equifax breach of 2017 — in all of those instances, and so many more, people were left shrieking at the offending companies, “I thought you had handled this!” I bet the executives there thought that too. But they hadn’t, because they overlooked the subtle, insidious threats lurking in the lines of code running their businesses.