Today I want to revisit the new SEC rules for disclosing material cybersecurity incidents, and in particular those qualitatively material incidents that might seem especially tricky to assess and prevent. What internal controls become more important for that type of threat?
This is on my mind because we’re already starting to see some companies disclose cybersecurity incidents under the new rule, which went into effect Dec. 15. Quantitatively material incidents are already painful enough, but at least everyone has a sense of what the word “quantitative” means: a significant amount of money. You can set a quantitative materiality threshold. You can tally up costs and see whether they exceed it.
Qualitative materiality is a more slippery thing; you have to assess the nature and context of an incident, rather than just its size. A company could suffer a cyber attack that doesn’t cause much financial damage, but something about the nature of that attack could still be telling enough to investors — offering some useful glimpse into the company’s priorities, leadership, culture, or control environment — that the company is duty-bound to disclose the incident anyway.
What does that mean in practice? Thankfully, the SEC’s final cybersecurity disclosure rule offers a few examples.
One important point here is that your material cybersecurity incident could be “a series of related unauthorized occurrences,” according to the rule. So you could suffer a series of small, immaterial attacks such as:
- One bad actor launching a series of attacks against you in a short time period; or
- Multiple attacks coming from different bad actors, all exploiting the same vulnerability.
In either scenario, the individual incidents might not amount to much on their own; taken together, however, they might add up to one larger cybersecurity incident that’s qualitatively material.
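To make those two scenarios concrete, here is a rough sketch of the aggregation logic a security team might apply to its incident log. Everything here is hypothetical for illustration: the record fields, the actor and CVE names, and the 30-day/3-incident thresholds are assumptions, not anything the SEC rule prescribes; in practice this data would come from a SIEM or ticketing system.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical incident records; field names, actors, and CVEs are made up.
incidents = [
    {"time": datetime(2024, 1, 3), "actor": "APT-X", "cve": "CVE-2023-1234"},
    {"time": datetime(2024, 1, 5), "actor": "APT-X", "cve": "CVE-2023-5678"},
    {"time": datetime(2024, 1, 6), "actor": "APT-X", "cve": "CVE-2023-9012"},
    {"time": datetime(2024, 1, 7), "actor": "CrimeGroup-Y", "cve": "CVE-2023-1234"},
    {"time": datetime(2024, 2, 20), "actor": "CrimeGroup-Z", "cve": "CVE-2023-1234"},
]

def related_clusters(incidents, window=timedelta(days=30), min_size=3):
    """Flag groups of small incidents that might together form one
    'series of related unauthorized occurrences': the same actor
    attacking repeatedly, or several actors exploiting the same CVE."""
    clusters = []
    for key in ("actor", "cve"):  # scenario 1: same actor; scenario 2: same vulnerability
        groups = defaultdict(list)
        for inc in incidents:
            groups[inc[key]].append(inc)
        for value, hits in groups.items():
            hits.sort(key=lambda i: i["time"])
            # Flag the group if at least min_size incidents fall inside the window.
            for i in range(len(hits) - min_size + 1):
                if hits[i + min_size - 1]["time"] - hits[i]["time"] <= window:
                    clusters.append((key, value, len(hits)))
                    break
    return clusters

print(related_clusters(incidents))  # flags APT-X's three attacks in early January
```

The point isn’t the specific thresholds; it’s that somebody has to be running a correlation like this at all, and routing the results to the people who make disclosure decisions.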
So now we’re back to my original question. What internal controls become more important to identify — and ideally, prevent — these types of threats?
Modern-Day IT Controls
To address the two scenarios outlined above, you need internal controls that help you understand both how an attack might happen and who might be behind it. That means several specific cybersecurity and control capabilities become more important to have in place and functioning properly. For example…
- Vulnerability scanning, to probe your software on a regular basis (if not constantly) and identify potential weaknesses or known vulnerabilities.
- Intrusion detection, to detect an unauthorized user who’s either trying to penetrate your IT systems or has already done so.
- Patch management, to implement software patches promptly when your cloud-based vendors send along a patch to rectify a weakness they’ve found.
- Forensic capabilities, to dissect how an incident has happened and what damage has been done.
- An incident response plan, to put those forensic capabilities to use when an incident happens and to introduce compensating controls as necessary while you stop the damage.
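To put one of those capabilities in concrete terms, here is a minimal sketch of the kind of check a patch-management control automates: comparing installed software versions against the fixed versions named in vendor advisories. The package names and version numbers are entirely made up for illustration.

```python
# Hypothetical inventory and advisory data; package names/versions are made up.
installed = {"webframework": "2.4.1", "dbdriver": "1.0.9"}
advisories = {"webframework": "2.4.3", "dbdriver": "1.0.9"}  # earliest fixed versions

def needs_patch(installed, advisories):
    """Return packages whose installed version is older than the
    fixed version named in a vendor advisory."""
    def ver(s):
        # Naive numeric version parse, e.g. "2.4.1" -> (2, 4, 1).
        return tuple(int(p) for p in s.split("."))
    return [pkg for pkg, fixed in advisories.items()
            if pkg in installed and ver(installed[pkg]) < ver(fixed)]

print(needs_patch(installed, advisories))  # -> ['webframework']
```

A real patch-management program layers scheduling, testing, and exception tracking on top of this, but the core question is the same: which known fixes have we not yet applied?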
I understand that all five capabilities listed above are primarily the domain of the IT security team, so audit and compliance professionals might be thinking, “Cool story, but what does all that have to do with me?”
To that I would say compliance and audit now have a greater interest in assuring that the IT security team has those tools in place and working — because without those tools, you on the compliance side might not understand that the company has suffered a qualitatively material cybersecurity incident that must now be disclosed.
Or, to put it another way, what previously might have been inept cybersecurity (the company is too slow to figure out that a string of small attacks is actually one large attack) could now also be a compliance failure (the company hasn’t figured out that it suffered a qualitatively material incident).
Now, some people might say, “Hold on — the SEC rule only specifies that you have four business days to disclose a cybersecurity incident after you’ve determined that it’s material; the materiality evaluation itself just has to happen without unreasonable delay. So does every company really need all these internal controls ready at a moment’s notice? They do cost money, you know.”
That’s a fair point, but I’m not sure it goes very far. It’s fair in the sense that regulators do take a company’s size and sophistication into account when evaluating disclosures, compliance violations, and the like. A huge Fortune 100 company generally would be expected to have all those IT controls I mentioned above, and lots more; a non-accelerated filer, perhaps less so.
That said, I’m not sure how far this argument goes, because if you don’t have a set of strong IT controls, you’re already at risk of a failing grade from your auditor on Sarbanes-Oxley compliance. Successful SOX compliance depends on strong IT general controls; those “ITGCs” support the COSO framework criteria against which your internal controls are assessed.
So the question for SOX compliance, IT security, and internal audit teams today is whether your IT general controls — which should already exist for SOX compliance purposes — are also primed to help the company understand when a series of small, seemingly unrelated incidents actually add up to a single qualitatively material incident that will need to be disclosed.
Don’t Forget Upward Disclosures
We also can’t ignore the risk of internal processes failing to relay important information about cybersecurity incidents upward to the senior executives in charge of disclosures to investors — the people who need to understand exactly what’s going on.
For example, I suspect that plenty of companies already do have the strong IT general controls we described above; and that cybersecurity employees generally can figure out when one bad actor is attacking you numerous times or multiple bad actors are using the same exploit against you. But how useful is that knowledge if it’s trapped down at the bottom of the org chart, and disclosure executives at the top don’t know it exists?
This is one question raised by the SEC’s recent lawsuit against SolarWinds and its CISO for making misleading disclosures about the company’s cybersecurity posture. According to the SEC, numerous low-level employees at SolarWinds knew the company’s cybersecurity controls were far weaker than what the company proudly proclaimed to investors; and CISO Timothy Brown either knew the disclosures were wrong and did nothing about it, or never heard the alarms that minions down below were trying to raise.
We won’t know what really happened at SolarWinds until that case settles or goes to trial — but it does demonstrate the important point that in addition to all those strong IT controls, you also need strong internal processes to send troubling news upward. So that’s another issue that internal auditors might want to analyze closely, to assure that the company is prepared to handle qualitatively material cybersecurity incidents.
And people say cybersecurity compliance is dull!