Craving more information about how companies are disclosing cybersecurity breaches? Audit Analytics has a new report examining what publicly traded firms have been reporting in SEC filings — and you won’t get much guidance there, because those disclosures vary so widely.
For that reason alone, the Audit Analytics report is worth reading. It shows that, yes, the number of cybersecurity disclosures has been rising (briskly) since 2011. On the other hand, only half of the firms disclosing a breach actually describe what attack they suffered; and a significant number of companies seem to take longer to report a known breach than state disclosure laws allow.
First, the number of cybersecurity breaches disclosed in SEC filings has more than quadrupled this decade, from 28 in 2011 (when the SEC first issued cybersecurity disclosure guidance) to 121 in 2018. See nifty chart, below.
Now, one might ask: Aren’t those numbers low? Don’t cybersecurity firms churn out reports all the time that say Corporate America suffers zillions of breaches every day?
Yes, to both questions. For example, Verizon’s 2018 Data Breach Investigations Report identified more than 2,200 confirmed breaches last year, plus some 53,300 “security incidents.” Those numbers are worldwide, but clearly corporations trading on U.S. stock exchanges must suffer more than the 121 incidents they disclosed in SEC filings last year.
That discrepancy shows one frustrating problem with cybersecurity: too many disclosure laws and regulations, with too many standards for what a company should report. SEC rules, for example, focus on whether a security breach is material information — when (heresy alert!) most breaches aren’t material to a large firm’s financial statements.
But state breach disclosure laws (and the GDPR in Europe) focus more on consumer protection; so even smaller incidents that might not merit disclosure in an SEC filing are still a big compliance priority.
Data in the Disclosures
The Audit Analytics report spotlights two other issues that compliance, internal control, and IT security executives might want to consider.
First is the time lag in breach discovery and disclosure. The median time between when a breach happened and when it was discovered was 35 days; the average time was 123 days, although that average was skewed by a few laggards that went years without discovering a breach.
Meanwhile, median time between breach discovery and breach disclosure was 26 days. One unfortunate company took 367 days to disclose its breach. (Perhaps the firm had to delay at the request of law enforcement investigating the incident.)
Obviously lag periods that long can run afoul of other breach notification laws. The GDPR leans heavily on companies to disclose within 72 hours. The state of Maine requires notification within seven days of law enforcement determining that disclosure won’t compromise any ongoing investigation. Other states require disclosure within 45 days (Alabama) or 90 days (Connecticut), while many states only say something along the lines of “without reasonable delay.”
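That patchwork of deadlines is easy enough to capture in a quick sketch. The jurisdictions and day counts below simply restate the examples in this post (treating the GDPR’s 72 hours as three days); real statutes hinge on triggers, exceptions, and law-enforcement holds that this toy check ignores.

```python
from datetime import date

# Illustrative notification windows, in days from breach discovery.
# These restate the examples above; actual statutory triggers differ.
DISCLOSURE_WINDOWS = {
    "GDPR": 3,          # 72 hours to notify the supervisory authority
    "Maine": 7,
    "Alabama": 45,
    "Connecticut": 90,
}

def overdue_jurisdictions(discovered: date, disclosed: date) -> list:
    """Return the jurisdictions whose window the discovery-to-disclosure lag exceeds."""
    lag_days = (disclosed - discovered).days
    return [name for name, window in sorted(DISCLOSURE_WINDOWS.items())
            if lag_days > window]
```

Run against the report’s median 26-day lag, a breach would already be overdue under the GDPR and Maine examples; the one company that took 367 days would blow past every window listed.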
So this information about attack, discovery, and disclosure times is yet another benchmark compliance and security officers can use to evaluate your cybersecurity protocols. The longer you take to discover a breach, or to move from discovery to disclosure, the more questions should be asked about the effectiveness of your internal controls.
Types of Attack
Audit Analytics also broke down the types of attack that companies were disclosing, and the information compromised.
As previously noted, half of all companies disclosing a breach didn’t specify how the breach happened. Within the group that did disclose, malware topped the list. That’s not surprising, really; hackers have been planting malware in corporate IT systems for decades. Malware is a headache for the IT security team.
But phishing attacks, unauthorized access, misconfiguration — those are much closer to the domain of compliance and audit. At the least, compliance and HR should be working together to stress the risk of phishing attacks; audit should be probing for weaknesses in access controls and IT configuration. (That includes use of cloud-based tech vendors.)
Once you understand the extent to which those risks exist within your organization, you can think about proper policy and procedure responses, such as how much to embrace two-factor authentication.
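To make the two-factor idea concrete: the common TOTP form of two-factor authentication (RFC 6238) fits in a few lines of Python standard library. This is a simplified sketch rather than a drop-in implementation (no clock-drift tolerance, no replay protection), and the secret used in the test below is the RFC’s published test key, not a real credential.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password.

    secret_b32 -- the shared secret, base32-encoded (as in authenticator apps)
    for_time   -- Unix timestamp to compute the code for (defaults to now)
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    # Dynamic truncation per RFC 4226: pick 4 bytes at an offset taken
    # from the low nibble of the last digest byte.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The point for policy discussions is that the second factor is just a shared secret plus a clock; the hard compliance work is enrolling users, protecting the secrets, and deciding which systems require it.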
Then, just maybe, your cybersecurity risks might get a bit more manageable.