Today we return to the lawsuit the Securities and Exchange Commission has filed against SolarWinds, the IT services firm that suffered a disastrous cyber attack in 2020. How much does SolarWinds’ compliance with the NIST framework for cybersecurity — or its lack thereof — figure into this risk management morass?
Quite a lot, at least according to the SEC lawsuit, which alleges that SolarWinds was nowhere near as compliant with NIST as the company led investors to believe. This raises interesting questions about how rigorously a company should try to achieve NIST compliance, and who could face what personal liability if the company doesn’t.
First, some background for those who might need it. NIST, the National Institute of Standards and Technology, publishes frameworks that organizations can use to improve their cybersecurity. Federal government agencies and contractors (a group that includes SolarWinds) must comply with a framework known as NIST 800-53, perhaps the most onerous cybersecurity standard NIST publishes.
800-53 is no joke. The version followed by SolarWinds contained more than 400 individual controls grouped into 18 control families — everything from access control to incident response, physical security to employee training, and lots more. Each contractor must review that total set of controls, and then implement whatever smaller subset makes the most sense for it, based on its government contract and its unique cybersecurity risks. Compliance with NIST 800-53 is a major GRC exercise that can span cybersecurity, physical security, privacy, policy management, HR, internal audit, and much more.
Now let’s pivot back to SolarWinds. In the years leading up to the 2020 cyber attack, the company published a “Security Statement” for investors and the rest of the world to see, which declared: “SolarWinds follows the NIST Cybersecurity Framework with layered security controls to help identify, prevent, detect and respond to security incidents.”
So how, according to the SEC, did that compliance objective go off the rails?
So Much NIST, So Little Progress
The central allegation from the SEC is that SolarWinds told the public that it was “following” the NIST framework — but internally, executives were well aware that the company had managed to implement only a tiny fraction of the 800-53 controls that SolarWinds itself had decided were relevant. More than half of the controls SolarWinds had chosen weren’t implemented, and another third might have been in some phase of implementation; the security team wasn’t sure.
For example, out of the 444 total controls contained in NIST 800-53, SolarWinds had decided to implement 325 of them. But according to the SEC, a security assessment from September 2019 identified “a program/practice in place” for only 21 of those 325 controls, an implementation rate of barely 6 percent. Even worse, the assessment found “no program/practice in place” for 198 controls, and the remaining 106 controls fell into the category of “program/practice may be in place but requires detailed review.”
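A quick back-of-the-envelope check confirms that the assessment's figures hang together (the counts below come straight from the SEC's description of the September 2019 assessment; the variable names are mine):

```python
# Counts from the SEC's description of the September 2019 assessment.
SELECTED = 325       # 800-53 controls SolarWinds deemed relevant (of 444 total)
IN_PLACE = 21        # "a program/practice in place"
NOT_IN_PLACE = 198   # "no program/practice in place"
NEEDS_REVIEW = 106   # "may be in place but requires detailed review"

# The three buckets should account for every selected control.
assert IN_PLACE + NOT_IN_PLACE + NEEDS_REVIEW == SELECTED

implementation_rate = IN_PLACE / SELECTED      # "barely 6 percent"
unimplemented_share = NOT_IN_PLACE / SELECTED  # "more than half"
uncertain_share = NEEDS_REVIEW / SELECTED      # "another third"

print(f"{implementation_rate:.1%} implemented")   # 6.5% implemented
print(f"{unimplemented_share:.1%} not in place")  # 60.9% not in place
print(f"{uncertain_share:.1%} uncertain")         # 32.6% uncertain
```

In other words, roughly 61 percent of the controls SolarWinds itself had selected were flatly not in place, and another third were in limbo.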
This was September 2019. By then, Russian hackers had already penetrated SolarWinds’ IT defenses and were quietly planting their spyware into the company’s products. But the security statement from SolarWinds — the one declaring that SolarWinds “follows” the NIST framework — had already been posted online for years.
So does a 6 percent implementation rate count as “following” a security framework? If senior executives know that their company hasn’t implemented the vast majority of controls in a framework, what duty do they have (if any) to be more candid with investors about the true state of affairs?
Those are questions that CISOs, compliance officers, and internal auditors should ponder deeply, because clearly the SEC believes you do have a duty to be more forthcoming. Hence the agency also named SolarWinds CISO Timothy Brown as a defendant in its lawsuit.
That brings us to the other issue the SEC is trying to raise with this lawsuit: holding executives accountable for weak risk management and disclosure processes.
‘Strong Vendors Publish Their Protocols’
When SolarWinds conducted that security assessment in 2019 and found only 6 percent of its selected security controls had been fully implemented, Brown was vice president of security and architecture. According to the SEC, Brown also saw that assessment and its glum conclusion that SolarWinds hadn’t implemented most of the 800-53 controls.
Then in September 2020, after the hackers had already executed their attack to infect SolarWinds customers with spyware but before anyone had discovered the scheme, Brown published a blog post stressing how important it was for companies to issue and follow cybersecurity protocols. It said, in part:
That’s why it’s important your software vendors take their roles as business partners seriously. Their security is your security … No software is perfect or vulnerability-free forever. But strong vendors put processes and protocols in place to reduce the risk and deal with threats if they crop up. And most importantly, strong vendors publish their security protocols and processes so you can evaluate whether they meet your standards.
The SolarWinds attack finally came to light three months later. The company says the SEC lawsuit is “misguided and improper,” but regardless of how this case unfolds in court, it raises some profound questions about risk management and personal liability that any senior executive involved in risk assurance should contemplate.
First, what has the company already disclosed publicly about the risks I’m responsible for managing? What has the company said in the 10-K or other disclosures, such as SolarWinds’ security statement? Did I get to review those disclosures, or did weenies in legal churn out boilerplate without consulting me?
Second, do I have sufficient visibility into what the company is disclosing internally, up the chain of command, about the risks I’m responsible for managing? This is where internal hotlines and GRC tools to track compliance can be invaluable.
You might be sitting at the top of that chain, mistakenly believing that employees further down the org chart are managing risk just fine, when in reality they’re complaining that the remediation plan is behind schedule, the IT tools are terrible, and management (read: you) has no clue what’s really going on. So how can you gain more transparency into control activities and employee sentiment?
Especially for cybersecurity risks, GRC tools can help bring that transparency to you. Configured correctly, they can let you see how many controls are implemented, what gaps remain, which remediation steps are behind schedule, and where bottlenecks are.
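As a sketch of what that visibility might look like, the heart of any such tool is just a control register: one status per control, rolled up into the kind of summary the SolarWinds assessment produced. (Everything below is illustrative and hypothetical — these names are not any real GRC product’s API.)

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical control register -- illustrative only, not a real GRC tool's API.
@dataclass
class Control:
    control_id: str  # e.g., an 800-53 identifier like "AC-2"
    family: str      # e.g., "Access Control"
    status: str      # "implemented", "not_implemented", or "needs_review"

def summarize(controls):
    """Roll the register up into status counts, a rate, and the open gaps."""
    counts = Counter(c.status for c in controls)
    rate = counts["implemented"] / len(controls) if controls else 0.0
    gaps = [c.control_id for c in controls if c.status == "not_implemented"]
    return counts, rate, gaps

register = [
    Control("AC-2", "Access Control", "implemented"),
    Control("IR-4", "Incident Response", "not_implemented"),
    Control("AT-2", "Awareness and Training", "needs_review"),
    Control("PE-3", "Physical and Environmental Protection", "not_implemented"),
]

counts, rate, gaps = summarize(register)
print(f"implementation rate: {rate:.0%}")  # implementation rate: 25%
print("open gaps:", gaps)                  # open gaps: ['IR-4', 'PE-3']
```

The point isn’t the code; it’s that once control status is tracked in one place, the gap between “we follow NIST” and a 6 percent implementation rate is impossible for anyone up the chain to miss.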
But then comes the third question: What do you do when the internal information you have about risk is at odds with the public disclosures your company is making?
Do you consult with legal to craft a public disclosure that somehow straddles both worlds? Do you bring it to senior management and assume your duty is done? If senior management accepts your warning but does nothing, do you go to regulators? Do you resign? Do you decide that this is the hill to die upon?
I’d be eager to hear what others think the correct answers are; email me at [email protected].