Perhaps you have some free time today, while your IT department tries to extricate the company’s computer systems from a ransomware attack. In that case, let’s talk about whether to disclose it.
This week I had the good fortune to participate in a cybersecurity conference in Stamford, Conn., that brought together the law enforcement, internal audit, and corporate legal communities to talk about cooperation amid increasingly dangerous cybersecurity threats. The conference had been planned for months, but last weekend’s WannaCry ransomware attacks made everything on the agenda that much more urgent. Ransomware is a clear, pressing risk to any organization.
Still, if you fall victim to a ransomware attack, should you disclose it? Do you have to?
The question isn’t far-fetched. Ransomware isn’t like data theft, where intruders abscond with sensitive customer data and disclosure of a breach is required. If ransomware attackers simply block access to customer data, nothing has been harmed or stolen. If they encrypt that data but it never leaves your premises, and then they decrypt the data after you pay a ransom, that might not result in any harm to customers either.
Under those circumstances, you might not have any duty to disclose under some breach disclosure laws. And we can’t ignore the ugly but very real urge not to disclose because you don’t want to harm the company’s reputation.
So what’s the best course of action? What’s best for your company today, if you’re the victim? What’s best for the corporate community overall, where your company might be the next victim tomorrow? And, really, there’s a cognitive dissonance to the idea that a ransomware attack might not be something you need to disclose—is that really the case?
Let’s start unpacking.
No Disclosure, Really?
Let’s take my home state of Massachusetts as an example. Our disclosure law defines “breach of security” as “unauthorized acquisition or unauthorized use of unencrypted data… that creates a substantial risk of identity theft or fraud against a resident of the commonwealth.” You’re a broker-dealer firm here in Boston, and you have my personal data in your archives.
Play this out. If the ransomware attackers simply attack your systems, setting up a block to prevent you from accessing your data about me, they’re not “using” my data, per se. They may not even touch it. And ransomware attacks typically start with the hackers sending employees an email containing a link to an infected website; an employee clicks the link and installs the malware himself. So the hackers haven’t breached your network, either.
Even crazier: the hackers aren’t likely to corrupt your data or withhold the decryption key, because if they did, targets would stop paying the ransom. Ransomware is a business model, and as strange as it seems, the hackers need to act with integrity if they want their victims to pay. So the “substantial risk of identity theft” may not exist either, especially if you can prove that no exfiltration of data happened.
Given all that, where’s the disclosure duty? Could someone successfully argue no duty? I’ve asked numerous security and legal experts this week, and so far nobody has given a clear answer.
Meanwhile, consider the implications if you do disclose. If you fell victim to WannaCry, for example, odds are high that you were running outdated Windows software without current security patches, which raises questions about your IT governance and access controls. Patches can be installed automatically, and if you’re not doing that, outsiders (read: regulators and plaintiff lawyers) could challenge the effectiveness of your internal controls.
Organizations like FINRA or the SEC’s Office of Compliance Inspections and Examinations may come visiting. (Here is the SEC’s WannaCry risk alert from May 17.) They would investigate your compliance with Regulation S-P, which requires firms to:
- Insure the security and confidentiality of customer records and information;
- Protect against any anticipated threats or hazards to the security or integrity of customer records and information; and
- Protect against unauthorized access to or use of customer records or information that could result in substantial harm or inconvenience to any customer.
In other words, the regulators will be inspecting your policies and procedures for data protection and availability of IT systems. That’s not the same as investigating a specific ransomware attack. They are investigating the environment that allowed the attack to happen.
So you can disclose and risk a significant regulatory action, with monetary penalties or a settlement that requires corrective action. Or you can pay the ransomware people, who will probably want only a few thousand dollars, and then correct your policies and procedures anyway. Which one is the wiser course?
Enter Law Enforcement
Not surprisingly, the prosecutors and FBI agents at the Stamford conference said paying the ransom and keeping quiet is a terrible idea. You can’t be sure the hackers will give you a key to decrypt your data. You don’t know whether they planted malware on your systems for further damage in the future. You can’t be sure that data theft didn’t happen. More broadly, your payment might go to funding terrorism. It definitely will fund more ransomware attacks.
This gets to an important point for compliance officers: the difference between reporting ransomware to law enforcement and disclosing ransomware to regulators or the public. Cynics will say that reporting to law enforcement is a great idea that might not help your particular company much, yet could trigger disclosure to regulators and the public that piles on even more costs. The cynics aren’t out of bounds to raise that point.
I’m still in the pro-disclosure camp simply because it’s the right thing to do. And on a practical basis, if your employees can’t get into their systems because of an attack, that sort of thing slips into public awareness anyway. If some other party discloses your company’s problem, you have an even bigger problem.
But ransomware is not going to go away. From the hackers’ perspective, it’s a lucrative business. So compliance officers should ponder how their firms might respond under various circumstances. Until cybersecurity improves dramatically, this is going to be a difficult risk to navigate.
I’ll have another post in a few days about some protective measures that compliance and audit executives can implement in their organizations for ransomware attacks.
Also, I’d be remiss if I didn’t praise Neil Frieser, chief audit executive of Frontier Communications; Bill Feher, chief risk officer at ITT; and Vanessa Richards of the U.S. attorney’s office in Connecticut. They organized this event, and it was superb. Every U.S. attorney’s office and local compliance or audit organization in the country should do this.