More on Ransomware & Compliance

Earlier this month we had a post about ransomware, and the rather unsettling idea that under some circumstances, a victim of ransomware might not have a legal obligation to disclose the incident. As promised, today we’re going to explore a few more threads related to the subject, since it poses a god-awful mess of risk that compliance and audit executives need to address sooner rather than later.

First, let’s recap the original conundrum: ransomware doesn’t necessarily trigger a duty to disclose, because it doesn’t “steal” data the way a traditional data breach does. That is, if the attack only locks employees out of their IT systems, the data itself hasn’t been compromised. If the victim company then pays the perpetrators and the lock is released, the data was never harmed and customers were never exposed to the threat of identity theft.

Under a scenario like that, the argument to pay the ransom and keep your mouth shut can be compelling. The ransomware perpetrators usually don’t ask for much money: an average of $1,077 in 2016, according to a report from Symantec. Disclosing a ransomware attack can invite regulatory scrutiny of your control environment. It can bring unwelcome headlines. It can alienate customers. So why not fork over the bitcoin and then scramble (quietly) to clean up your data access and security controls?

This is the point where compliance officers at large companies usually think to themselves, “Interesting idea, but we could never get away with that here. We’re too large. People would talk. Plus, it’s just the wrong thing to do.”

Quite right. The question for large companies isn’t whether you should disclose a ransomware attack to regulators or the public. The question is whether your data service providers are failing to disclose ransomware attacks to you.

Enter the SOC 2 Audit

Measures to prevent that sort of abuse are easy enough to envision: include clauses in your contracts with data service providers requiring them to tell you when they experience a ransomware attack. (“Any disruption in service caused by an outside entity seeking financial reward” might be a better way to phrase it.) Even with such a clause in place, three problems remain:

  • Your data service provider might lie to you about having good security;
  • Your data service provider might not know it has bad security;
  • Your employees might use a data service provider with bad security and not tell anyone.

An excellent tool to address the first two points is the SOC 2 audit: a special type of audit that examines the security controls of a data service provider. If you’re a large corporation, you probably already require SOC 2 audits from data service providers working within your enterprise; they’ve been around since 2011. The tricky part is that SOC 2 audits can be designed any way you want—so setting the scope of a SOC 2 audit wisely, to include the risks that ransomware brings, becomes crucial. (I wrote a more detailed post on this subject for Reciprocity Labs not long ago.)

For example, a SOC 2 audit can be based on any of five trust principles: security, availability, processing integrity, confidentiality, and privacy. Well, if your data service provider suffers a “screenlock” ransomware attack, where users can’t log onto their IT systems to access data, that’s a weakness in availability. But if your SOC 2 audit only examined security and privacy controls, you might discover that availability flaw the hard way.
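
To make the scoping point concrete, here is a minimal sketch, in Python and purely for illustration, of the kind of coverage check you might run when defining a SOC 2 scope. The mapping of disruption scenarios to trust principles below is an assumption made for the example, not an official AICPA mapping, and the scenario names are invented.

    # Hypothetical sketch: does a proposed SOC 2 scope cover the disruption
    # scenarios you actually worry about? The scenario-to-principle mapping
    # is an illustrative assumption, not an official one.

    TRUST_PRINCIPLES = {
        "security", "availability", "processing integrity",
        "confidentiality", "privacy",
    }

    # Assumed mapping: which principles would surface each disruption scenario.
    SCENARIO_TO_PRINCIPLES = {
        "data theft / exfiltration": {"security", "confidentiality", "privacy"},
        "screenlock ransomware (no data stolen)": {"availability"},
        "data corrupted in processing": {"processing integrity"},
    }

    def coverage_gaps(audit_scope):
        """Return scenarios whose relevant principles are missing from the scope."""
        unknown = audit_scope - TRUST_PRINCIPLES
        if unknown:
            raise ValueError(f"Unrecognized principles: {sorted(unknown)}")
        return {
            scenario: principles - audit_scope
            for scenario, principles in SCENARIO_TO_PRINCIPLES.items()
            if principles - audit_scope
        }

    if __name__ == "__main__":
        # A scope limited to security and privacy misses the screenlock scenario.
        for scenario, missing in coverage_gaps({"security", "privacy"}).items():
            print(f"Not covered: {scenario} (missing: {sorted(missing)})")

Running it with a security-and-privacy-only scope flags the screenlock scenario, which is exactly the gap described above.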

So you, the large corporation using data service providers, need to ask: What are our most critical assets? To what extent is access a critical asset? What are our disclosure obligations if we don’t have access—even to customers or investors, if not regulators? Have we matched those needs to the correct data service providers in our extended enterprise, and designed SOC 2 audits to match those needs?

Your IT audit or IT security folks will probably oversee the process of obtaining SOC 2 audits from your data service providers. The compliance officer’s role is to help tailor the scope of those audits correctly: to identify your regulatory requirements (after all, even if your provider doesn’t have a duty to disclose a ransomware attack, you might), and to ensure that those requirements translate into the tests and evidence the SOC 2 audit provides.

The Importance of Good Policy

And what about our third bullet point above: employees using data service providers without alerting anyone in compliance, IT security, or some other risk-oversight function?

Well, that’s a challenge of policy management and monitoring, which is nothing new for compliance officers. First, work with your IT security teams to assess what the risks of employees using “shadow IT” really are. Some data might be low-priority, and nobody will care if it gets locked in a ransomware attack. Some data will be indispensable.

Have you identified which data is which? Have you identified which processes or business units generate critical data? Have you implemented policies against improper data storage for those parts of the enterprise, and trained relevant parties on them? Is the IT department monitoring possible improper data storage?
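
For the monitoring question in particular, here is a minimal sketch of the kind of check an IT team might run, again in Python and again purely for illustration. It assumes a hypothetical proxy log exported as CSV with “user”, “destination_domain”, and “bytes_out” columns; the sanctioned-domain list and upload threshold are invented for the example, and real monitoring would be far more involved.

    import csv

    # Hypothetical allow-list of sanctioned storage services and an arbitrary
    # upload threshold; both are assumptions for illustration.
    SANCTIONED_DOMAINS = {"storage.corp-approved.example.com"}
    UPLOAD_THRESHOLD_BYTES = 10 * 1024 * 1024  # flag uploads over 10 MB

    def flag_shadow_storage(proxy_log_path):
        """Return log rows showing large uploads to unsanctioned domains."""
        flagged = []
        with open(proxy_log_path, newline="") as f:
            for row in csv.DictReader(f):
                domain = row["destination_domain"].lower()
                if domain in SANCTIONED_DOMAINS:
                    continue
                if int(row["bytes_out"]) >= UPLOAD_THRESHOLD_BYTES:
                    flagged.append(row)
        return flagged

    if __name__ == "__main__":
        for row in flag_shadow_storage("proxy_log.csv"):
            print(f"{row['user']} sent {row['bytes_out']} bytes "
                  f"to {row['destination_domain']}")

The point isn’t the script itself; it’s that someone owns the question of where critical data is allowed to live, and checks whether reality matches the policy.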

At this level of information risk management, the distinctions between ransomware and other types of cyber-attack, or between internal employee and external service provider, almost fade away. The goal is more about risk awareness, and proper practices for handling data—and your requirements for those goals should be universal, across your whole extended enterprise.

Yes, the external service providers will have security controls that vary greatly from one party to the next; hence the need for individually tailored SOC 2 audits. Good policy, however, illuminates the outcome your company wants to achieve: to reduce your risk of data disruption as much as possible, and to know how bad that disruption is when it arrives anyway.
