SEC chairman Jay Clayton appeared before the Senate Banking Committee on Tuesday, a wonderfully poetic bit of timing to talk about cybersecurity. He and his Senate overlords jousted over the Equifax breach earlier this month, the SEC’s own breach disclosed just last week, and the duties companies may or may not have to investors and consumers.
Lots of political theater. Little help to compliance, audit, and risk professionals struggling to manage cybersecurity risk.
As one might expect, senators were plenty outraged about the Equifax breach. They were upset that senior executives sold stock after the breach happened, but before the company disclosed it to investors. They were upset the breach happened at all.
The most telling bit of outrage, however, came from Sen. Jon Tester, D-Montana. He mentioned the 360,000 Montanans who “had their personal information stolen.” That’s a pithy summary of public outrage, but consider the assumption behind his words: that the personal information belonged to them in the first place.
Well, that’s not how it works in this country. The information belongs to Equifax. It belongs to whatever company collects it.
Yes, companies do have a certain duty of care; and in the event of a breach, the company needs to make the consumer whole for whatever harm he or she may have suffered. Hence we get the offers for credit monitoring and a few dollars as part of a class-action lawsuit settlement years later. Those protections help, but they are not the same as you owning the personal information about yourself.
A much tougher standard exists in Europe, where data about a person belongs, in essence, to the person. A company generally cannot collect personally identifiable information about an EU citizen without a lawful basis, most often the citizen's consent, and he or she can revoke that consent at any time. (This will be all the more true starting May 2018, when the General Data Protection Regulation goes into effect.)
The difference between EU and U.S. standards is no parlor game of legal theory, either. When we apply the U.S. approach to consumer data to cybersecurity and disclosure, we tumble into this frustrating, bizarro world that exists today— where everybody agrees that cybersecurity is important, but nobody can identify how that translates into duty of care to both consumer and investor.
The fundamental problem, really, is that a company’s asset— my birthdate, my Social Security number, my home address— is material to me. Your personal data is material to you.
In the aggregate, however, that information isn’t material to a company’s investors. Time after time, we’ve seen companies experience massive breaches of personal data, and it doesn’t depress their stock price for more than a few weeks. So applying the usual, investor-centric standard of materiality to cybersecurity is like trying to fit a square peg in a round hole.
The Real Cybersecurity Question
Clayton and the senators danced around that predicament numerous times. Sen. John Kennedy, R-La., said in a folksy southern twang: “The credit reporting agencies— I didn’t hire ’em. That’s something we need to talk about… what role do the credit reporting agencies play?”
Clayton didn’t address Equifax specifically. Rather, he gave his standard response that, “I don’t think there’s enough disclosure around the risk profile of companies.”
Really? People don’t know that their PII is at risk? Because among the people I talk to, everyone assumes all companies will fumble security and experience data breaches.
The question is which companies have collected PII about me, and turned my PII into an asset for their business. As Kennedy said, I don’t remember hiring Equifax. Yet the company collected data on half the people in the United States, and then exposed us to considerable, irrevocable harm. We can’t change our dates of birth, hometowns, or previous buying habits.
Now, let’s not kid ourselves: companies will still collect PII about consumers. Some, like Equifax, will do so without your awareness. How many protections should those companies be forced to implement around that data? How does a regulator test those protections to be sure they work, and are proportionate to the risk of theft? Heck, which regulator is even the proper one to do that job?
Those are the questions we need to answer as a body politic. They’re also questions that the compliance community desperately wants answered, so you can build appropriate policies, controls, and safeguards into corporate operations.
Not surprisingly, everyone at Clayton's hearing retreated as soon as those questions crept over the horizon.
As one example, consider the PCAOB’s announcement a few weeks ago that it will make cybersecurity issues a priority when inspecting audit firms. At the Workiva TEC conference last week (a gathering full of corporate finance and audit professionals), we tried to wrap our heads around what that would mean in practice. Audit firms should start testing cybersecurity controls for the client’s finance department? Question whether you need a contingency fund on the balance sheet for possible breaches? Something else?
Nobody had any good answers. Everyone agreed that cybersecurity is an important issue for corporate operations, and some agency should enforce standards to protect investors. But the plain truth is that cybersecurity usually doesn't pose a material risk to the financial statements, so I'm hard-pressed to see how the PCAOB is the right agency for the job.
In the accounting sphere, we have internal control over financial reporting, administered by the SEC and the PCAOB because financial statements are material to investors. That makes sense. What’s needed is simplicity and clarity of regulation for personal data companies collect: a duty for internal control over personal information, enforced— and possibly audited— by, well, I don’t know who. The PCAOB doesn’t seem a natural fit. We could say the same of the SEC, or the Federal Trade Commission, or other agencies.
A hole exists in our country’s ability to place safeguards around consumer data that companies collect. Sooner or later, we’re going to need to fill it.