The ‘This Seems Weird’ Control for Data Privacy

Not long ago I heard the story of a CEO who was the victim of attempted “spear phishing,” in which an outside hacker impersonates the boss and, via email, asks employees at the company to reply with valuable information.

In this specific case, the hacker posed as the CEO and contacted a junior member of the HR department, asking that person to send all employees’ tax information (names, home addresses, Social Security numbers, and the like). The good news is that the attempt failed; the HR employee had the sense that something seemed off about the CEO asking for information like that, and called the CEO’s admin to confirm. The fraud was exposed and nothing was compromised.

The bad news is that stunts like this are the data privacy threat of our future, and as the attacks get more and more sophisticated, more and more of these vignettes will not end on a happy note.

I like the CEO’s story so much because it strips data privacy compliance challenges down to their bare essence: we need to rely less on technology, and more on human judgment. That sounds simple when you frame it in terms of fake CEO emails and quick-thinking employees. But scale the concept up to the modern enterprise (potentially thousands of managers asking tens of thousands of employees for reams of sensitive information, plus all the third parties, open networks, and cloud computing technologies commonplace in IT systems today) and we have a mighty big internal control challenge on our hands, with some mighty big policy and security work needed to address it.

Start with the hacker. He was trying to exploit a risk inherent in our cornerstone IT security practice of authentication. Namely, if we protect our networks and data by requiring users to prove that they are who they claim to be before we allow them on the network, then all a hacker needs to do is successfully impersonate a user. After that, he’s in the clear.

That was our hacker’s exploit: spoof the CEO’s name and email address (which are easy enough to determine, since he’s a relatively well-known executive), and find a target you can trick into sending useful data. Potential targets are easy enough to find too, either through social media searches or by penetrating network security and then lurking around to see who normally does what.
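
For readers who want to see the mechanics, here is a minimal sketch in Python (with hypothetical names and domains) of why the spoof works: the display name on an email can claim to be anyone, so the only thing worth checking automatically is whether the sending address actually belongs to a company-controlled domain. Real mail systems do this with standards like SPF and DMARC; this toy version just compares the domain against an allowlist.

```python
from email.utils import parseaddr

# Hypothetical set of domains the company actually sends mail from.
TRUSTED_DOMAINS = {"example-corp.com"}

def sender_looks_spoofed(from_header: str) -> bool:
    """Flag a From: header whose address is not on a trusted company domain.
    The display name is ignored on purpose: the sender can type anything there."""
    _display_name, address = parseaddr(from_header)
    domain = address.rsplit("@", 1)[-1].lower() if "@" in address else ""
    return domain not in TRUSTED_DOMAINS

# A spoofed header: the display name claims to be the CEO, but the address is external.
print(sender_looks_spoofed('"Pat Smith, CEO" <ceo-request@freemail-example.net>'))  # True
print(sender_looks_spoofed('"Pat Smith, CEO" <pat.smith@example-corp.com>'))        # False
```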

Now let’s shift to the HR employee. The key phrase in the last sentence above is “to see who normally does what.” The company, to its credit, spent time training employees on data privacy and how to handle data carefully. In other words, the company had trained the employee to know who normally does what, and the CEO asking for personal tax data wasn’t normal. So the employee investigated and thwarted the attack.

Let’s Get Conceptual

Over in the IT security department, they call this idea identity assurance, and it’s gaining appeal, since the alternative is to get ever more rigorous with authentication. That would mean more passwords with more special characters, more key fobs, more security tokens, and all the other rigmarole that human beings can’t stand. Better to understand what a normal employee’s behavior on the IT network looks like, identity assurance says, and then apply more security when abnormal behavior happens.
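
To make that concrete, here is a minimal sketch, in Python, of the behavior-baseline logic behind identity assurance. The users, actions, and threshold here are hypothetical, and commercial tools are far more sophisticated, but the core idea is the same: apply extra verification only when a request falls outside what that user normally does.

```python
from collections import Counter

# Baseline of observed behavior: (user, action) -> how often we've seen it.
baseline = Counter()

def record(user: str, action: str) -> None:
    """Add a normal, observed action to the user's behavioral baseline."""
    baseline[(user, action)] += 1

def needs_step_up(user: str, action: str, min_history: int = 3) -> bool:
    """Require extra verification when an action is rare or unseen for this user."""
    return baseline[(user, action)] < min_history

# The CEO routinely reads board reports, but has never requested bulk tax data.
for _ in range(50):
    record("ceo", "read_board_report")

print(needs_step_up("ceo", "read_board_report"))          # False: normal behavior
print(needs_step_up("ceo", "request_all_employee_w2s"))   # True: "this seems weird"
```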

Here in the compliance department, we need to get our heads around the concept in a different way: we need to make training and judgment the key controls for data privacy, rather than a firewall and tough password policies. Those IT controls will never go away, but their binary nature means they can only handle yes-no questions. Humans are still crucial to answer the “this seems weird” questions, and will be for quite a while.

The challenge in the near term, however, is scaling up the “this seems weird” control to large enterprises. Developing profiles of normal users is not easy, and developing a speedy, simple policy for exception requests is even less so. (If asking for exceptions is too hard, remember that employees will just buy a mobile wi-fi hotspot, tether to their cell phones, or otherwise evade your network entirely.) Incorporating third parties that might deserve some access to your data will be no easy task, either.

Alas, the pesky reality is that hackers don’t care how hard fighting them will be. If we keep relying on authentication and IT controls, we keep exposing ourselves to the risk that they will impersonate someone skillfully enough to get by those controls. And hackers will only get better at the task. We will need much more sophisticated systems to understand data usage and employee activity, and much more training (of more employees) to keep one step ahead of cyber threats. My CEO friend and his employee are off to a great start. It only gets harder from here.

 

1 Comment

  1. Grace Hewitt on May 4, 2016 at 2:23 pm

    Very good article. We want to strike a balance between customer service and risk; how do we achieve this? Whenever Chase shuts down my card for specific purchases and I embarrassingly have to call them, I am annoyed, but I am also happy about the precautions they are taking. The idea that the judgment of a person can be significant is great, but who will we use, and suppose the hacker is internal?
