Keeping Your Compliance Bearings in a Big Data World

Corporate compliance officers have another agency firing warning shots across the bow these days: the Federal Trade Commission, which has shared more of its thinking recently about how businesses should handle Big Data.

Last week the FTC published a 35-page report looking at the various ways a firm could get into trouble by using Big Data poorly, plus questions compliance officers should start asking early to help their companies avoid FTC fire. And the title of the report—Big Data: A Tool for Inclusion or Exclusion?—tells you everything you need to know. If your firm uses Big Data in ways that end up excluding certain populations from the products or services you sell, expect to be in trouble.

To a certain extent, that is not news; it’s simply the latest example of the Obama Administration looking more at the outcome of company or industry behavior, to see whether the result inflicts undue difficulty on any particular population group—regardless of whether you had any intent to discriminate. Used poorly, that’s what Big Data can do: put a company’s judgment on auto-pilot, and then accidentally exclude one group from something supposedly offered to everyone.

The FTC report cites statutes such as the Fair Credit Reporting Act and the Equal Credit Opportunity Act, both of which could cause you Big Data trouble if your data collection and crunching are done thoughtlessly. The buzzword here is “predictive analytics”—taking all those variables you can now collect about a customer (spending history, payment history, social media use, credit scores, zip code, employment history, and so forth) to deduce what someone with those statistics will probably do next.
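To make the concept concrete, here is a minimal sketch of predictive analytics in action. Everything in it is invented for illustration: the feature names, the numbers, and the “default” outcome are hypothetical, and the FTC report itself contains no code.

```python
# A minimal predictive-analytics sketch: hypothetical customer features
# predicting a yes/no outcome ("will this applicant default?").
# All feature names and figures are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [on-time payment rate, avg. monthly spend ($100s), years employed]
X = np.array([
    [0.95, 12.0, 8],
    [0.40,  3.5, 1],
    [0.88,  9.0, 5],
    [0.30,  2.0, 0],
    [0.99, 15.0, 12],
    [0.55,  4.0, 2],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = defaulted, 0 = did not

model = LogisticRegression().fit(X, y)

# Score a new applicant: the model returns a probability, not an explanation.
new_applicant = np.array([[0.70, 5.0, 3]])
print(model.predict_proba(new_applicant)[0][1])  # estimated default probability
```

The point for compliance purposes: the model happily scores anyone you feed it, and nothing in the code knows or cares whether its inputs quietly track a protected characteristic.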

Marketers love predictive analytics because it helps them sell more products. HR loves it because it helps them make better hiring decisions. Even lawyers might like predictive analytics because it can help identify high-risk customers or third parties.

Still, predictive analytics can also be used deliberately for, say, redlining or racial discrimination. It can even cause redlining or discrimination by accident if you’re not careful, which is the FTC’s point in its report. Big Data will not seem so great after that.

If you want to read a useful book about Big Data, try Big Data: A Revolution That Will Transform How We Live, Work, and Think, by Viktor Mayer-Schönberger and Kenneth Cukier. I won’t do a full book review here, but the authors give a fantastic survey of Big Data’s potential—and one tenet of the Big Data world, they say, is that we will be able to determine correlation without knowing causation.

That is a nerdish way of saying our software algorithms will be able to crunch astounding amounts of data and conclude, “because X is happening, there is a very high probability that Y will happen next.” But that’s all the software will do; it won’t tell us whether X somehow causes Y to happen.
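In code, the distinction is plain. A tiny sketch with made-up numbers: the correlation statistic comes out the same no matter which variable you treat as cause and which as effect, because the math only measures co-movement.

```python
# Correlation without causation: the statistic measures co-movement only.
# The numbers are invented for illustration.
import numpy as np

x = np.array([10, 20, 30, 40, 50, 60], dtype=float)
y = np.array([12, 19, 33, 41, 48, 63], dtype=float)

r = np.corrcoef(x, y)[0, 1]
print(f"Pearson correlation of x and y: {r:.2f}")  # close to 1.0

# Swapping the variables gives the identical answer: correlation is
# symmetric, so it cannot say whether x drives y, y drives x, or a
# third factor drives both.
print(f"Pearson correlation of y and x: {np.corrcoef(y, x)[0, 1]:.2f}")
```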

In many instances that analysis will be harmless: a retailer can safely use Big Data to conclude, say, that a married woman in her 30s buying a baby crib is probably pregnant. On the other hand, race and employment status correlate, too—but that tells us nothing about why one minority person might have a job or not, or whether his situation is a result of his own shortcomings or of broader injustice against his race. And using race to set policies about hiring, housing, credit, or anything else would be offensive and illegal.
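And the trouble is rarely that someone types “race” into the model. Here is a sketch, with simulated data, of how the problem actually sneaks in: drop the protected attribute entirely, and a facially neutral input such as zip code can still stand in for it almost perfectly.

```python
# Proxy variables: excluding a protected attribute does not sanitize a model
# if another input is strongly correlated with it. All data is simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated protected attribute (0 or 1) that we never feed to the model.
group = rng.integers(0, 2, size=n)

# Simulated housing segregation: 90% of the time, zip-code region tracks group.
zip_region = np.where(rng.random(n) < 0.9, group, 1 - group)

# The "neutral" input is highly correlated with the excluded attribute...
print(np.corrcoef(group, zip_region)[0, 1])  # roughly 0.8

# ...so a model trained on zip_region alone can still sort people by group
# without ever seeing the protected attribute itself.
```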

So how can compliance officers stay ahead of Big Data risks? Consider this line from the FTC report: “Only a fact-specific analysis will ultimately determine whether a practice is subject to or violates [fair credit laws], and as such, companies should be mindful of the law when using Big Data analytics to make eligibility determinations.”

If the FTC comes calling on your firm, you will be heavily involved in finding the facts for investigators to do that fact-specific analysis. You will be defending your firm’s collection of “facts,” which is just another way of saying “data.” So you’ll need to have solid data collection practices and procedures, and controls to prevent people from violating them. You’ll also need to pay more attention to inherent biases—for example, what criteria your firm uses to evaluate credit applicants, when the people selecting those criteria are well-educated, well-paid data scientists with advanced degrees, trying to evaluate the prospects of a much more diverse group.
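One concrete control worth considering (my suggestion, not a mandate from the FTC report): periodically compare outcome rates across groups and flag gaps before a regulator does. The sketch below applies the EEOC’s “four-fifths” rule of thumb to invented approval numbers.

```python
# A simple disparate-impact check: compare approval rates across groups.
# The 80% threshold is the EEOC "four-fifths" rule of thumb from employment
# law, used here as a rough screening tool. All figures are invented.
approved = {"group_a": 480, "group_b": 310}
applied = {"group_a": 600, "group_b": 500}

rates = {g: approved[g] / applied[g] for g in approved}
ratio = min(rates.values()) / max(rates.values())

print(rates)  # {'group_a': 0.8, 'group_b': 0.62}
print(f"impact ratio: {ratio:.2f}")  # 0.78, below 0.80: merits a closer look
```

A check like this will not tell you why the gap exists, but it tells you where to start asking the fact-specific questions the FTC says it will ask.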

Neither of these lessons is taught terribly well in Chief Compliance Officer School. You would do well to learn them anyway, because the problems of Big Data are not going away. Neither is the Federal Trade Commission.
