Like so many other adults in this world, I have children who play Fortnite. When they play, they ignore me. This gives me time for other pursuits, such as reading the massive enforcement action and compliance reforms that the feds just imposed on Fortnite.
You may have seen the headlines already. Epic Games, the owner of Fortnite, agreed to pay $520 million to settle civil charges that the company (1) collected the personal data of children playing the game without first getting parental consent; and (2) configured the game’s settings so players could easily contact each other, and then left those settings in place even as evidence piled up that some children were being bullied or harassed, including by adult Fortnite players.
We can get to the details of what Epic Games did wrong in a moment. For the larger compliance community, the real news here is the Federal Trade Commission. This is the third significant enforcement action the FTC has taken in recent months against an online business that indulged in poor data protection practices. Collectively, the settlement terms in these cases read like a set of best practices for information protection programs that other companies should implement ASAP, since the Federal Trade Commission clearly has an ambitious enforcement appetite these days.
Now back to Epic Games and its faulty practices with Fortnite. As described in the complaint filed Monday by the FTC and the Civil Division of the Justice Department, Epic Games launched Fortnite — an online adventure game that people (of any age) can play with friends or others they meet in the game world — in 2017, to immediate success. Fortnite allows players to talk with each other in real time through voice and text; and it also allows players to make in-game purchases for virtual costumes, dance moves, and other novelties sold by third parties under licensing agreements.
The privacy failures fell into two camps:
- The default settings on Fortnite allowed any player to talk to any other player. This soon led to some players bullying or harassing others, as well as adult players threatening children. (Roughly 70 percent of Fortnite’s players are teens or children living with their parents.) Even when Fortnite offered mechanisms to turn off the online chat features, those settings were buried away and hard for children to find.
- Fortnite collected lots of personal information about its users without first obtaining parental consent. That is a violation of COPPA, the Children’s Online Privacy Protection Act.
The complaint offers damning evidence that Epic Games designers knew Fortnite had problems, and that management turned a blind eye to them. For example, in 2018 a Fortnite designer emailed the company’s PR manager and creative director to say:
[O]ur voice and chat controls are total crap as far as kids and parents go. It’s not a good thing… This is one of those things that the company generally has weak will to pursue, but really impacts our overall system and perception.
If that’s not evidence of poor tone at the top, I don’t know what is.
Compliance Program Reforms
Let’s jump ahead to the consent decree reached by Epic and the FTC. The agreement still needs federal court approval before it goes into effect, but assuming that formality proceeds as usual, the order is a roadmap for the privacy compliance practices that the Federal Trade Commission wants to see.
Foremost, the consent decree requires that Epic simply implement a compliance program. Yes, the company has been making improvements to its privacy practices already, but the consent decree gives a better glimpse into the FTC’s expectations. Among the specifics:
- Document the program in writing; and then provide that written program and any future evaluations of it to the board.
- Designate a qualified person to run the privacy program on a day-to-day basis.
- Perform a privacy risk assessment at least annually.
- Design a series of safeguards to protect personal data the company collects; and run annual COPPA compliance training for all employees and contractors.
- Test the performance of those safeguards at least annually.
- Include privacy compliance considerations when evaluating and selecting third parties.
Epic also needs to undergo an independent privacy audit every other year, and then provide copies of those audits to the FTC for the agency’s own review. That audit requirement will be in place for 20 years — which is quite a long time, but also is a standard time period for FTC settlements. (For example, Twitter is in hot water these days because its recent actions might be violating a consent decree the company signed in 2011.)
Most interesting: Epic must also submit certifications signed by the “principal executive officer” (one assumes that would be Tim Sweeney, Epic’s founder and CEO) that the company has established an effective privacy program and is not aware of any material violations that haven’t been reported to the FTC. That certification requirement is in place for 10 years.
The Bigger Picture on Privacy Compliance
What strikes me is how similar these terms are to other enforcement actions the FTC imposed earlier this year against Drizly.com and Chegg.com, two online businesses that allowed bad habits in privacy and information protection to linger for years.
Granted, those companies had less egregious offenses that didn’t involve the privacy of children, so there were no monetary penalties involved. But several of the required privacy reforms in those cases turned up here, too:
- Written policies and procedures
- A qualified, designated person to run information protection
- Annual risk assessments
- Technical safeguards for personal data
- Third-party audits
In an ideal world, such measures should be standard fare for any privacy program. In the real world many companies still struggle with those blocking-and-tackling fundamentals — especially high-growth, consumer-facing businesses, which Drizly, Chegg, and Epic all are.
We also need to ponder that compliance program certification requirement, which applies only to the CEO rather than to the chief compliance officer as well. How would an arrangement like that affect the relationship between the CEO and the compliance officer?
For example, I don’t know that it’s reasonable for a CEO to understand all the nuances of privacy compliance programs and the internal controls therein. So what sort of assurance would the CEO want from the chief privacy or compliance officer? Could the CEO impose his or her own sub-certification onto the compliance personnel? Could the CEO delegate supervision of the privacy program entirely to the compliance officer, and hang liability around that person’s neck instead?
One analogue here is how FINRA, the regulator of broker-dealers, handles liability for chief compliance officers. FINRA made clear earlier this year that it typically won’t bring enforcement actions against CCOs for compliance violations because compliance officers don’t inherently have a supervisory role at their firms — but a CEO could delegate such supervisory duties to the CCO, and then liability might rear its ugly head after all.
The other obvious analogue is the Justice Department’s new policy to have CEOs and CCOs certify their compliance programs as part of regulatory settlements. In those cases, however, the CCO and CEO are bound by the plea agreement; the CEO can’t just delegate authority (and risk) to someone else.
Now we have this third way from the Federal Trade Commission. What a time to be alive.