Lessons From Robinhood
FINRA served up an eye-popping enforcement action this week, hitting online trading app Robinhood with a $70 million fine — the largest ever in FINRA history — for a host of poor business practices and misleading statements made to Robinhood customers over the years.
The settlement order is 123 pages long, so clearly it will take some time to digest the whole thing and all the cautionary lessons it offers for other firms. Even after a cursory read, however, some warnings for the rest of us (even compliance officers not in financial services) jump out right away.
First, some background for anyone who doesn’t know what Robinhood is. It’s an app that allows consumers to dabble in the stock market for free. Robinhood actually makes money from interest it pockets on customers’ idle cash, and from fees market makers pay Robinhood for the right to execute all those trading orders that customers are placing.
Financially, business for Robinhood is good. The company filed for an IPO today (yep, one day after the FINRA settlement), and reported $522 million in revenue in Q1, up more than 400 percent from the year-earlier period. Those frenzies of meme-stock trading in GameStop, AMC, and BlackBerry are perfect fodder for Robinhood, so it’s making a killing.
From the compliance and risk management perspective, however… Robinhood still has lots of work to do.
The gist of FINRA’s complaint is that Robinhood (which neither admits nor denies the findings in the order) was woefully inadequate at building customer risk profiles and at governing the use of software “bots” to turbo-charge its customer offerings. The result, FINRA says, was a mess of erroneous statements made to customers, improper trading services offered to the wrong customers, and service outages that left customers unable to access their accounts (a big no-no in the world of free, hyperactive, online stock trading).
What were some specific failures worth examining? Let’s take a look.
Poor Business Continuity Planning
Robinhood is a fintech firm, which means technology is crucial to its ability to provide financial services. Still, FINRA said, the firm exercised poor oversight of its technology, which led to numerous system outages from 2018 through 2020. The most serious happened on March 2, 2020, and left the entire Robinhood app and website unavailable to customers for nearly 24 hours. Robinhood actually suffered several outages in early March 2020, just as the pandemic arrived and sent stocks plummeting.
The problem? Robinhood’s business continuity plan at the time only addressed physical disruptions that might limit employees’ ability to reach offices. The plan didn’t address technology-related business disruptions, which could affect customers even while all the employees were dutifully sitting at their desks.
For broker-dealers specifically, that’s a compliance violation. FINRA Rule 4370 requires firms to have a business continuity plan that allows the firm to keep providing services, including communicating with customers and giving customers access to their funds or accounts. FINRA even published guidance at the start of the pandemic reminding firms of their duties on this issue.
For the rest of us, the lesson here is to view business continuity plans expansively. They aren’t about finding a secondary office in the event of disaster; they’re about how to keep providing services to customers. So your business continuity risk assessment, and any ensuing business continuity procedures, should focus on that issue first. You might find that your biggest continuity risk could strike even while all your offices are open and full of employees.
Troubles With Bots and Risk
Robinhood thrives on speed and scale; that’s how the business drew in millions of users and positioned itself for the gigantic IPO it just announced. To achieve that growth, however, Robinhood had to rely on software bots to run all sorts of tasks automatically. Which raises an interesting compliance and risk management challenge.
How do you allow so much automated, high-speed activity, while also applying appropriate oversight to specific transactions or customers?
This is especially tricky for broker-dealers and financial firms, because they have to assess customers’ risk tolerance and then steer each customer to the appropriate products and services. The charges against Robinhood demonstrate how that can go haywire at high speeds.
For example, Robinhood started offering options trading to customers in December 2017. The company had written policies and procedures that assigned “options principals” (read: actual humans) to review and approve customer requests for options trading, but in practice, Robinhood entrusted most of those duties to “option account approval bots” with only limited oversight by the principals.
The bots, FINRA says, relied on programming that led to bad judgments. For example, the bots might approve a customer for options trading because the person claimed three years of experience trading options, even though the customer was under 21, an age that made the claimed experience implausible. The bots were also programmed to consider only the most recent information provided by customers; so if a customer was rejected, he could submit “updated” information just a few minutes later, and the bot would approve him.
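To make those failures concrete, here is a minimal Python sketch, with entirely hypothetical names and thresholds (nothing below comes from Robinhood’s actual systems or the FINRA order), contrasting a bot that judges only the latest submission with one that checks the answers against the customer’s age and prior applications.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class Application:
    age: int
    years_options_experience: int
    submitted_at: datetime

def naive_decision(latest: Application) -> str:
    # The flaw FINRA described: judge only the most recent answers.
    return "approve" if latest.years_options_experience >= 3 else "reject"

def guarded_decision(history: List[Application]) -> str:
    latest = history[-1]
    # Consistency check: a customer under 21 claiming three-plus years of
    # options experience is contradictory and should go to a human principal.
    if latest.age < 21 and latest.years_options_experience >= 3:
        return "escalate"
    # Resubmission check: a rejection followed minutes later by "updated"
    # answers is a red flag, not a reason to auto-approve.
    if len(history) > 1 and latest.submitted_at - history[-2].submitted_at < timedelta(hours=24):
        return "escalate"
    return "approve" if latest.years_options_experience >= 3 else "reject"
```

The specific thresholds are invented; the point is that the decision function sees the customer’s whole history and routes contradictions to a human, rather than rubber-stamping whatever answers arrived last.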
Ultimately, FINRA said, Robinhood approved thousands of customers who didn’t meet the firm’s own eligibility criteria, or whose accounts contained red flags suggesting that options trading might not be appropriate for them. That violated numerous FINRA rules about customer due diligence.
The question for the rest of us, then, is how to ensure that your transaction bots make decisions in step with your written policies and procedures.
It’s supremely important to get this question right, because if you get it wrong, you’re likely to get it wrong at scale. That’s what leads to outcomes like record-setting FINRA settlements.
So you could invest considerably more time at the beginning to ensure that the programming in your bots will work; that means lots of scoping and design meetings, testing, rewriting of code, and so forth. Or you could narrow the duties of the bots so they can’t make the sort of mistakes that might land you in regulatory trouble. Then you’re still left with more human labor, which isn’t cheap, but at least people can’t make bad judgments at blazingly fast speeds.
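One way to make the “lots of testing” option concrete, sketched below in Python with made-up names and criteria (an illustration under my own assumptions, not Robinhood’s setup or FINRA’s prescription): restate each line of the written eligibility policy as a test case, and run the bot’s decision logic against those cases in your build pipeline, so any divergence between code and policy fails a test instead of approving customers at scale.

```python
import pytest

# Hypothetical stand-in for the bot's decision logic under test.
def bot_decision(age: int, years_experience: int) -> str:
    if age < 21 and years_experience >= 3:
        return "escalate"  # contradictory answers go to a human principal
    return "approve" if years_experience >= 3 else "reject"

# Each row restates one line of the (invented) written eligibility policy.
POLICY_CASES = [
    (19, 3, "escalate"),   # implausible experience for the customer's age
    (35, 0, "reject"),     # no options experience, no approval
    (35, 5, "approve"),    # experienced adult may be approved
]

@pytest.mark.parametrize("age, years, expected", POLICY_CASES)
def test_bot_matches_written_policy(age, years, expected):
    assert bot_decision(age, years) == expected
```

However you implement it, the goal is the same: the written policy and the bot’s code get checked against each other automatically, before the bot ever touches a live customer.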
It’s also important to define the “you” in the paragraphs above clearly, since several people should be lumped into that pronoun: the heads of software development, customer service, cybersecurity, sales and marketing, and, yes, compliance. You all need consensus on how to handle bots, or else they’ll end up handling you.
That’s enough on Robinhood for today. My gut tells me we’ll be writing about its compliance challenges more often.