Greystar’s AI Antitrust Settlement
Anyone looking for insights into algorithmic misconduct arising from a company’s use of artificial intelligence may want to look at the apartment industry this week. The Justice Department just settled an antitrust case with the nation’s largest landlord that offers us a few points to ponder.
The settlement was announced last Friday between the Justice Department and Greystar Management Services, which manages 950,000 apartments across the United States from its headquarters in South Carolina. Prosecutors had sued Greystar and five other large landlords back in January, alleging that they used an AI tool called RealPage to help them determine rents — and that by sharing competitive, nonpublic pricing data with RealPage, they were engaging in anti-competitive behavior that kept rents artificially high for consumers.
The Justice Department and numerous states also filed an antitrust lawsuit against RealPage itself in 2024, which is still pending; and shareholder groups have filed their own class-action lawsuits against the six property companies named in the Justice Department’s antitrust lawsuit from January. (Greystar just announced a proposed settlement for one of those class-action shareholder lawsuits, too.)
So why should compliance officers care about this antitrust settlement with Greystar? Because it’s our first glimpse into how the Justice Department will think about algorithmic pricing software, and the liability risks that might arise if companies use such AI tools without taking proper precautions. Given the populist animus against AI pricing — animus that exists among Democrats and Republicans alike, at both the state and federal levels — ethics and compliance teams would be wise to pay attention here.
The Greystar Conduct in Question
Let’s first review the antitrust allegations against Greystar and its five co-defendants in the Justice Department lawsuit.
Some of the allegations were “traditional” price-fixing misconduct, such as property managers among the six firms directly communicating with each other to share information about rents, occupancy, pricing strategies and discounts. For example, one claim was that Greystar supplied a competitor with information about recent renewal rates and Greystar’s approach to pricing for the upcoming quarter.
More interesting are the allegations about how Greystar (and its rivals) all used RealPage to determine apartment rents. Essentially, Greystar and the other property managers fed their confidential pricing data to RealPage, which then used AI algorithms to recommend rents for individual apartments. The more data they poured into RealPage, the more precise its rent recommendations became, always with an eye to maximum yield.
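To make those mechanics concrete, here is a deliberately simplified Python sketch of how a shared pricing algorithm of this kind might behave. This is not RealPage's actual model; the function names, the 5 percent markup heuristic, and all the numbers are invented, purely to show why pooling competitors' nonpublic data changes the output.

```python
from statistics import mean

# Toy illustration only: NOT RealPage's actual algorithm.
# Each landlord submits its own nonpublic lease data; the vendor pools
# everything and recommends rents with full visibility into the market.

def recommend_rent_standalone(own_rents: list[float]) -> float:
    """What a landlord could compute from its own data alone."""
    return mean(own_rents)

def recommend_rent_pooled(all_rents: dict[str, list[float]],
                          markup: float = 0.05) -> float:
    """A pooled recommendation: because every competitor's confidential
    rents feed the same algorithm, it can nudge the whole market upward
    without any landlord fearing it will be undercut."""
    pooled = [r for rents in all_rents.values() for r in rents]
    return mean(pooled) * (1 + markup)

market = {
    "landlord_a": [1900.0, 1950.0, 2000.0],
    "landlord_b": [1850.0, 1875.0],
    "landlord_c": [2050.0, 2100.0],
}

print(recommend_rent_standalone(market["landlord_a"]))  # 1950.0
print(recommend_rent_pooled(market))                    # 2058.75, above the market average
```

The sketch's point is simple: the pooled version only works if every competitor feeds in its confidential numbers, and that shared feeding of the algorithm is exactly the conduct the Justice Department challenged.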
Landlords, naturally, loved the data-driven analysis to help them determine exactly how much they could jack up rents. Renters, on the other hand, were relatively powerless to shop around and find better rents. ProPublica (one of the best news sources around) published an in-depth exposé of RealPage in 2022, showing how anticompetitive U.S. rental markets became once RealPage took root among the landlords there.
The key point is that all six major property companies used RealPage as a back-office system, and its success depended on all of them sharing their confidential data with the software algorithm to get results (rental prices) that consumers couldn’t contest in any real, effective way. That’s what set off antitrust alarm bells in the Justice Department and among state AGs across the country.
All this brings us to last week’s settlement. While Greystar insists that its use of RealPage did not break any laws, the company did agree that it will no longer use RealPage or any other algorithmic pricing software that generates pricing recommendations using competitors’ confidential data. Nor can Greystar use any AI pricing tool that might share Greystar’s own data with other competitors.
Greystar also cannot share its confidential rental information with competitors in any other way (getting back to that traditional antitrust misconduct we mentioned earlier), and it must cooperate with the Justice Department's case against RealPage.
AI Compliance Lessons to Learn
Let's first dispense with the Antitrust Compliance 101 lessons. Don't allow your employees to swap confidential data with their competitors. Have policies on that point, and train relevant employees on exactly what qualifies as confidential data they should keep to themselves. Configure your hotline so people can submit anonymous reports about possible antitrust violations. And always remember that if your company is caught in a collusion mess with others, the first one to approach regulators and self-report gets the most favorable deal.
Those practices are the table stakes, and that’s been true for years. I’m more interested in the AI-enhanced antitrust risks here.
The issue at the federal level is that multiple companies were sharing confidential data with the same vendor, and that vendor used the pooled data to give every participant better pricing power over consumers. The scheme worked precisely because all the companies fed the same algorithm; no single landlord could have achieved that pricing power alone.
So one lesson for everyone else is to understand your relationships with your various Software-as-a-Service vendors, and to determine whether those vendors also work with your competitors and commingle confidential data to give you a leg up over your customers.
For example, as I read the RealPage case, my mind immediately went to Workday, the HR software that vast numbers of companies use to recruit job applicants and manage employees. Plenty of people already suspect that Workday blacklists certain job applicants and discriminates against older workers; a class-action lawsuit against Workday is proceeding on exactly those grounds.
To be clear, nobody is saying that employers are deliberately conspiring to use Workday to keep wages artificially low — but I do wonder what other AI use-cases might be out there that could be vulnerable to antitrust exploitation.
So any company using a SaaS vendor for tasks such as price determination, hiring decisions, product recommendations, and the like should also consider the potential for antitrust abuses. For example, how do you know that confidential data you share with the vendor won't be commingled with data from competitors? What assurances will you have in your contracts? How will you train employees so they don't sign up with some fly-by-night SaaS vendor that lacks those protections?
AI Risk at the State Level
I'm even more curious about state AGs and their views on these issues, because state laws are more rooted in consumer protection. So the question won't be whether your AI tool is helping you dominate the market, which is the primary question at the federal antitrust level. The question will be whether your use of AI harms consumers, regardless of market dominance.
Like, if you're using AI to help you set prices, might you need to disclose, "This price was determined by AI"?
This isn’t hypothetical. Delta Air Lines recently announced plans to use AI for “dynamic pricing” of airline tickets — meaning, the price quoted to a customer will be unique, derived from AI number-crunching a multitude of factors and deciding what you’re able to pay. You might pay $450 for a seat on a flight, while the customer next to you (same flight, same day, same seat category) might pay $270. How would a consumer know that? What recourse might they have to complain?
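For illustration only, here is a toy Python sketch of what individualized dynamic pricing could look like. Delta has not disclosed how its system works; every feature, weight, and threshold below is invented, just to show how two shoppers can receive very different quotes for the same seat.

```python
# Hypothetical sketch of individualized dynamic pricing.
# All features and weights are invented for illustration; this is
# not how Delta (or any airline) has said its system works.

BASE_FARE = 300.0

def quote_fare(days_until_departure: int,
               searches_this_week: int,
               past_avg_fare_paid: float) -> float:
    """Quote a fare unique to one shopper, from behavioral signals."""
    urgency = max(0.0, (14 - days_until_departure) / 14)  # late bookers pay more
    intent = min(searches_this_week / 10, 1.0)            # repeat searches signal need
    history = min(max(past_avg_fare_paid / BASE_FARE, 0.8), 1.2)  # past spend anchors the quote
    multiplier = 1.0 + 0.25 * urgency + 0.15 * intent
    return round(BASE_FARE * multiplier * history, 2)

# Two shoppers, same flight, same day, same seat category:
print(quote_fare(days_until_departure=2, searches_this_week=8, past_avg_fare_paid=400))   # ~480
print(quote_fare(days_until_departure=30, searches_this_week=1, past_avg_fare_paid=250))  # ~254
```

Neither shopper ever sees the other's quote, which is precisely the problem: the consumer has no baseline against which to contest the price.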
Consumer groups have already panned this idea, and it's not clear whether dynamic pricing might violate state laws about uniform pricing. Moreover, if a company goes down this route, how do you ensure that the AI won't end up discriminating based on race, age, gender, marital status, or some other criterion that could trigger a discrimination complaint?
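One basic safeguard, sketched below with invented data and an arbitrary threshold, is to routinely compare the prices your model quotes across protected groups; a persistent gap is a signal to investigate before a regulator does. Real disparate-impact testing is far more rigorous, but the core idea fits in a few lines:

```python
from statistics import mean

# Hypothetical audit sketch: the quotes, group labels, and 5 percent
# threshold are all invented; real fairness testing is more involved.
quotes = [
    {"group": "under_40", "price": 310.0},
    {"group": "under_40", "price": 295.0},
    {"group": "over_40",  "price": 402.0},
    {"group": "over_40",  "price": 388.0},
]

def price_gap(quotes, group_a, group_b):
    """Relative gap between the average quote for two groups."""
    def avg(group):
        return mean(q["price"] for q in quotes if q["group"] == group)
    return (avg(group_a) - avg(group_b)) / avg(group_b)

gap = price_gap(quotes, "over_40", "under_40")
if abs(gap) > 0.05:
    print(f"Potential disparity: {gap:.1%} average price gap by age group")
```

On this toy data the check flags a roughly 31 percent gap between age groups, exactly the kind of pattern a compliance team would want to catch internally before it becomes a complaint.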
The fundamental question is whether you are using AI systems in a way that might leave consumers overmatched. That’s going to catch the eye of state and federal regulators alike — indeed, it already has caught their eye. Roll up your sleeves and brace for more scrutiny of AI use, because that scrutiny is coming.
