Settlement in Algorithm Discrimination Lawsuit: A Step Toward AI Accountability


When Mary Louis prepared to move into a new apartment in the spring of 2021, she was eager to start fresh. Her anticipation turned to despair, however, when a curt email informed her that a “third-party service” had denied her tenancy. The Black grandmother, a long-term renter, discovered that her fate had been decided not by a person but by an algorithm.

That algorithm, used by tenant screening company SafeRent Solutions, became the focus of a groundbreaking class-action lawsuit spearheaded by Louis. On Wednesday, the lawsuit culminated in a $2.2 million settlement and marked a pivotal moment in the fight against AI-driven discrimination.

The lawsuit alleged that SafeRent Solutions’ algorithm, which generates “SRS Scores” for rental applicants, discriminated on the basis of race and income. Specifically, the plaintiffs argued that the algorithm ignored the value of housing vouchers, a critical form of financial assistance for low-income renters, and leaned heavily on credit history, disproportionately penalizing Black and Hispanic applicants, who are more likely to have lower credit scores, and reinforcing systemic inequalities.

“Management companies and landlords need to know that they’re now on notice,” said Todd Kaplan, one of Louis’ attorneys. “These systems that they are assuming are reliable and good are going to be challenged.”

Algorithms and AI systems are increasingly deployed to streamline decisions in housing, hiring, lending, and even healthcare. Yet, as Louis’ case demonstrates, these systems can replicate or amplify biases present in their data inputs. Critics argue that such biases have real-life consequences, particularly for marginalized communities.

Christine Webber, another attorney for the plaintiffs, pointed out that even if an AI is not explicitly programmed to discriminate, it can still produce discriminatory outcomes. “The data an algorithm uses or weights could have the same effect as if you told it to discriminate intentionally,” she explained.
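To make Webber’s point concrete, the sketch below (a hypothetical Python illustration with entirely synthetic data, not SafeRent’s actual model, inputs, or thresholds) shows how a scoring rule that never looks at race can still approve two groups at very different rates when it leans on a correlated input such as credit history.

```python
# Illustrative sketch only: a made-up scoring rule and synthetic applicants,
# intended to show proxy discrimination, not any real screening product.
import random

random.seed(0)

def screening_score(applicant):
    # A toy score that weights credit history heavily and ignores
    # voucher assistance entirely -- the kind of design choice the
    # plaintiffs argued can encode bias without an explicit race input.
    return 0.7 * applicant["credit_score"] / 850 + 0.3 * applicant["income"] / 100_000

def make_applicant(group):
    # Synthetic populations: group B is assigned systematically lower
    # credit scores to stand in for historical credit disparities.
    base = 700 if group == "A" else 620
    return {
        "group": group,
        "credit_score": min(850, max(300, random.gauss(base, 60))),
        "income": random.gauss(40_000, 10_000),
    }

applicants = [make_applicant("A") for _ in range(5_000)] + \
             [make_applicant("B") for _ in range(5_000)]

THRESHOLD = 0.70  # arbitrary cutoff for "approve"
for group in ("A", "B"):
    members = [a for a in applicants if a["group"] == group]
    approved = sum(screening_score(a) >= THRESHOLD for a in members)
    print(f"group {group}: approval rate {approved / len(members):.1%}")
```

Running the sketch approves group A at a far higher rate than group B, even though the rule never references group membership: the disparity rides in entirely on the correlated credit data.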

When Louis’ application was rejected, she attempted to appeal, providing references from two landlords attesting to her perfect rental history over 16 years. Still, the management company, relying on SafeRent’s screening report, declined her appeal, stating, “We do not accept appeals and cannot override the outcome of the Tenant Screening.”

For Louis, the rejection felt impersonal and dehumanizing. “Everything is based on numbers. You don’t get the individual empathy from them,” she said. “There is no beating the system. The system is always going to beat us.”

The lawsuit, one of the first of its kind, alleged that SafeRent’s reliance on credit data and its exclusion of housing voucher benefits led to discrimination against low-income renters, particularly those from racial minorities. SafeRent argued that it merely provided screening scores and that landlords or property managers made the ultimate decisions.

However, Louis’ attorneys, supported by the U.S. Department of Justice (DOJ), contended that SafeRent’s algorithm played a direct role in determining applicants’ access to housing. The federal judge overseeing the case agreed, allowing the lawsuit to proceed despite SafeRent’s motion to dismiss.

Under the terms of the settlement, SafeRent agreed to:

Pay over $2.2 million to impacted plaintiffs.

Remove its scoring feature in certain tenant screening scenarios, particularly when applicants rely on housing vouchers.

Validate any future screening algorithms through a third party, with oversight by the plaintiffs.

SafeRent admitted no wrongdoing and maintains that its practices comply with the law. “Litigation is time-consuming and expensive,” the company said in a statement. Even so, the settlement is widely seen as a landmark victory for AI accountability in housing.

The implications of this settlement extend beyond housing. Algorithms now influence a wide range of life-altering decisions, from job applications to loan approvals. Yet regulation of these systems remains sparse. Experts warn that without oversight, AI could exacerbate existing inequities.

“State lawmakers have proposed aggressive regulations for these types of AI systems, but the proposals have largely failed to get enough support,” noted Kaplan. “That means lawsuits like Louis’ are starting to lay the groundwork for AI accountability.”

SafeRent’s case underscores the challenges of holding AI systems accountable. Unlike humans, algorithms lack transparency and empathy. When Louis’ application was rejected, the management company refused to consider her unique circumstances or her long history as a reliable tenant.

“Algorithms don’t know the whole story,” said Louis. “They don’t know you, and they don’t care to know you.”

The settlement sets a precedent that companies deploying AI tools can be held responsible for discriminatory outcomes, even if those outcomes are unintended. It also highlights the importance of considering the social context in which algorithms operate.

Despite the legal victory, Louis’ struggles are far from over. After being denied by SafeRent’s system, her son helped her find an apartment on Facebook Marketplace. The unit, $200 more expensive and in a less desirable area, stretched her budget even with a housing voucher.

“I’m not optimistic that I’m going to catch a break, but I have to keep on keeping, that’s it,” said Louis, who cares for her granddaughter. “I have too many people who rely on me.”

Louis’ story is emblematic of the systemic challenges facing low-income renters and communities of color. It also highlights the pressing need for broader reforms to ensure that technology serves all people equitably.

As algorithms become entrenched in decision-making processes, advocates are calling for more robust oversight. Some have proposed legislation requiring companies to audit their AI systems for bias, while others emphasize the need for transparency in how these tools operate.

“Louis’ case is just the beginning,” said Webber. “It’s a wake-up call that we can’t blindly trust these systems to be fair.”
