Tenant Screening Software Faces Legal Reckoning Over AI Bias and Harm to Renters

Kareem Saleh

Tenant screening software is facing a reckoning.

A newly filed lawsuit accuses a tenant screening company of failing to implement basic AI safety controls, resulting in widespread harm to renters.

According to the complaint, the company’s algorithm flagged people for criminal records they didn’t have and misclassified applicants in ways that disproportionately led to wrongful denials of housing for low-income and minority renters. These errors exacerbate existing inequalities in the rental market.

Many of these issues could have been prevented had the company implemented algorithmic fairness testing and de-biasing techniques.
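For context, one widely used fairness test is the adverse impact ratio, sometimes called the "four-fifths rule," which compares each group's approval rate to that of the most-approved group. Below is a minimal sketch in Python using entirely hypothetical data and group labels; it is an illustration of the general technique, not the testing methodology at issue in the case.

```python
# Minimal sketch of one common fairness test: the adverse impact ratio
# (the "four-fifths rule"). All data and group labels are hypothetical.

from collections import defaultdict

def adverse_impact_ratio(decisions):
    """decisions: list of (group, approved) tuples, approved is a bool.
    Returns each group's approval rate divided by the highest group's rate."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)
    rates = {g: approved[g] / total[g] for g in total}
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Hypothetical screening outcomes: (applicant group, application approved)
sample = [("A", True)] * 80 + [("A", False)] * 20 \
       + [("B", True)] * 55 + [("B", False)] * 45

for group, ratio in adverse_impact_ratio(sample).items():
    flag = "OK" if ratio >= 0.8 else "POSSIBLE DISPARATE IMPACT"
    print(f"group {group}: ratio {ratio:.2f} -> {flag}")
```

In this example, group B's approval rate is roughly 69% of group A's, falling below the 0.8 threshold and flagging a potential disparity worth investigating before the model is deployed.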

As AI increasingly powers critical decisions in our daily lives, failures like this will spark more legal challenges.

Make sure your AI is making headlines for innovation, not litigation — FairPlay can help.

Hat tip to Andrew W. Grant for bringing this case to my attention.

