Housing Discrimination Trial Against Tenant-Screening Firm Begins In Connecticut


A significant discrimination case over tenant-screening algorithms began today in Connecticut. The bench trial involves third-party tenant-screening firm CoreLogic Rental Property Solutions, now SafeRent Solutions, which is being sued for discrimination under the Fair Housing Act.

The trial is being held before Judge Bryant in Courtroom Two, 450 Main St., Hartford, CT. The court calendar can be found here.

Originally filed in April 2018 with the U.S. District Court, District of Connecticut, the case, Connecticut Fair Housing Center, et al. v. CoreLogic Rental Property Solutions (Case No. 3:18-cv-00705), came into the national spotlight amid growing concerns that tenant-screening algorithms discriminate against people of color. This litigation seeks to ensure that CoreLogic RPS and all tenant-screening companies follow fair housing requirements when they functionally make rental decisions on behalf of landlords and use criminal records as a part of the rental criteria.

The following statement was given by legal representatives for plaintiffs Carmen Arroyo and the Connecticut Fair Housing Center: Christine E. Webber, Partner at Cohen Milstein Sellers & Toll; Shamus Roller, Executive Director of the National Housing Law Project; and Erin Kemple, Executive Director of the Connecticut Fair Housing Center.

“Tenant-screening technologies often lack a sufficient review process to ensure fair housing standards, yet are increasingly common as landlords look to third-party companies to evaluate and streamline their rental application process. CoreLogic RPS’s analysis and ultimate denial of Ms. Arroyo’s son’s rental application illustrates how the algorithms these technologies use are discriminatory. We look forward to proving our case in court.

“Amid the systemic racism that exists in our criminal justice system, where people of color are disproportionately burdened by the law, tenant-screening algorithms that deny applicants based on past criminal records without any limitation or individualized consideration are unjust in deciding a person’s right to housing. This is a significant limitation of algorithmic technologies that frequently discriminate against Black and Latino applicants.

“We hope the housing industry is paying attention to this trial, as it will set an important precedent for tenant-screening technologies moving forward.”

Additional background on the case can be found here and here.
