Computer models can impact housing access, result in denials
A new report from the National Consumer Law Center (NCLC) examines the ways tenant screening computer models harm prospective renters. These models can result in denials or make it difficult for applicants to gain approval for housing, and the report shows that the negative impact is disproportionately borne by Black and Hispanic renters.
“Tenant screening scores and recommendations create a misleading veneer of objectivity while concealing underlying racial disparities,” said Ariel Nelson, staff attorney at the National Consumer Law Center. “Landlords frequently make leasing decisions based solely on these tenant screening scores and recommendations, and our research reveals that they are unlikely to hear disputes or to consider mitigating factors.”
The report criticizes the use of artificial intelligence (AI) in tenant screening models and notes that these models, which rely on credit reporting data, contain biases that are harmful to some groups.
“There’s absolutely no evidence that credit scores have value in predicting whether a renter will pay their rent. Credit scores are designed for one thing only - to predict whether a consumer will be late on a loan,” said Chi Chi Wu, senior attorney at the National Consumer Law Center. “And there are huge racial disparities in credit scoring, which means their use makes it harder for Black and Latino/Hispanic renters to obtain rental housing.”
A previous report noted that inaccurate information and certain outdated or even expunged court records can still appear on tenant screening reports and result in the denial of housing.
NCLC said that even when tenant screening reports contain errors, landlords tend to refuse to hear disputes or to take action to mitigate the harms those errors cause.