The Discriminatory Impacts of AI-Powered Tenant Screening Programs
July 12, 2025 by Lauren Karpinski
Landlords are increasingly using automated screening programs to evaluate prospective tenants, raising troubling concerns about how these programs’ algorithmic biases discriminate against people of color and hinder their ability to access housing. Tenant screening scores create a veneer of objectivity, touted by landlords as a tool to efficiently grade rental applicants free of bias. But evidence shows that these programs, which run checks on applicants’ credit scores, eviction records, and criminal backgrounds, routinely return incorrect, outdated, or misleading information that landlords use to disproportionately deny applications to Black and Latino renters. Ultimately, these tools worsen housing discrimination and exacerbate racial disparities in the rental application process.
A 2022 report from the Consumer Financial Protection Bureau highlights problems persisting in the tenant background check industry, outlining how errors in these programs lead to higher costs and barriers to housing.[1] The report found that “[t]oo often, these background checks—which purport to contain valuable tenant background information—are filled with largely unsubstantiated information that holds inconclusive accuracy or predictive value.”[2] Specifically, the report noted that landlords routinely deny rental housing because they receive reports with (1) incriminating information that belongs to someone other than the applicant;[3] (2) outdated information; and (3) inaccurate or misleading details about arrests, criminal records, and eviction records that are not corrected or removed from the screening reports.[4] These programs generally do not allow applicants to correct mistakes or provide context for information that turns up in the report.[5] Many landlords do not properly inform applicants of their right to dispute information in reports, as required by the Fair Credit Reporting Act.[6] Without such notices, applicants are unaware of and unable to address errors for future applications.[7]
In addition to reporting false information, the criteria tenant screening services use to evaluate prospective tenants disproportionately impact marginalized communities. Many of these programs look at applicants’ criminal records, but reliance on this data as an indicator of tenant quality is problematic for several reasons.[8] First, some screening tools rely on arrest records, which are not indicative of whether a person actually committed a crime.[9] Due to racist policing practices, racial minorities are disproportionately arrested compared to the general public, so using arrest records to screen out housing applicants disparately impacts these groups.[10] Further, programs that use criminal history to deny applicants may sweep in criminal activity that does not bear on a person’s ability to meet their tenant obligations.[11] A recent case on appeal out of Connecticut demonstrates this issue. Plaintiffs sued a rental property company over its use of CrimSAFE—a service that allows landlords to search criminal-records databases for information about prospective tenants—alleging that the service’s use of criminal history as a rental application criterion was discriminatory.[12] CrimSAFE combines certain “traffic accidents,” which it “concedes ha[ve] no relationship to suitability for tenancy,” into the same category as “vandalism and property damage.”[13] As a result, “a housing provider cannot exclude vandals without also excluding people involved in traffic accidents.”[14] Even if landlords want to mitigate the harm criminal background checks may have on a tenant’s application, adjusting the algorithm to do so may not be possible. CrimSAFE offers only limited filters to categorize criminal activity by type, severity, and disposition, inevitably producing a blanket report that combines varying levels of offenses. The program then generates a general recommendation on the application that landlords can choose to accept or reject.[15]
There is substantial evidence demonstrating how tenant screening programs that rely on credit scores to deny housing applications have a disproportionate impact on minorities. Communities of color are more likely to have lower credit scores due to the economic consequences of the nation’s long history of racial discrimination. Redlining, employment discrimination, and debt collection—practices that have disproportionately harmed minorities—establish the foundation of the data in credit reports.[16] Structural racism has locked these groups out of wealth-building opportunities, and standard credit scores do not account for data points that could make them more racially equitable, such as cash flow, rent and utility payments, or rental payment history.[17] Further, tenant advocates warn that the credit score algorithms that some tenant screening programs use are not scrutinized by regulators and are not predictive of a tenant’s likelihood of paying rent.[18] A 2023 report by the National Consumer Law Center found no empirical or scientific evidence showing that credit reports and scores accurately predict a successful tenancy.[19] Finally, credit reporting is riddled with high error rates, and consumers often have no legal remedy to challenge egregious mistakes, leaving them unable to obtain housing as a result.[20]
The Markup—a nonprofit newsroom specializing in covering the impact of technology on society—also published an investigation revealing the hidden biases in mortgage-approval algorithms.[21] The investigation found that lenders in 2019 were more likely to deny home loans to people of color than to White people with similar financial characteristics. Holding 17 different factors steady in a complex statistical analysis of more than two million conventional mortgage applications for home purchases, investigators discovered that lenders were 40% more likely to deny Latino applicants for loans, 50% more likely to deny Asian/Pacific Islander applicants, 70% more likely to deny Native American applicants, and 80% more likely to deny Black applicants than similar White applicants. In every case, the prospective borrowers of color looked almost the same on paper as the White applicants, except for their race.[22]
Finally, there is data showing how landlords defer to AI-generated screening scores rather than exercising their own judgment. A behavioral study using simulated screening reports found that landlords relied primarily on the scores returned, rather than the underlying data—even though the underlying data often contained critical context, such as when a criminal charge or eviction lawsuit had ultimately been dismissed.[23] Because AI-powered screening programs have high error rates and use overbroad categories for criminal background checks, these programs facilitate a landlord’s decision to deny an applicant without inspecting the report for any nuance, context, or mistakes.[24] Small-scale landlords typically have room for negotiation when deciding whether to accept or deny an applicant, but the decisive scoring in these reports reduces landlords’ inclination to negotiate and limits individualized consideration in the screening process.[25] In short, algorithms are producing screening scores that disparately harm minority applicants, and landlords tend to take those recommendations at face value.
In the long run, AI may make it harder for humans to overcorrect for algorithmic biases. A psychology study highlighted in Scientific American found that people who interact with these automated systems could be unconsciously incorporating the skew they encounter into their own future decision-making.[26] Crucially, the study demonstrates that bias introduced to a user by an AI model can persist in a person’s behavior—even after they stop using the AI program.[27] If AI-powered tenant screening programs continue to be trained with outdated, incorrect, or discriminatory data, they can pass those biases onto humans and make it harder for applicants to refute the information in their screening reports or convince landlords to rent to them.
As AI-powered tenant screening programs evolve and gain more influence over landlord decision-making and access to housing, it is critical to determine how these technologies disparately impact minority applicants to correct for these biases. In 2023, the MIT Department of Urban Studies and Planning submitted a comment to regulators “recommend[ing] that regulators establish a federal moratorium on tenant screening services until such services can be proven safe, fair, and non-discriminatory.”[28] Policymakers should explore options that will place necessary guardrails on these services to avoid perpetuating further discrimination in the housing industry.
[1] Consumer Fin. Prot. Bureau, Tenant Background Checks Market (Nov. 2022), https://files.consumerfinance.gov/f/documents/cfpb_tenant-background-checks-market_report_2022-11.pdf.
[2] Press Release, Consumer Fin. Prot. Bureau, CFPB Reports Highlight Problems with Tenant Background Checks (Nov. 15, 2022), https://www.consumerfinance.gov/about-us/newsroom/cfpb-reports-highlight-problems-with-tenant-background-checks/.
[3] Consumer Fin. Prot. Bureau, Consumer Snapshot: Tenant background checks 11 (Nov. 2022), https://www.consumerfinance.gov/about-us/newsroom/cfpb-reports-highlight-problems-with-tenant-background-checks/ (finding that the risk of error from name-only matching is likely greater for Hispanic, Asian, and Black individuals because there is less last-name diversity in those populations than among the non-Hispanic White population).
[4] Id. at 3.
[5] Id.
[6] Id. at 24.
[7] Id. at 18.
[8] Marie Claire Tran-Leung, Shriver Nat’l Ctr. on Poverty L., When Discretion Means Denial: A National Perspective on Criminal Records Barriers to Federally Subsidized Housing 16 (Feb. 2015), https://www.povertylaw.org/wp-content/uploads/2019/09/WDMD-final.pdf.
[9] Id.
[10] Id. at VII.
[11] Id. at 24.
[12] Connecticut Fair Hous. Ctr. v. CoreLogic Rental Prop. Sols., LLC, No. 3:18-CV-705-VLB, 2023 WL 4669482, at *2–3 (D. Conn. July 20, 2023).
[13] Brief for the United States as Amicus Curiae in Support of Plaintiffs-Appellants at 7, Connecticut Fair Hous. Ctr., 2023 WL 4669482.
[14] Id.
[15] Id. at 3–4.
[16] Id. at 59.
[17] Abby Boshart, Urban Institute Initiative, How Tenant Screening Services Disproportionately Exclude Renters of Color from Housing (Dec. 2022), https://housingmatters.urban.org/articles/how-tenant-screening-services-disproportionately-exclude-renters-color-housing.
[18] Erin Smith & Heather Vogell, How Your Shadow Credit Score Could Decide Whether You Get an Apartment, ProPublica (Mar. 22, 2022), https://www.propublica.org/article/how-your-shadow-credit-score-could-decide-whether-you-get-an-apartment.
[19] Chi Chi Wu et al., Nat’l Consumer L. Ctr., Digital Denials: How Abuse, Bias, and Lack of Transparency in Tenant Screening Harm Renters 53 (Sept. 2023), https://www.nclc.org/wp-content/uploads/2023/09/202309_Report_Digital-Denials.pdf.
[20] Id. at 58.
[21] Emmanuel Martinez & Lauren Kirchner, The Secret Bias Hidden in Mortgage-Approval Algorithms, The Markup (Aug. 25, 2021), https://themarkup.org/denied/2021/08/25/the-secret-bias-hidden-in-mortgage-approval-algorithms.
[22] Id.
[23] Wonyoung So, Which Information Matters? Measuring Landlord Assessment of Tenant Screening Reports, 33 Housing Policy Debate 1485, 1502–03 (2023), https://doi.org/10.1080/10511482.2022.2113815.
[24] Id.
[25] Id. at 1503.
[26] Lauren Leffer, Humans Absorb Bias from AI—And Keep It after They Stop Using the Algorithm, Scientific American (Oct. 23, 2023), https://www.scientificamerican.com/article/humans-absorb-bias-from-ai-and-keep-it-after-they-stop-using-the-algorithm; Lucía Vicente & Helena Matute, Humans Inherit Artificial Intelligence Biases, 13 Scientific Reports 15737 (2023), https://doi.org/10.1038/s41598-023-42384-8.
[27] Id.
[28] Rebecca Burns, Artificial Intelligence Is Driving Discrimination in the Housing Market, Jacobin (June 27, 2023), https://jacobin.com/2023/06/artificial-intelligence-corporate-landlords-tenants-screening-crime-racism (citing MIT Dep’t of Urb. Stud. and Plan., Comment Letter on Tenant Screening Request for Information by the Fed. Trade Comm’n and Consumer Fin. Prot. Bureau (May 30, 2023)).