Two years ago, Mary Louis applied to rent an apartment at Granada Highlands in Malden, Massachusetts. She liked that the unit had two full bathrooms and that the property had a swimming pool. But the landlord denied her application, allegedly because of a score assigned to her by SafeRent’s tenant screening algorithm.
Louis responded with references documenting 16 years of on-time rent payments, but to no avail. Instead, she rented a different apartment, which cost $200 a month more, in an area with a higher crime rate. A class action filed last May by Louis and others alleges that SafeRent’s scores, based in part on credit report information, discriminated against Black and Hispanic renters in violation of the Fair Housing Act. That groundbreaking law, which prohibits discrimination based on race, disability, religion, or national origin, was passed by Congress in 1968, a week after the assassination of Martin Luther King Jr.
The case is still pending, but last week the US Department of Justice used a brief filed with the court to send a warning to landlords and to the makers of tenant screening algorithms. SafeRent had argued that algorithms used to screen tenants are not covered by the Fair Housing Act because its scores only advise landlords rather than make decisions. The Justice Department’s brief, filed jointly with the Department of Housing and Urban Development, dismisses that claim, saying the act and associated case law leave no ambiguity.
“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not exempt from liability when their actions disproportionately deny people of color access to fair housing opportunities,” Kristen Clarke, head of the Justice Department’s Civil Rights Division, said in a statement.
As in many areas of business and government, algorithms that assign scores to people have become more prevalent in housing. But while they claim to improve efficiency or identify the “best tenants,” as SafeRent marketing materials suggest, tenant screening algorithms can help perpetuate housing discrimination that has persisted despite decades of civil rights law. A 2021 US National Bureau of Economic Research study that used bots with names associated with different groups to contact more than 8,000 landlords found significant discrimination against renters of color, particularly African Americans.
“It’s a relief that this is being taken seriously – there is an understanding that algorithms are not inherently neutral or objective and deserve the same level of scrutiny as human decision makers,” says Michele Gilman, a law professor at the University of Baltimore and a former civil rights attorney at the Department of Justice. “The fact that the Department of Justice is involved in this is a big step for me.”
A 2020 investigation by The Markup and ProPublica found that tenant screening algorithms often face obstacles such as misidentification, especially for people of color with common last names. A ProPublica evaluation last year of algorithms made by Texas-based RealPage found that they can drive rents higher.