Organization: The Markup
Award: Excellence in Social Justice Reporting, Portfolio
Program: 2022
Entry Links: Link 1, Link 2, Link 3
Discriminatory lending practices have been well documented over the years. Lenders have long responded to researchers and journalists that reporters lack the relevant data to draw firm conclusions and that, if they had it, the disparities would disappear.
The Markup’s investigation “Denied” debunks lenders’ argument that including certain financial characteristics would eliminate apparent bias in mortgage approval decisions. We analyzed more than two million mortgage applications and found that people of color were denied at higher rates than similarly qualified White applicants. The analysis controlled for 17 variables, including debt-to-income and combined loan-to-value ratios. Lenders had previously said that those specific variables, which were not public at the time, would explain away what appeared to be racial disparities in lending. But when we included the newly released variables, we found that wasn’t true.
Instead, lenders were 40 percent more likely to turn down Latino applicants, 50 percent more likely to deny Asian/Pacific Islander applicants, 70 percent more likely to deny Native American applicants, and 80 percent more likely to reject Black applicants than similarly qualified White applicants. Those are national rates; we also found disparities in 89 metro areas across the country.
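To make the method concrete, here is a minimal sketch of how a denial-disparity analysis like this is commonly run: a logistic regression that estimates the odds of denial by race while holding financial characteristics constant. It assumes the pandas and statsmodels libraries, and the file name and column names (hmda_applications.csv, denied, race, dti_ratio, combined_ltv, loan_amount, income) are hypothetical stand-ins, not The Markup’s actual data or code.

```python
# A minimal sketch, assuming pandas/statsmodels and a hypothetical
# HMDA-style CSV; this is NOT The Markup's actual code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per application. Column names are placeholders: 'denied' is
# 1 if the application was denied, 0 otherwise; the controls stand in
# for the 17 variables mentioned above, such as debt-to-income
# (dti_ratio) and combined loan-to-value (combined_ltv).
df = pd.read_csv("hmda_applications.csv")

# Logistic regression of the denial outcome on race/ethnicity, holding
# the financial controls constant. White applicants are the reference
# group, so each race coefficient is a contrast against similarly
# qualified White applicants.
model = smf.logit(
    "denied ~ C(race, Treatment(reference='White'))"
    " + dti_ratio + combined_ltv + loan_amount + income",
    data=df,
).fit()

# Exponentiating a coefficient gives an odds ratio: a value of 1.8 for
# Black applicants is the kind of result behind an "80 percent more
# likely to be denied" framing.
print(np.exp(model.params))
```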
Our story explained that these disparities persist, more than 50 years after the Fair Housing Act outlawed the racist housing policy known as redlining, in part because of factors considered by algorithms used in the mortgage approval process.
The investigation shows that Fannie Mae and Freddie Mac, two quasi-governmental companies that de facto set the standard for mortgage lending, rely on algorithms that can disproportionately harm people of color.
The two companies require lenders to use a credit-scoring algorithm called “Classic FICO” to determine whether applicants qualify for a conventional loan. But that credit-scoring model was built using data from the 1990s and is more than 15 years old. It’s considered detrimental to people of color because it rewards traditional and mainstream forms of credit, from which people of color have historically been shut out.
The most important algorithms influencing mortgage decisions are automated underwriting systems developed by Fannie and Freddie. The pair buy half of all mortgages in America, so most lenders use these algorithms for approval decisions. Ultimately, no one outside Fannie and Freddie knows exactly what factors their underwriting algorithms use or how they are weighted; not even the companies’ government regulator, the Federal Housing Finance Agency, knows how they work.
In the story’s sidebar, we identified specific mortgage lenders with the most egregious disparities. When we investigated their backgrounds, we found that all of them had faced criticism from at least one government agency in recent years for their business practices. Another pattern: Three of these lenders are affiliated with the nation’s largest home builders. Each lender in our sidebar was at least 100 percent more likely to deny Black and Latino applicants than similarly qualified White ones, and these lenders concentrated their loans in upper- and middle-class neighborhoods.