Credit scores and the biases behind them
Although your credit score is generated from data about you, the design of the algorithms behind the score is often based on broader financial trends.
“A lot of times credit scores are built on the history of all sorts of other aggregate data, so people who look like you,” said Safiya Noble, professor of Gender Studies and African American Studies at the University of California, Los Angeles.
Noble, who wrote the book “Algorithms of Oppression,” investigates how algorithms can perpetuate racism and gender bias.
“And that’s where we start to get in trouble,” Noble said. “If you are part of a group that has traditionally been denied credit or offered predatory products, then your profile may, in fact, look more like those people and you will be knocked out.”
As a result, consumers may not have access to a loan, mortgage, or better insurance rates.
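To see the mechanics Noble describes, consider a toy sketch, not any real scoring model, in which an applicant is scored by the repayment history of the most similar past profiles. The features, records, and nearest-neighbor approach below are all invented for illustration:

```python
from math import dist

# (income in $k, debt-to-income ratio) -> repaid?  Invented records.
history = [
    ((85, 0.20), True), ((90, 0.15), True), ((30, 0.55), False),
    ((28, 0.60), False), ((75, 0.25), True), ((32, 0.50), False),
]

def score(applicant, k=3):
    """Average repayment outcome of the k most similar past profiles."""
    nearest = sorted(history, key=lambda rec: dist(rec[0], applicant))[:k]
    return sum(repaid for _, repaid in nearest) / k

# An applicant whose profile "looks like" a historically denied group
# inherits that group's outcomes before any individual behavior is seen.
print(score((31, 0.52)))   # 0.0 -- nearest neighbors all defaulted
print(score((88, 0.18)))   # 1.0 -- nearest neighbors all repaid
```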
David Silberman of the Center for Responsible Lending said it was part of a bigger problem.
“Credit scores reflect a lot of the history of discrimination in the country,” he said.
Silberman, who spent a decade at the Consumer Financial Protection Bureau and years in the financial services industry, described how algorithms can reflect privilege, or the lack of it.
“If someone starts out with no wealth, with limited income prospects, the types of credit you can get will be affected,” he said.
For example, payday lenders are concentrated in African-American and Latino neighborhoods and tend to offer loans on less favorable terms, so borrowers who use these lenders may be more likely to default.
“Your ability to repay that credit is going to be affected, and then that’s going to end up in credit scores,” Silberman said.
According to payment processor Shift, white Americans have an average FICO score of 734 – a relatively good score for most financial products. For Black Americans, the average is 677. A lower score can mean higher interest rates or a declined loan.
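To put that gap in concrete terms, here is a rough back-of-the-envelope illustration. The loan amount and interest rates below are hypothetical, not from the article; the amortization formula itself is standard:

```python
def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Standard fixed-rate mortgage amortization formula."""
    r = annual_rate / 12            # monthly interest rate
    n = years * 12                  # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

loan = 300_000                                   # hypothetical loan amount
at_higher_score = monthly_payment(loan, 0.065)   # rate a ~734 score might get (assumed)
at_lower_score = monthly_payment(loan, 0.071)    # rate a ~677 score might get (assumed)

print(f"Monthly payment at 6.5%: ${at_higher_score:,.2f}")
print(f"Monthly payment at 7.1%: ${at_lower_score:,.2f}")
print(f"Extra paid over 30 years: ${(at_lower_score - at_higher_score) * 360:,.2f}")
```

On these assumed numbers, a roughly half-point rate difference adds up to tens of thousands of dollars over the life of the loan.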
Since even accurate historical data can produce biased algorithms, many researchers and companies are looking for new ways to determine creditworthiness, but these alternatives carry their own risks.
Nicholas Schmidt, CEO of SolasAI, audits algorithms for disparate impact. He said bias can “seep in” anywhere.
“Most people talk about bias in the data. And that’s kind of an obvious thing,” he said.
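The article doesn’t describe SolasAI’s actual methodology, but a common first-pass disparate-impact test in fair-lending work is the “four-fifths rule”: compare each group’s approval rate to the most-favored group’s and flag ratios below 0.8. A minimal sketch, with invented approval counts:

```python
# Minimal four-fifths-rule sketch. Approval counts are invented;
# this is not SolasAI's actual method.

applications = {
    # group label: (total applicants, approved)
    "group_a": (1000, 620),
    "group_b": (1000, 450),
}

rates = {g: approved / total for g, (total, approved) in applications.items()}
best_rate = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best_rate                  # adverse impact ratio vs. most-favored group
    flag = "FLAG" if ratio < 0.8 else "ok"    # four-fifths threshold
    print(f"{group}: approval {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")
```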
One example he shared was a lender’s algorithm for predicting which credit card holders would fail to pay their debt. He said the single best predictor was how often consumers shopped at convenience stores.
Whether attached to gas stations or strip malls, or stand-alone shops like Patron Convenience Store in southeast D.C., these stores can get busy on a Wednesday morning with people buying lottery tickets and snacks.
“And I thought about it. What’s in a convenience store – cheap beer, cigarettes, bad candy and lottery tickets?” Schmidt said. “These are probably all quite well correlated with risky behavior, which is probably well correlated with poor credit card performance.”
But then Schmidt and his team thought about it some more and realized there was a gaping hole in that analysis: food deserts. These are areas where residents are low income and do not have easy access to supermarkets or large grocery stores, according to the United States Department of Agriculture.
In 2021, about 13.5 million people lived in America’s food deserts — and many of them shopped at convenience stores.
Ekram Aman is a cashier at Penn Way Market, a strip-mall convenience store located in a food desert in southeast Washington, D.C.

She said most of her customers use Electronic Benefits Transfer, a tool for accessing government food assistance programs, to buy groceries.
“They say because it’s convenient for them. And especially for people who don’t drive, it’s very convenient,” Aman said.
Most customers are from the neighborhood and walk to Penn Way, she said. Sometimes they send their children to pick up food for dinner or household items from the shelves of the cramped store.
SolasAI’s Schmidt said that relying on data generated this way is a form of discrimination that can seep in when an algorithm lumps all of these shoppers together.
“What you’re going to do is capture the risky behavior of white people in the suburbs, going to convenience stores and buying lottery tickets and bad candy and bad beer,” he said.
But, Schmidt said, you’re also going to capture creditworthy people: low-income people and people of color living in food deserts, as well as wealthier people in dense cities who shop at bodegas.
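One way a modeling team might probe for this kind of proxy effect is to check whether a candidate feature separates people by geography or demographics as strongly as it separates good borrowers from bad ones. A rough sketch with entirely synthetic data, assuming a made-up visit pattern:

```python
import random

random.seed(0)

def weekly_visits(in_food_desert: bool) -> float:
    # Assumed pattern for illustration: food-desert residents rely on
    # convenience stores for groceries, so they visit more often
    # regardless of how reliably they repay debt.
    base = 8.0 if in_food_desert else 3.0
    return max(0.0, random.gauss(base, 2.0))

population = [(weekly_visits(fd), fd) for fd in [True] * 500 + [False] * 500]

desert = [v for v, fd in population if fd]
elsewhere = [v for v, fd in population if not fd]

print(f"Mean visits, food desert: {sum(desert) / len(desert):.1f}")
print(f"Mean visits, elsewhere:   {sum(elsewhere) / len(elsewhere):.1f}")
# A large gap means the variable doubles as a geographic and demographic
# proxy: a model that penalizes frequent visits penalizes food-desert
# residents as a class, not just risky borrowers.
```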
Schmidt isn’t sure whether this particular variable made it into the lender’s final model, since financial services firms often adjust their models to account for built-in biases.
But, said David Silberman of the Center for Responsible Lending, such adjustments can only do so much.
“There may be adjustments that, at the margin, will bring more people into the system or give a more complete picture of their creditworthiness by looking at a richer data set,” he said. “But I think it’s marginal. It will not address the fundamental issues of inequality that we face.”