Credit scores have a long history of prejudice. “Most changes in how credit scores are calculated over the years — including the shift from human assessment to computer calculations, and most recently to artificial intelligence — have come out of a desire to make the scores more equitable, but credit companies have failed to remove bias, on the basis of race or gender, for example, from their system,” writes Rose Eveleth via Motherboard.
While credit companies have tried to reduce bias with machine learning and “alternative credit,” which draws on data not normally included in a credit score, such as your sexual orientation or political beliefs, to gauge how trustworthy someone might be, Eveleth says that “introducing this ‘non-traditional’ information to credit scores runs the risk of making them even more biased than they already are, eroding nearly 150 years of effort to eliminate unfairness in the system.” From the report: Biases in AI can affect not just individuals with credit scores, but also those without any credit at all, as non-traditional data points are used to draw new borrowers in. There is still a whole swath of people in the United States known as the “unbanked” or “credit invisibles.” They have too little credit history to generate a traditional credit score, which makes it challenging for them to get loans, apartments, and sometimes even jobs. According to a 2015 Consumer Financial Protection Bureau study, 45 million Americans fall into the category of credit invisible or unscoreable — that’s almost 20 percent of the adult population. And here again we can see a racial divide: 27 percent of Black and Hispanic adults are credit invisible or unscoreable (PDF), compared to just 16 percent of white adults.
To bring these “invisible” consumers into the credit score fold, companies have proposed alternative credit. FICO recently released FICO XD, which includes payment data from TV or cable accounts, utilities, cell phones, and landlines. Other companies have proposed using social media posts, job history, educational history, and even restaurant reviews or business check-ins. Lenders say that alternative data is a benefit to those who have been discriminated against and excluded from banking. No credit? Bad credit? That doesn’t mean you’re not trustworthy, they say, and we can mine your alternative data and give you a loan anyway. But critics say that alternative data looks a lot like old-school surveillance: letting a company have access to everything from your phone records to your search history means giving up all kinds of sensitive data in the name of credit. Experts worry that the push to use alternative data might lead to a situation similar to the subprime mortgage crisis, in which marginalized communities are offered predatory loans that wind up tanking their credit scores and economic stability.