Human biases are baked into algorithms. Now what?

Marketplace.org

Algorithms, the computer programs that decide so many things about our lives these days, work with the human data we feed them. That data, of course, can be biased based on race, gender and other factors. 

Recently, regulators began investigating the new Apple Card and Apple’s partner, Goldman Sachs, after several users reported that in married households, men were given higher credit limits than women — even if the women had higher credit scores. 

I spoke with Safiya Noble, an associate professor at UCLA who wrote a book about biased algorithms. She said that centuries in which women had little financial independence or freedom are reflected in the data that algorithms use to evaluate credit. The following is an edited transcript of our conversation.

Safiya Noble: You’d see that history play out across hundreds of thousands of transactions a month or more. That starts to become the so-called truth, or the baseline of data, that gets collected into these systems. This is one of the reasons why women still have a very difficult time, and I think the Apple credit card was a perfect example of the flawed logics. The unfortunate part is this happens a lot to working-class people, people who aren’t rich and who may or may not even know what’s happening to them.