The New York Department of Financial Services is opening a probe into the algorithms the new Apple Card uses to determine credit limits, after a series of tweets from a technology entrepreneur in recent days alleged gender discrimination.
A series of posts from David Heinemeier Hansson starting Thursday criticized the Apple Card for giving him 20 times the credit limit that his wife got. Hansson is known for creating the popular programming tool Ruby on Rails. He’s a partner at Basecamp, a web-based software development firm.
Hansson didn’t disclose any income for himself or his wife but said they filed joint tax returns and that his wife has a better credit score than he does.
“The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex,” said a spokesman for Linda Lacewell, the superintendent of the New York Department of Financial Services, according to a Bloomberg report. “Any algorithm that, intentionally or not, results in discriminatory treatment of women or any other protected class of people violates New York law.”
“Our credit decisions are based on a customer’s creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law,” said Andrew Williams, a spokesman for Goldman Sachs, which issues the Apple Card.
Hansson said Goldman’s response doesn’t explain what happened after he started airing his issues on social media.
“As soon as this became a PR issue, they immediately bumped up her credit limit without asking for any additional documentation,” he said in an interview. “My belief isn’t there was some nefarious person wanting to discriminate. But that doesn’t matter. How do you know there isn’t an issue with the machine-learning algo when no one can explain how this decision was made?”
Hansson said his posts had led to an internal review and that he was hopeful it would spark a conversation about black-box algorithms and the inherent biases in those systems.
The use of algorithms by lenders in credit decisions has drawn scrutiny in Congress. In June, the House Financial Services Committee heard about examples of algorithmic decision-making where researchers have found instances of bias targeting specific groups even when there was no intent to discriminate.
“Goldman and Apple are delegating credit assessment to a black box,” Hansson said. “It’s not a gender-discrimination intent but it is a gender-discrimination outcome.”
This is the second such action from the New York regulator in recent weeks. NY DFS opened a probe against health care giant UnitedHealth Group Inc. after a study found an algorithm favored white patients over black patients.
The tweets, many of which contain profanity, went viral.