Algorithms are increasingly prevalent in everything we do, from getting insurance to applying for jobs (if the algorithm doesn’t pick your application out of the pile, good luck). While these computer calculations may seem like objective measures of whether someone is qualified for a certain position or home loan, human choices and human biases are always built into them.
Cathy O’Neil, a data scientist and founder of ORCAA Algorithmic Auditing, had a message for the audience at the 2017 TED talks: “We need to demand accountability from our algorithmic overlords.”
During one of the most popular talks of the Wednesday evening TED sessions, O’Neil laid out the various ways that algorithms can go wrong.
She told the story of the value-added algorithmic model for evaluating New York City teachers, which has penalized seemingly excellent educators, as The New York Times reported. And O’Neil brought up the ProPublica investigations into algorithmic racial discrimination, in which the news organization found algorithms discriminating in prison sentencing, car insurance premiums, and elsewhere.
O’Neil also posited a thought experiment: What if Fox News decided to get a fresh start after its recent sexual harassment scandals by using an algorithm in hiring? And what if that algorithm relied on the last 20 years of hiring data to figure out who might be successful in the future? “It would filter out women because they don’t look like people who were successful in the past,” she speculated.
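O’Neil’s thought experiment can be made concrete with a toy sketch. Everything below is invented for illustration (the data, the 0.5 threshold, the scoring rule); it simply shows how a screen that rewards resemblance to historically “successful” hires reproduces the bias in that history.

```python
# Hypothetical illustration of O'Neil's Fox News thought experiment:
# a hiring screen trained only on past outcomes reproduces them.
# All names and numbers here are invented for the sketch.

historical_hires = [
    # (gender, labeled_successful) -- 20 years of biased outcome data
    ("M", True), ("M", True), ("M", False), ("M", True),
    ("F", False), ("M", True), ("F", False), ("M", True),
]

def past_success_rate(gender):
    """Fraction of past hires of this gender labeled 'successful'."""
    group = [ok for g, ok in historical_hires if g == gender]
    return sum(group) / len(group) if group else 0.0

def naive_screen(applicant_gender, threshold=0.5):
    """Advance applicants who 'look like' historically successful hires."""
    return past_success_rate(applicant_gender) >= threshold

print(naive_screen("M"))  # True  -- men resemble past "successes"
print(naive_screen("F"))  # False -- women filtered out, as O'Neil warns
```

Note that the rule never looks at gender with hostile intent; it simply optimizes against a history in which women were rarely given the chance to succeed, which is exactly the automation of the status quo O’Neil describes.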
“Algorithms don’t make things fair. They automate the status quo,” she said. “They’re weapons of math destruction.”
The free market won’t solve the problem, argued O’Neil, because “there’s a lot of money to be made in unfairness.”
Her solution: algorithmic auditing. That would involve scrutinizing algorithms for data integrity (addressing biases in the data used to build them), examining their definitions of success (to make sure the algorithm supports non-biased goals), checking overall accuracy, monitoring long-term effects, and guarding against feedback loops.
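One of the checks O’Neil describes, testing whether a model treats groups in the data very differently, can be sketched as a simple selection-rate comparison. This is only an assumed illustration: the 0.8 “four-fifths” cutoff is a common rule of thumb from employment law, not something the talk specifies, and the sample data is made up.

```python
# Minimal sketch of one algorithmic-audit check: compare a model's
# selection rates across groups. The 0.8 threshold is the common
# "four-fifths rule" heuristic, used here only as an example.

def selection_rates(decisions):
    """decisions: list of (group, selected) pairs -> rate per group."""
    totals, picked = {}, {}
    for group, selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        picked[group] = picked.get(group, 0) + int(selected)
    return {g: picked[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions):
    """Ratio of lowest to highest group selection rate (1.0 = parity)."""
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Invented audit sample: group A selected 8/10 times, group B 4/10.
audit_sample = ([("A", True)] * 8 + [("A", False)] * 2
                + [("B", True)] * 4 + [("B", False)] * 6)
ratio = disparate_impact_ratio(audit_sample)
print(f"{ratio:.2f}")  # 0.50 -- well below the 0.8 rule of thumb
```

A real audit of the kind O’Neil proposes would go much further (definitions of success, long-term effects, feedback loops), but even a one-number check like this makes “fairness” something that can be measured rather than asserted.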
Ultimately, O’Neil believes we need a government regulator in charge of algorithms. “I’m not holding my breath in the Trump administration,” she said.