Weapons of Math Destruction

Cathy O’Neil

Humans have always had biases. But we have come a long way from the simplistic ones that helped the species band together and survive, or even the heuristics we apply and continue to use at an individual level. As search engines and social networks move from tools to utilities, we have begun to see the dark side of “software is eating the world” – biases and discrimination codified into systems, resulting in blind discrimination that widens inequality by preventing people from climbing out of poverty, unemployment, homelessness and everything else that we as a society should be ashamed of.

The author, thanks to her education and experience, is well placed to write on the subject. She calls these mathematical models or algorithms WMDs – Weapons of Math Destruction. They share three characteristics – scale, opacity and damage. Typically, they use data sets to build scoring systems that evaluate people in various ways. Most of these systems are proprietary, so the people being scored cannot inspect them. They operate at enormous scale, and so the damage they cause is equally massive. Most of those affected don’t even realise they are being discriminated against and sent on a downward spiral. And most importantly, there is no feedback loop to correct the system when it gets things wrong.
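To make that missing feedback loop concrete, here is a minimal sketch – my own illustration, not taken from the book – of how a predictive-policing style score can reinforce itself. The district names and numbers are entirely hypothetical.

```python
# Hypothetical sketch of a self-reinforcing "risk" score.
# Historical arrest counts stand in for the training data.
arrests = {"district_a": 50, "district_b": 10}

def risk_score(district: str) -> float:
    """Opaque proxy: more recorded arrests -> higher 'risk'."""
    total = sum(arrests.values())
    return arrests[district] / total

def patrol_and_record(rounds: int = 5) -> None:
    """Each round, patrols go where the score is highest; more patrols
    there produce more recorded arrests, which raises the score again.
    Nothing ever checks whether the score was right to begin with."""
    for _ in range(rounds):
        target = max(arrests, key=risk_score)
        arrests[target] += 5  # heavier policing yields more arrests
    print(arrests)

patrol_and_record()
# -> {'district_a': 75, 'district_b': 10}: the initial gap has widened,
#    and the model appears to "confirm" its own prediction.
```

The point of the toy example is that the model never receives evidence that could contradict it; the data it generates is shaped by its own output.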

The author frames it very well through examples across life stages – how teachers and students are evaluated in schools and universities, how employees – potential and existing – are screened, how online ads for predatory services like payday loans are targeted, how insurance premiums are decided, how law-and-order systems end up “creating” criminals simply from location data and perpetuate injustice, and how scheduling software destroys any sense of work-life balance. These are not dry, statistical examples; they are backed by stories of actual humans devastated by an unfeeling algorithm, with nearly no chance to work their way out.

In the future, what really stops us from reaching “Minority Report” scenarios – arresting people because they could become criminals, screening applicants based on health issues they might develop in the future, and so on? And since everything runs on users sharing data, it seems certain that privacy will come at a price. In the past, when greed and business “progress” began to harm society, the government stepped in and put laws in place to safeguard us. But now governments themselves use these systems! We still have a chance to work our way out of this and bring back dignity. The European model, for instance, requires data collection to be opt-in and approved by the user, and prohibits the reuse of data.

But will we? In our efforts to remove biases, we have ended up creating systemic monstrosities that lack empathy and focus only on efficiency. Those who learn to game the system profit. Winners keep winning, losers keep losing, and fairness is forgotten because entire business models are built on these systems. As the author rightly points out, “The technology already exists. It’s only the will we’re lacking.” And that, really, is the problem, because “Algorithms are opinions embedded in code”. An important book for the times we live in, and for the future.
