Book Review: Weapons of Math Destruction by Cathy O’Neil

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.
Everyday activities are shifting more and more to a digital environment. Digital gadgets such as smartphones and wearable devices are becoming an inseparable part of our lives, mostly promising convenience. New digital technologies have mainly been seen as empowering for their users. Fitbit, for example, is claimed to be a motivating device for leading a healthy and active life, enabling users to achieve their goals by analysing their data [1]. The data collected by such devices include sleeping patterns, the number of steps taken, the amount of time spent on physical activities and so forth. However, these data are available not just to the users but also to companies that can use them for multiple purposes. Health insurance companies, such as Vitality [2], already exploit their customers’ data in exchange for rewards such as free cinema tickets or hot beverages. The potential implications of the collection and manipulation of personal data on a personal and societal level, though, have been downplayed. Just imagine a national health insurance business model that operates by classifying citizens as high- or low-risk based on their data [3]. Citizens profiled as low-risk would be granted lower health contributions, while citizens profiled as high-risk would be paying expensive and unaffordable plans.

Imagine a society where decisions on public well-being, education and so forth depend on algorithmic predictions. Cathy O’Neil’s book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy explores exactly these societal consequences emerging from the abuse of big data predictions.

O’Neil gives insights into how algorithms can be misused for the sake of convenience and cost efficiency, resulting in practices of discrimination and bias, amplifying inequality and ultimately threatening democracy. Her book is written for the lay public, while drawing upon her academic expertise and her working experience in the financial sector. After earning a PhD in mathematics at Harvard, O’Neil worked for the hedge fund D. E. Shaw, where she first became disillusioned with mathematics for its part in the 2008 U.S. financial crisis. The financial sector was relying on algorithmic models based on mathematical formulas that, in her words, “were more to impress than clarify”. It was when similar incomprehensible models were adopted in other sectors that she started investigating the matter.

Cathy O’Neil

In her book, informed by her studies, O’Neil aims to provide a means of raising public awareness of the abuse of algorithms and its emerging societal impact. The first chapters introduce the notion of a model and its properties, and define the so-called Weapons of Math Destruction (WMDs). Through her personal story in the financial sector, she describes in detail the way mathematical models are abused in favour of specific interests, which is why she quit her job to focus on the study of inequality and discrimination resulting from the dependency on big data.

O’Neil then offers insights into specific cases, briefly described below, where decisions depend solely on ill-informed algorithmic models. A data-driven society destroys any sense of equality and fairness, ultimately changing the priorities of society.


Education in the U.S. In 1988, the first data-driven ranking of colleges was introduced, changing the priorities of the educational system. Instead of focusing on improving education, colleges thought it wise to invest time and money in tricking the ranking. This is achievable because once you know the model’s variables, or proxies, you can game them. Colleges had to come up with new ideas for reaching those coveted positions, which eventually resulted in an increase in tuition fees. Since society has embraced the idea that attending top-ranking colleges can lead to a life of privilege, those who suffer the most are poor and middle-class families that cannot afford those unsustainable fees.
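The proxy-gaming mechanism can be sketched with a toy score. The weights and variables below are purely hypothetical, not the actual U.S. News formula; the point is only that a known, fixed formula built on proxies can be inflated without improving education at all.

```python
def ranking_score(acceptance_rate, alumni_donation_rate, avg_sat):
    """A toy weighted sum of proxy variables (hypothetical weights).
    A lower acceptance rate counts as higher 'selectivity'."""
    return (
        0.4 * (1 - acceptance_rate)      # selectivity proxy
        + 0.3 * alumni_donation_rate     # "satisfaction" proxy
        + 0.3 * (avg_sat / 1600)         # student-quality proxy
    )

before = ranking_score(acceptance_rate=0.60,
                       alumni_donation_rate=0.10, avg_sat=1150)

# Gaming move: solicit many extra applications that will be rejected.
# Teaching quality is unchanged, but the acceptance rate drops.
after = ranking_score(acceptance_rate=0.35,
                      alumni_donation_rate=0.10, avg_sat=1150)

print(f"before: {before:.3f}, after: {after:.3f}")
```

The score rises even though nothing about the education on offer has changed, which is exactly the incentive O’Neil describes.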

Online Advertising. The second case depicts how targeted advertisements are anything but harmless. Targeted advertising is the core business of companies like Facebook and Google, which secure their profits through the automated analysis and classification of user profiles based on their online activities.

Crime Prediction. O’Neil then explores crime prediction practices based on big data, using examples of relevant software tools such as PredPol, CompStat and HunchLab. Police forces employ what O’Neil calls WMDs for the sake of efficiency, leading to discrimination against specific groups of the population due to the bias of the algorithms.
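The feedback loop behind this bias can be illustrated with a deliberately simplified simulation. This is not how PredPol or any real tool works; it only shows the mechanism: patrols are sent where past records show crime, and crime is mostly recorded where patrols are, so an initial skew in the data reinforces itself.

```python
import random

random.seed(0)

true_crime_rate = {"A": 0.10, "B": 0.10}  # both areas are truly identical
recorded = {"A": 5, "B": 1}               # but area A starts over-recorded

for day in range(1000):
    # "Prediction": patrol the area with the most recorded crime so far.
    patrolled = max(recorded, key=recorded.get)
    # Crime only enters the data where police are present to record it.
    if random.random() < true_crime_rate[patrolled]:
        recorded[patrolled] += 1

print(recorded)
```

Despite identical underlying crime rates, all new records accumulate in area A, and the model’s prediction looks ever more "confirmed" by its own data.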

O’Neil also explores cases of WMDs at a personal level: personality tests that can affect one’s job application; the injustice created by scheduling algorithms designed to improve companies’ profitability; the inequality of providing services based on one’s online profile, as in the health insurance case described above; and e-scores, which are unjust and arbitrary, so that the models built on them discourage class mobility.

At the very end of this round-up, O’Neil explores microtargeting, which I consider one of the most important cases given its role in the last U.S. presidential election [4]. Microtargeting [5] is a marketing process that operates at a fine-grained level, targeting individuals and influencing their thoughts or actions. In this chapter, O’Neil explains how this new science has been exploited to shape our political beliefs. She shows how social media can foster this process, and how American voters were constantly analysed and bombarded with tailor-made information. Even though this case may sound conspiratorial, reminiscent of yet another apocalyptic fiction plot, there is ever more academic and journalistic research on the topic. In a recent article [6], the Guardian reported on the leading actors behind Trump’s success in the last presidential election, among them Cambridge Analytica, a data analysis firm performing psychographic analysis [7] of voters. Similarly, there is some evidence of the use of big data by the Leave campaign in the UK’s EU referendum [8].

This marks a shift in information gate-keeping from editors and journalists to algorithms. Based on algorithmic models, users are delivered tailored news. O’Neil explains how these new practices of user profiling lead to an asymmetry of information that eventually confines us in our own bubble of knowledge, meaning that what we think is real might differ from what our neighbours or even our relatives think. Tim Berners-Lee, the inventor of the World Wide Web, expresses similar concerns about online political advertising in a letter for the Web’s 28th birthday, arguing that it lacks transparency and understanding, and wondering whether it is ultimately in accordance with the values of democracy [9].

O’Neil concludes the book on an even more pessimistic note: political and algorithmic models feed each other. Social inequality and injustice are amplified in a digital era where predictive algorithms are built on biased data. Worse, the results of these algorithms are considered objective, since data can never lie, influencing our lives from which college we attend and which job profile we match, to who is the best candidate for our life partner. People can be denied opportunities, jailed, and overcharged for services and loans as a result of an algorithmic process. These data-driven decisions put our society into a vortex-like scenario: if we find ourselves too close to the eye, we have no chance to oppose it, no matter how hard we try to fight and improve.

This book raises a broad range of potential social implications in an era in which people, governments, markets and policing agencies rely ever more on big data and on algorithms that try to make sense of the world in an automated way. Through her expertise and real examples, O’Neil breaks the glass of ignorance for a wide audience. She provides a different way of understanding digital technologies, big data and algorithmic models, contrary to the dominant narrative of empowerment. She reveals the surveillant aspects of new technologies, portraying them on a symbolic level as weapons of math destruction that rule our lives.

O’Neil’s book can inform both media and computer science circles and act as a bridge for interdisciplinary collaboration. It is crucial for media studies scholars to understand how algorithms work in order to fully explore their social aspects and implications. At the same time, it is urgent for computer and data scientists to assess whether these models encode human prejudice, misunderstanding and bias.

In this regard, O’Neil makes a great effort to describe these threats without demonising the technologies, putting forward concepts such as transparency and accountability.

We are generating more data from our daily lives than ever before. Data about us are also generated beyond our control, collected and shared by countless entities such as Internet of Things devices. This new world brings enormous opportunities but also new risks. Our challenge is to guarantee that fairness and justice are fulfilled.


I am thankful to my friend and colleague Pinelopi Troullinou, who helped me review this post.


[1] Fitbit defends step goal after experts criticise 10,000 a day target as meaningless.

[2] Activity Tracking. Vitality.

[3] That Health Tracker Could Cost You.

[4] Robert Mercer: the big data billionaire waging war on mainstream media.

[5] Tom Agan. Silent Marketing: Micro-targeting.

[6] Robert Mercer: the big data billionaire waging war on mainstream media.

[7] Goldberg. The Structure of Phenotypic Personality Traits.


[9] Tim Berners-Lee. Three challenges for the web, according to its inventor.