“I will remember that I didn’t make the world, and it doesn’t satisfy my equations,” reads an oath written by financial modelers Emanuel Derman and Paul Wilmott.
On reading this in O'Neil’s cautionary book, my mind jumped to a great line from the film I, Tonya: “Everyone has their own truth, and life just does whatever [it] wants.” Humans are notoriously bad at recognizing that just because we think something is true doesn’t mean it is, or (perhaps more importantly) that others will share our opinion.
O'Neil warns about the danger of giving us irrational humans the power to encode our worldview into algorithms that govern millions, even billions, of people. In our post-truth era, the pain that results from the biases in life’s pervasive algorithms is borne by society’s most vulnerable. One of O'Neil’s theses is that if an algorithm is rigged to always watch a specific person, it will eventually catch them doing something wrong. All is fair when life screws everyone over equally, but it becomes a problem when some people are systematically on the losing end.
While I wish O'Neil had spent more time on the positives of big data for prevention, and on the road forward for evaluating algorithms, she is undeniably exceptional at breaking down complex data stories. She takes touchy scenarios, like Trump's election and the Facebook emotion experiment, and puts them in a context her readers can understand and appreciate.