Notes/References from Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy

Chapter 1: Bomb Parts: What Is a Model?

in fact the model itself contributes to a toxic cycle and helps to sustain it. That’s a signature quality of a WMD. (page 27)

WMDs are, by design, inscrutable black boxes. That makes it extra hard to definitively answer the second question: Does the model work against the subject’s interest? In short, is it unfair? Does it damage or destroy lives? (page 29)

A key component of this suffering is the pernicious feedback loop. As we’ve seen, sentencing models that profile a person by his or her circumstances help to create the environment that justifies their assumptions. This destructive loop goes round and round, and in the process the model becomes more and more unfair. (page 29)

The third question is whether a model has the capacity to grow exponentially. As a statistician would put it, can it scale? This might sound like the nerdy quibble of a mathematician. But scale is what turns WMDs from local nuisances into tsunami forces, ones that define and delimit our lives. (page 29)

As we’ll see, the developing WMDs in human resources, health, and banking, just to name a few, are quickly establishing broad norms that exert upon us something very close to the power of law. (page 30)

Even if other tools supplant LSI-R as its leading WMD, the prison system is likely to be a powerful incubator for WMDs on a grand scale. (page 30)

  Thesis material

So to sum up, these are the three elements of a WMD: Opacity, Scale, and Damage. All of them will be present, to one degree or another, in the examples we’ll be covering. (page 31)

And here’s one more thing about algorithms: they can leap from one field to the next, and they often do. Research in epidemiology can hold insights for box office predictions; spam filters are being retooled to identify the AIDS virus. This is true of WMDs as well. So if mathematical models in prisons appear to succeed at their job — which really boils down to efficient management of people — they could spread into the rest of the economy along with the other WMDs, leaving us as collateral damage. (page 31)

Chapter 2: Shell Shocked: My Journey of Disillusionment

To be clear, the subprime mortgages that piled up during the housing boom, whether held by strawberry pickers in California or struggling black congregants in Baltimore, were not WMDs. (page 40)

But when banks started loading mortgages like Alberto Ramirez’s into classes of securities and selling them, they were relying on flawed mathematical models to do it. The risk model attached to mortgage-backed securities was a WMD. The banks were aware that some of the mortgages were sure to default. (page 41)

The risk ratings on the securities were designed to be opaque and mathematically intimidating, in part so that buyers wouldn’t perceive the true level of risk associated with the contracts they owned. (page 41)

This is a result of the way we define a trader’s prowess, namely by his “Sharpe ratio,” which is calculated as the profits he generates divided by the risks in his portfolio. This ratio is crucial to a trader’s career, his annual bonus, his very sense of being. If you disembody those traders and consider them as a set of algorithms, those algorithms are relentlessly focused on optimizing the Sharpe ratio. Ideally, it will climb, or at least never fall too low. So if one of the risk reports on credit default swaps bumped up the risk calculation on one of a trader’s key holdings, his Sharpe ratio would tumble. This could cost him hundreds of thousands of dollars when it came time to calculate his year-end bonus. (page 46)
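The Sharpe ratio O’Neil invokes is a simple formula, so it is worth writing down. Below is a minimal Python sketch using the book’s simplified definition (profits divided by portfolio risk); the return figures are invented for illustration, and the risk-free rate is assumed to be zero.

```python
import numpy as np

def sharpe_ratio(returns, risk_free_rate=0.0):
    """Simplified Sharpe ratio: excess return divided by volatility.
    The book glosses this as 'profits divided by risks.'"""
    excess = np.asarray(returns) - risk_free_rate
    return excess.mean() / excess.std(ddof=1)

# Illustrative daily returns for a hypothetical trader's book.
returns = np.array([0.002, 0.004, -0.001, 0.003, 0.002])
print(f"Sharpe ratio: {sharpe_ratio(returns):.2f}")

# If a risk report revises volatility upward, the denominator grows,
# and the ratio (and with it the trader's bonus case) falls.
```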

Chapter 3: Arms Race: Going to College

If the U.S. News list had turned into a moderate success, there would be no trouble. But instead it grew into a titan, quickly establishing itself as a national standard. It has been tying our education system into knots ever since, establishing a rigid to-do list for college administrators and students alike. The U.S. News college ranking has great scale, inflicts widespread damage, and generates an almost endless spiral of destructive feedback loops. While it’s not as opaque as many other models, it is still a bona fide WMD. (page 54)

Robert Morse, who has worked at the company since 1976 and heads up the college rankings, argued in interviews that the rankings pushed the colleges to set meaningful goals. If they could improve graduation rates or put students in smaller classes, that was a good thing. Education benefited from the focus. He admitted that the most relevant data — what the students had learned at each school — was inaccessible. But the U.S. News model, constructed from proxies, was the next best thing. (page 54)

However, when you create a model from proxies, it is far simpler for people to game it. This is because proxies are easier to manipulate than the complicated reality they represent. Here’s an example. Let’s say a website is looking to hire a social media maven. Many people apply for the job, and they send information about the various marketing campaigns they’ve run. But it takes way too much time to track down and evaluate all of their work. So the hiring manager settles on a proxy. She gives strong consideration to applicants with the most followers on Twitter. That’s a sign of social media engagement, isn’t it? (page 55)
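The gameability of proxies is easy to make concrete. The sketch below is a toy illustration with invented numbers: the follower count tracks real engagement until an applicant learns the rule and buys followers, at which point the proxy and the underlying quality come apart.

```python
# Hypothetical applicants: (true engagement skill, organic followers).
applicants = {
    "maven": {"skill": 9, "followers": 8_000},
    "gamer": {"skill": 3, "followers": 2_000},
}

# The gamer learns the proxy and buys 50,000 followers.
applicants["gamer"]["followers"] += 50_000

# Hiring by the proxy now inverts the ranking the manager wanted.
best_by_proxy = max(applicants, key=lambda a: applicants[a]["followers"])
best_by_skill = max(applicants, key=lambda a: applicants[a]["skill"])
print(best_by_proxy, best_by_skill)  # gamer maven
```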

Chapter 4: Propaganda Machine: Online Advertising

Once these campaigns move online, the learning accelerates. The Internet provides advertisers with the greatest laboratory ever for consumer research and lead generation. Feedback from each promotion arrives within seconds — a lot faster than the mail. Within hours (instead of months), each campaign can zero in on the most effective messages and come closer to reaching the glittering promise of all advertising: to reach a prospect at the right time, and with precisely the best message to trigger a decision, and thus succeed in hauling in another paying customer. This fine-tuning never stops. And increasingly, the data-crunching machines are sifting through our data on their own, searching for our habits and hopes, fears and desires. With machine learning, a fast-growing domain of artificial intelligence, the computer dives into the data, following only basic instructions. The algorithm finds patterns on its own, and then, through time, connects them with outcomes. In a sense, it learns. (page 75)
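The learning loop described here, feeding outcomes to a machine and letting it find the patterns, can be illustrated with a toy model. This is a minimal sketch assuming hypothetical click-log features; scikit-learn’s off-the-shelf logistic regression stands in for whatever proprietary models advertisers actually run.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per ad impression: hour of day, pages viewed,
# and whether the visitor arrived via a search for "quick cash".
X = np.array([[23, 12, 1], [9, 3, 0], [2, 8, 1],
              [14, 2, 0], [22, 15, 1], [10, 1, 0]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = clicked the ad

model = LogisticRegression().fit(X, y)

# Given only outcomes, the model infers which patterns predict a click
# and can rescore the next prospect within seconds.
prospect = np.array([[1, 10, 1]])
print(model.predict_proba(prospect)[0, 1])  # estimated click probability
```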

These advances in natural language have opened up a mother lode of possibilities for advertisers. The programs “know” what a word means, at least enough to associate it with certain behaviors and outcomes, at least some of the time. Fueled in part by this growing linguistic mastery, advertisers can probe for deeper patterns. An advertising program might start out with the usual demographic and geographic details. But over the course of weeks and months it begins to learn the patterns of the people it’s targeting and to make predictions about their next moves. It gets to know them. And if the program is predatory, it gauges their weaknesses and vulnerabilities and pursues the most efficient path to exploit them. (page 77)

Along come the for-profit colleges with their highly refined WMDs to target and fleece the population most in need. They sell them the promise of an education and a tantalizing glimpse of upward mobility — while plunging them deeper into debt. They take advantage of the pressing need in poor households, along with their ignorance and their aspirations, then they exploit it. And they do this at great scale. This leads to hopelessness and despair, along with skepticism about the value of education more broadly, and it exacerbates our country’s vast wealth gap. (page 81)

Now regulators are pushing for new laws governing the market for personal data — a crucial input for all sorts of WMDs. To date, a couple of federal laws, such as the Fair Credit Reporting Act and the Health Insurance Portability and Accountability Act, or HIPAA, establish some limits on health and credit data. Maybe, with an eye on lead generators, they’ll add more. However, as we’ll see in coming chapters, some of the most effective and nefarious WMDs manage to engineer work-arounds. They study everything from neighborhoods to Facebook friends to predict our behavior — and even lock us up. (page 82)

Chapter 5: Civilian Casualties: Justice in the Age of Big Data

Reading police chief William Heim had to figure out how to get the same or better policing out of a smaller force. So in 2013 he invested in crime prediction software made by PredPol, a Big Data start-up based in Santa Cruz, California. The program processed historical crime data and calculated, hour by hour, where crimes were most likely to occur. The Reading policemen could view the program’s conclusions as a series of squares, each one just the size of two football fields. If they spent more time patrolling these squares, there was a good chance they would discourage crime. And sure enough, a year later, Chief Heim announced that burglaries were down by 23 percent. (page 84)

Jeffrey Brantingham, the UCLA anthropology professor who founded PredPol, stressed to me that the model is blind to race and ethnicity. And unlike other programs, including the recidivism risk models we discussed, which are used for sentencing guidelines, PredPol doesn’t focus on the individual. Instead, it targets geography. The key inputs are the type and location of each crime and when it occurred. That seems fair enough. And if cops spend more time in the high-risk zones, foiling burglars and car thieves, there’s good reason to believe that the community benefits. (page 86)

But most crimes aren’t as serious as burglary and grand theft auto, and that is where serious problems emerge. When police set up their PredPol system, they have a choice. They can focus exclusively on so-called Part 1 crimes. These are the violent crimes, including homicide, arson, and assault, which are usually reported to them. But they can also broaden the focus by including Part 2 crimes, including vagrancy, aggressive panhandling, and selling and consuming small quantities of drugs. Many of these “nuisance” crimes would go unrecorded if a cop weren’t there to see them. These nuisance crimes are endemic to many impoverished neighborhoods. In some places police call them antisocial behavior, or ASB. Unfortunately, including them in the model threatens to skew the analysis. Once the nuisance data flows into a predictive model, more police are drawn into those neighborhoods, where they’re more likely to arrest more people. After all, even if their objective is to stop burglaries, murders, and rape, they’re bound to have slow periods. It’s the nature of patrolling. And if a patrolling cop sees a couple of kids who look no older than sixteen guzzling from a bottle in a brown bag, he stops them. These types of low-level crimes populate their models with more and more dots, and the models send the cops back to the same neighborhood. This creates a pernicious feedback loop. The policing itself spawns new data, which justifies more policing. (page 86)
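This feedback loop is easy to reproduce in simulation. The sketch below is my own toy illustration, not PredPol’s algorithm: two neighborhoods with identical underlying offense rates, a patrol sent each day to wherever the most incidents have been recorded, and nuisance offenses that are only recorded where a patrol is present to see them.

```python
import random

random.seed(0)
true_rate = [0.3, 0.3]  # identical underlying offense rates
recorded = [1, 0]       # a single early arrest seeds the bias

for day in range(1000):
    # Send the patrol to the neighborhood with more recorded crime.
    target = 0 if recorded[0] >= recorded[1] else 1
    # Offenses are only recorded where a cop is present to see them.
    if random.random() < true_rate[target]:
        recorded[target] += 1

print(recorded)  # roughly [300, 0]: the data "confirms" the initial bias
```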

Programmers don’t know how to code for it, and few of their bosses ask them to. So fairness isn’t calculated into WMDs. And the result is massive, industrial production of unfairness. If you think of a WMD as a factory, unfairness is the black stuff belching out of the smokestacks. It’s an emission, a toxic one. (page 95)

Chapter 6: Ineligible to Serve: Getting a Job

Naturally, these hiring programs can’t incorporate information about how the candidate would actually perform at the company. That’s in the future, and therefore unknown. So like many other Big Data programs, they settle for proxies. And as we’ve seen, proxies are bound to be inexact and often unfair. In fact, the Supreme Court ruled in a 1971 case, Griggs v. Duke Power Company, that intelligence tests for hiring were discriminatory and therefore illegal. (page 108)

As you might expect, human resources departments rely on automatic systems to winnow down piles of résumés. In fact, some 72 percent of résumés are never seen by human eyes. Computer programs flip through them, pulling out the skills and experiences that the employer is looking for. Then they score each résumé as a match for the job opening. It’s up to the people in the human resources department to decide where the cutoff is, but the more candidates they can eliminate with this first screening, the fewer human-hours they’ll have to spend processing the top matches. (page 113)
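A minimal sketch of that first screening pass follows, with hypothetical keywords, weights, and cutoff; real applicant-tracking systems are proprietary, but the basic mechanics are keyword matching plus a score threshold.

```python
# Hypothetical required skills and their weights for one job opening.
KEYWORDS = {"python": 3, "sql": 2, "marketing": 1, "analytics": 2}
CUTOFF = 4  # résumés scoring below this are never seen by a human

def score_resume(text: str) -> int:
    words = set(text.lower().split())
    return sum(w for kw, w in KEYWORDS.items() if kw in words)

resumes = {
    "alice": "Built marketing analytics dashboards in Python and SQL",
    "bob": "Ten years of hands-on brand marketing leadership",
}
for name, text in resumes.items():
    s = score_resume(text)
    print(name, s, "-> human review" if s >= CUTOFF else "-> rejected")
```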

The result of these programs, much as with college admissions, is that those with the money and resources to prepare their résumés come out on top. Those who don’t take these steps may never know that they’re sending their résumés into a black hole. It’s one more example in which the wealthy and informed get the edge and the poor are more likely to lose out. (page 114)

Chapter 7: Sweating Bullets: On the Job

I’m sure it comes as no surprise that I consider scheduling software one of the more appalling WMDs. It’s massive, as we’ve discussed, and it takes advantage of people who are already struggling to make ends meet. What’s more, it is entirely opaque. Workers often don’t have a clue about when they’ll be called to work. They are summoned by an arbitrary program. (page 128)

The root of the trouble, as with so many other WMDs, is the modelers’ choice of objectives. The model is optimized for efficiency and profitability, not for justice or the good of the “team.” This is, of course, the nature of capitalism. (page 129)

Bogus is the word for it. In fact, misinterpreted statistics run through the history of teacher evaluation. The problem started with a momentous statistical boo-boo in the analysis of the original Nation at Risk report. It turned out that the very researchers who were decrying a national catastrophe were basing their judgment on a fundamental error, something an undergrad should have caught. In fact, if they wanted to serve up an example of America’s educational shortcomings, their own misreading of statistics could serve as exhibit A. (page 136)

Statistically speaking, in these attempts to free the tests from class and color, the administrators moved from a primary to a secondary model. Instead of basing scores on direct measurement of the students, they based them on the so-called error term — the gap between results and expectations. Mathematically, this is a much sketchier proposition. Since the expectations themselves are derived from statistics, these amount to guesses on top of guesses. The result is a model with loads of random results, what statisticians call “noise.” (page 137)
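The “guesses on top of guesses” point can be demonstrated numerically. In the toy simulation below (my own illustration, not the actual value-added formula), a teacher’s score is the residual between noisy student results and a noisy expectation, and two years of scores for the same unchanging teachers barely correlate.

```python
import numpy as np

rng = np.random.default_rng(42)
n_teachers = 500
true_effect = rng.normal(0, 1, n_teachers)  # each teacher's real contribution

def one_year_scores():
    # Both the measured results and the modeled expectation carry noise.
    results = true_effect + rng.normal(0, 3, n_teachers)
    expectation = rng.normal(0, 3, n_teachers)
    return results - expectation  # the "error term" used as the score

year1, year2 = one_year_scores(), one_year_scores()
print(np.corrcoef(year1, year2)[0, 1])  # close to 0: mostly noise
```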

Chapter 8: Collateral Damage: Landing Credit

Since Fair and Isaac’s pioneering days, the use of scoring has of course proliferated wildly. Today we’re added up in every conceivable way as statisticians and mathematicians patch together a mishmash of data, from our zip codes and Internet surfing patterns to our recent purchases. Many of their pseudoscientific models attempt to predict our creditworthiness, giving each of us so-called e-scores. These numbers, which we rarely see, open doors for some of us, while slamming them in the face of others. Unlike the FICO scores they resemble, e-scores are arbitrary, unaccountable, unregulated, and often unfair — in short, they’re WMDs. (page 143)

Much of the predatory advertising we’ve been discussing, including the ads for payday loans and for-profit colleges, is generated through such e-scores. They’re stand-ins for credit scores. But since companies are legally prohibited from using credit scores for marketing purposes, they make do with this sloppy substitute. (page 144)

the modelers for e-scores have to make do with trying to answer the question “How have people like you behaved in the past?” when ideally they would ask, “How have you behaved in the past?” The difference between these two questions is vast. Imagine if a highly motivated and responsible person with modest immigrant beginnings is trying to start a business and needs to rely on such a system for early investment. Who would take a chance on such a person? Probably not a model trained on such demographic and behavioral data. (page 146)

Creditworthiness has become an all-too-easy stand-in for other virtues. Conversely, bad credit has grown to signal a host of sins and shortcomings that have nothing to do with paying bills. As we’ll see, all sorts of companies turn credit reports into their own versions of credit scores and use them as proxies. This practice is both toxic and ubiquitous. (page 147)

An Arkansas resident named Catherine Taylor, for example, missed out on a job at the local Red Cross several years ago. Those things happen. But Taylor’s rejection letter arrived with a valuable nugget of information. Her background report included a criminal charge for the intent to manufacture and sell methamphetamines. This wasn’t the kind of candidate the Red Cross was looking to hire. Taylor looked into it and discovered that the criminal charges belonged to another Catherine Taylor, who happened to be born on the same day. She later found that at least ten other companies were tarring her with inaccurate reports — one of them connected to her application for federal housing assistance, which had been denied.

Was the housing rejection due to a mistaken identity? In an automatic process, it no doubt could have been. But a human being intervened. When applying for federal housing assistance, Taylor and her husband met with an employee of the housing authority to complete a background check. This employee, Wanda Taylor — no relation — was using information provided by Tenant Tracker, the data broker. It was riddled with errors and blended identities. It linked Taylor, for example, with the possible alias of Chantel Taylor, a convicted felon who happened to be born on the same day. It also connected her to the other Catherine Taylor she had heard about, who had been convicted in Illinois of theft, forgery, and possession of a controlled substance. The dossier, in short, was a toxic mess.

But Wanda Taylor had experience with such things. She began to dig through it. She promptly drew a line through the possible alias, Chantel, which seemed improbable to her. She read in the file that the Illinois thief had a tattoo on her ankle with the name Troy. After checking Catherine Taylor’s ankle, she drew a line through that felon’s name as well. By the end of the meeting, one conscientious human being had cleared up the confusion generated by web-crawling, data-gathering programs. The housing authority knew which Catherine Taylor it was dealing with. (page 152)

The systems are built to run automatically as much as possible. That’s the efficient way; that’s where the profits are. Errors are inevitable, as in any statistical program, but the quickest way to reduce them is to fine-tune the algorithms running the machines. Humans on the ground only gum up the works. (page 153)

When new ventures are built on WMDs, troubles are bound to follow, even when the players have the best intentions. (page 158)

Chapter 9: No Safe Zone: Getting Insurance

consider a hypothetical driver who lives in a rough section of Newark, New Jersey, and must commute thirteen miles to a barista job at a Starbucks in the wealthy suburb of Montclair. Her schedule is chaotic and includes occasional clopenings. So she shuts the shop at 11, drives back to Newark, and returns before 5 a.m. To save ten minutes and $1.50 each way on the Garden State Parkway, she takes a shortcut, which leads her down a road lined with bars and strip joints. A data-savvy insurer will note that cars traveling along that route in the wee hours have an increased risk of accidents. There are more than a few drunks on the road. And to be fair, our barista is adding a bit of risk by taking the shortcut and sharing the road with the people spilling out of the bars. One of them might hit her. But as far as the insurance company’s geo-tracker is concerned, not only is she mingling with drunks, she may be one. (page 169)

you can imagine how machine-learning systems fed by different streams of behavioral data will soon be placing us not just into one tribe but into hundreds of them, even thousands. Certain tribes will respond to similar ads. Others may resemble each other politically or land in jail more frequently. Some might love fast food. (page 172)

These automatic programs will increasingly determine how we are treated by the other machines, the ones that choose the ads we see, set prices for us, line us up for a dermatologist appointment, or map our routes. They will be highly efficient, seemingly arbitrary, and utterly unaccountable. No one will understand their logic or be able to explain it. (page 173)

“It is beyond creepy,” he says, “to think of anyone reconstructing my daily movements based on my own ‘self-tracking’ of my walking.” My fear goes a step further. Once companies amass troves of data on employees’ health, what will stop them from developing health scores and wielding them to sift through job candidates? Much of the proxy data collected, whether step counts or sleeping patterns, is not protected by law, so it would theoretically be perfectly legal. (page 175)

Employers are already overdosing on our data. They’re busy using it, as we’ve seen, to score us as potential employees and as workers. They’re trying to map our thoughts and our friendships and predict our productivity. Since they’re already deeply involved in insurance, with workforce health care a major expense, it’s only natural that they would extend surveillance on a large scale to workers’ health. And if companies cooked up their own health and productivity models, this could grow into a full-fledged WMD. (page 178)

Chapter 10: The Targeted Citizen: Civic Life

many give a drip-feed of money based on whether the messages they hear are ones they agree with. For them, managing a politician is like training a dog with treats. This training effect is all the more powerful for contributors to super PACs, which do not limit political contributions. The campaigns, of course, are well aware of this tactic. With microtargeting, they can send each of those donors the information most likely to pry more dollars from their bank accounts. And these messages will vary from one donor to the next. (page 193)

These tactics aren’t limited to campaigns. They infect our civic life, with lobbyists and interest groups now using these targeting methods to carry out their dirty work. In 2015, the Center for Medical Progress, an antiabortion group, posted videos featuring what they claimed was an aborted fetus at a Planned Parenthood clinic. The videos asserted that Planned Parenthood doctors were selling baby parts for research, and they spurred a wave of protest and a Republican push to eliminate the organization’s funding. (page 193)

With political messaging, as with most WMDs, the heart of the problem is almost always the objective. Change that objective from leeching off people to helping them, and a WMD is disarmed — and can even become a force for good. (page 197)

Conclusion

In this march through a virtual lifetime, we’ve visited school and college, the courts and the workplace, even the voting booth. Along the way, we’ve witnessed the destruction caused by WMDs. Promising efficiency and fairness, they distort higher education, drive up debt, spur mass incarceration, pummel the poor at nearly every juncture, and undermine democracy. It might seem like the logical response is to disarm these weapons, one by one. The problem is that they’re feeding on each other. Poor people are more likely to have bad credit and live in high-crime neighborhoods, surrounded by other poor people. Once the dark universe of WMDs digests that data, it showers them with predatory ads for subprime loans or for-profit schools. It sends more police to arrest them, and when they’re convicted it sentences them to longer terms. This data feeds into other WMDs, which score the same people as high risks or easy targets and proceed to block them from jobs, while jacking up their rates for mortgages, car loans, and every kind of insurance imaginable. This drives their credit rating down further, creating nothing less than a death spiral of modeling. Being poor in a world of WMDs is getting more and more dangerous and expensive. (page 199)

It’s also important to note that these are the early days. Naturally, payday lenders and their ilk start off by targeting the poor and the immigrants. Those are the easiest targets, the low-hanging fruit. They have less access to information, and more of them are desperate. But WMDs generating fabulous profit margins are not likely to remain cloistered for long in the lower ranks. That’s not the way markets work. They’ll evolve and spread, looking for new opportunities. We already see this happening as mainstream banks invest in peer-to-peer loan operations like Lending Club. In short, WMDs are targeting us all. And they’ll continue to multiply, sowing injustice, until we take steps to stop them. (page 203)

Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit. (page 204)

To eliminate WMDs, we must advance beyond establishing best practices in our data guild. Our laws need to change, too. And to make that happen we must reevaluate our metric of success. (page 206)

If we find (as studies have already shown) that the recidivism models codify prejudice and penalize the poor, then it’s time to take a look at the inputs. In this case, they include loads of birds-of-a-feather connections. They predict an individual’s behavior on the basis of the people he knows, his job, and his credit rating — details that would be inadmissible in court. The fairness fix is to throw out that data. (page 210)

If we’re going to be equal before the law, or be treated equally as voters, we cannot stand for systems that drop us into different castes and treat us differently. (page 210)

If you consider mathematical models as the engines of the digital economy — and in many ways they are — these auditors are opening the hoods, showing us how they work. This is a vital step, so that we can equip these powerful engines with steering wheels — and brakes. Auditors face resistance, however, often from the web giants, which are the closest thing we have to information utilities. Google, for example, has prohibited researchers from creating scores of fake profiles in order to map the biases of the search engine. Facebook, too. The social network’s rigorous policy to tie users to their real names severely limits the research outsiders can carry out there. The real-name policy is admirable in many ways, not least because it pushes users to be accountable for the messages they post. But Facebook also must be accountable to all of us — which means opening its platform to more data auditors. (page 211)

If we want to bring out the big guns, we might consider moving toward the European model, which stipulates that any data collected must be approved by the user, as an opt-in. (page 213)

models that have a significant impact on our lives, including credit scores and e-scores, should be open and available to the public. Ideally, we could navigate them at the level of an app on our phones. In a tight month, for example, a consumer could use such an app to compare the impact of unpaid phone and electricity bills on her credit score and see how much a lower score would affect her plans to buy a car. The technology already exists. It’s only the will we’re lacking. (page 214)
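As a thought experiment, here is a minimal sketch of what such an app might compute. Every number is a made-up placeholder, since the actual point penalties and rate tiers behind credit scores are proprietary.

```python
# Hypothetical score penalties and rate tiers, illustrative only.
PENALTY = {"phone": 15, "electricity": 35}
RATE_BY_SCORE = [(700, 0.05), (650, 0.08), (0, 0.12)]  # (min score, car-loan APR)

def projected_rate(score: int) -> float:
    return next(apr for floor, apr in RATE_BY_SCORE if score >= floor)

current_score = 680
for bill, points in PENALTY.items():
    new_score = current_score - points
    print(f"skip {bill} bill: score {new_score}, "
          f"car loan APR {projected_rate(new_score):.0%}")
```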

Citation: O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown, 2016.
