
There Are No Accidents

Jessie Singer

317 highlights

Highlights & Annotations

Tell yourself it was an accident
Isolated incident, part of the job
Yeah, well, tell that to the families
Kids without daddies
Tell it to God

—Steve Earle, Coal Country

Ref. C292-A

This is a book about how we die in the United States. In particular, it is about the significant subset of those deaths we ignore—the accidents. More people die by accident today than at any time in American history. The accidental death toll in the United States is now over 173,000 people a year, or the equivalent of more than one fully loaded Boeing 747-400 falling out of the sky, killing everyone onboard, every day. Americans die by accident more than by stroke, Alzheimer’s disease, diabetes, pneumonia, kidney failure, suicide, septicemia, liver disease, hypertension, Parkinson’s disease, or terrorism. Yet despite the death toll, there is no fun run for accidental death research or public memorial for the accidental dead. What we call accidents—the traffic crashes, the house fires, the falls and drownings—rarely register as a point of public concern. Why are accidents so common? Why are accidents killing us more than ever? Why don’t we talk about it? And what can we do to stem the rising tide of death and life-altering injury? This book seeks answers to those questions.
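The 747 comparison above is back-of-the-envelope arithmetic; a minimal sketch of it (the seat count is an assumed typical Boeing 747-400 passenger configuration, not a figure from the book):

```python
# Back-of-the-envelope check of the one-747-a-day comparison.
annual_accidental_deaths = 173_000
deaths_per_day = annual_accidental_deaths / 365  # ~474 deaths every day

# ~416 seats is a common three-class 747-400 layout (assumption for illustration).
seats_747_400 = 416

print(f"{deaths_per_day:.0f} accidental deaths per day")
print(f"{deaths_per_day / seats_747_400:.2f} fully loaded 747-400s per day")
```

At roughly 474 deaths a day against some 416 seats, the toll indeed exceeds one fully loaded jumbo jet daily.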

Ref. 3284-B

We can start with that 747. One of the many reasons accidents fail to register as a cost to society is that they rarely involve a fully loaded 747. Most of the time, when we die by accident, we die in ones and twos. These deaths do not make the nightly news. Rather, accidents are quick and lonely deaths, not reported beyond the police blotter, if at all. To die by accident is to fall down, or get run down, to drink the wrong thing or stand in the wrong place. More than a synonym for a traffic crash or a surprise pregnancy, “accident” is a euphemism for “nothing to see here.”

Ref. 212D-C

Then, on December 1, 2006, Eric rode his bike into Manhattan but never rode home. A year later, I sat at the sentencing hearing of the man who had killed him, the wood-walled courtroom divided into victim and perpetrator like a wedding no one wants to attend. Before a judge sentenced the man who killed Eric to jail for driving under the influence and vehicular manslaughter, the man said he was sorry. “Words cannot express how truly sorry I am,” he told the court, “for this accident that happened.” It blares in my ear now—this accident that happened—the absence of accountability, the disembodied telling, as if the killer had nothing to do with the killing. But that day in court, I did not question it. I watched the children of the man who killed my best friend nod goodbye to their father and felt certain that Eric would not have wanted this—a man going to prison, more lives ruined, and so little done to prevent this from happening again.

Ref. 4655-D

The physician William Haddon, the first administrator of the National Highway Traffic Safety Administration, called the idea that accidents were random and unpredictable “the last folklore subscribed to by rational men.” He considered it superstitious hooey to call something an accident—a holdover from an era when much of science was inexplicable and mysterious. In his Washington, DC, office he kept a swear jar—anyone who excused a traffic crash as an accident had to forfeit a dime.

Ref. 08A7-E

In the paper, Makary offers this real-world example: A young woman felt ill, and a hospital admitted her for extensive tests. At least one test was unnecessary—a pericardiocentesis, where a needle is used to remove fluid from a sac near the heart. The hospital discharged her, but she returned days later, hemorrhaging and in cardiac arrest. While an autopsy found that the pericardiocentesis needle also pierced her liver, resulting in her death, there is no code for conducting an unnecessary high-risk test and causing accidental injury, or accidentally grazing an organ during pericardiocentesis. Doctors coded the cause of death as cardiovascular—not an accident.

Ref. C9F3-F

It shouldn’t be this way. In 1971, the government approved the sale of naloxone, a drug that can reverse an accidental opioid overdose. The U.S. government first required carmakers to install seat belts in every American car in 1968. In 1995, the U.S. Occupational Safety and Health Administration (OSHA) began requiring construction workers to wear safety harnesses. Airbag regulations began in 1998. These are examples of the wealth of harm-reducing innovations that should mean fewer people die from accidents at large, and in traffic accidents, work-fall accidents, and poisoning accidents especially. Yet traffic fatalities skyrocketed in 2020, falls are the leading cause of workers’ deaths in the construction industry, and the Centers for Disease Control and Prevention tallies the death toll of the opioid epidemic at over 840,000 people since 1999, nearly every one classified as an unintentional injury by drug poisoning.

Ref. 3B06-G

When I decided to write this book, in the name of diligence, I did something I did not want to do: I sent a Freedom of Information Act request to the City of New York for the records of my best friend’s death. They arrived in a thick manila envelope a few months later. For weeks, I let it sit unopened. At the time, I did not know what I was afraid to find inside—I was right, though, to be afraid.

Ref. 7113-H

My closer look left me wrecked. It also helped me understand something about accidents that I had not before. It matters who tells the story of an accident. Eric did not live to tell the story of his death, so I never heard how he passed out of this world alone on the cold asphalt. Instead, the man who killed Eric told the story, at least at first—that was a story of a car hitting a person, an accident story, told by a man who had the momentary power of being the survivor. When that man went to prison, the prosecutors who put him there took over the telling—then, the story of Eric’s death was the story of a bad person driving a car, a crime story. In accidents, power, in all its forms, be it a fast car or a plea deal, decides which story we hear. Across the United States, and across history, I found this as a common marker of accidents. The people who tell the story are always the powerful ones, and the powerful ones are rarely the victims.

Ref. B607-I

Accidents are not flukes or freak mishaps—whether or not you die by accident is just a measure of your power or lack of it.

Ref. ECB6-J

This book is about the deceptively simple story told after an accident—that a person made a mistake and there is nothing more to see. And this book is about what we can learn—about ourselves and our society—when we seek out the real and complicated story. By doing just that, seeking the intricate story of power, vulnerability, and suffering behind any simple-looking accident, we can find ways to save tens of thousands of lives every year. We conclude with these solutions to the accident problem—how we can overcome the psychological crutch of victim-blaming, how we can repair the race- and income-based vulnerabilities that skew our accidental death rates, how we can revive the systems of accountability that give accidents a real cost for those who can, and fail to, prevent them, and how we can reimagine our streets, our homes, our hospitals, our workplaces, and our world at large, to put human life above efficiency, profit, or power.

Ref. 9FB7-K

To understand accidents, we need to understand error—not just why we make mistakes, but how powerful people can use our mistakes against us. This book begins with error because questions of error almost always follow an accident—what a person, usually the person hurt, did wrong: Why was he driving so fast? Was she drunk at the time of the accident? Did they not know that the stove was on? Who wasn’t paying attention?

Ref. CDF1-L

Inherent in these questions is a presumption—that accidents are caused by accident-prone people, or people making bad decisions, or people we think did things the wrong way. This is true. People make mistakes, and human error is a part of almost every accident. The following pages are full of errors and bad decisions. The following pages are also full of errors that are inevitable, predictable, and unavoidable, and people making mistakes that you or I would have made, just as they did, because there was no other choice.

Ref. 1DEA-M

Accidents happen when errors occur under dangerous conditions, but you can create conditions that anticipate errors and make those mistakes less of a life-or-death equation.

Ref. 7A7E-N

Sidney Dekker, airplane pilot, director of the Safety Science Innovation Lab in Brisbane, Australia, and professor of aerospace engineering at Delft University of Technology in the Netherlands, describes the two sides of this debate as the “Bad Apple Theory” and the “New View”—the old theory that a few bad apples cause accidents or the new view that if people are making mistakes and getting hurt, it indicates that conditions are unsafe.

Ref. 09F6-O

We are going to jump back and forth in U.S. history, moving between the two most significant increases in accidental deaths prior to the present one: fatal worker accidents in the Industrial Revolution and fatal car accidents in the first fifty or so years after the invention of the automobile. We will see how automakers and industrialists used error as a pawn in accidents. In these two moments, we can track how the powerful profiteers of each era weaponized the simple fact that people make mistakes in an effort to distract, persuade, and manipulate.

Ref. 82C5-P

That’s how questioning “car murder” as anything more than an accident became a politically radical act. It took a concentrated campaign by carmakers to erase their role in the car accident—one that continues today—so that everyone would stop demonizing cars and start buying them. Automakers and sellers, car parts manufacturers, and oil companies fought against restrictions dictating how cars were built, redirecting car-related ire toward the way that people walked or drove. With concerted campaigns about a few bad apples, these interest groups shifted a conversation about the shocking impact of powerful cars on pedestrian-dense city streets into a conversation about crazy drivers and pedestrians who just don’t walk right.

Ref. 7602-Q

To exceed the speed limit is a mistake, a human error. The fact that cars can go so fast is a dangerous condition. And because the auto lobby was so successful in reframing the car-murder narrative, when we talk about speeding, we almost always talk about speeders as the problem, not how fast those cars can go. In the Cincinnati speed-governor fight, the automobile lobby honed their chief tactic. To distract from the public outcry about how fast cars could go, they instead talked about individual recklessness or a few “bad apple” drivers spoiling the whole road. The gun lobby would copy this strategy decades later with the slogan: Guns don’t kill people. People kill people.

Ref. 626A-R

There is nothing which distresses me more than to read in the newspapers about the death of a child in an automobile accident. I am a manufacturer of automobiles, and I have every appreciation of the contribution that the motor car has made to the happiness of the people of America. But I also have an appreciation of the fact that some practical way must be found to check the increase in the number of deaths and injuries resulting from automobile accidents.

Chrysler’s “only cure” was education for the bad apples. He wrote of the need to teach children not to play in the streets, chase after balls, or hitch rides behind vehicles on their roller skates and scooters, and of the need to train police officers to enforce these lessons. The catastrophic rise in deaths and injuries from automobile accidents, he wrote, was a problem of errors made by wild pedestrians, pedestrians in the rain obstructing their vision with umbrellas, and pedestrians crossing the street while reading books.

Ref. 1B7C-S

“They all insisted to me that the fire was an accident, and that it was caused by the owners of the plant, and that they were greedy, and if they were not greedy, nothing would have happened,” Simon says. But this simple story eliminates the history and politics that brought those workers to labor under those conditions. “Those people did not just end up in that plant that day. Historical forces brought a particular kind of person to that plant, and the fact that no one cared about them didn’t just begin that day.” Instead, Simon says, “what we call accidents are in some ways manufactured vulnerabilities.”

Ref. 9B31-T

With both fires, the official story described an owner’s greed. But in both instances, another story passed as a rumor in the aftermath. It went like this: the owners locked the doors because the workers were thieves. In New York in 1911, they allegedly stole fabric. Floor managers searched bags and purses at the end of the workday and kept doors locked to ensure nobody could leave the factory without being searched. In Hamlet, the twenty-five victims died because the owners locked the doors to prevent the alleged theft of chicken.

Ref. E898-U

That big ideological switch did not happen after Hamlet, Simon theorizes, because the people who died were mostly Black women. The lawyer who represented the owners of Imperial Foods recounted to Simon his meeting with the district attorney prosecuting the case against his clients. That prosecutor explained that stealing chicken was just “what these people do.” The victims, the prosecutor said, “were just a bunch of low-down Black folks anyway.” The difference between what happened in response to the death of workers who stole chickens and the workers who stole fabric was that the former were Black.

Ref. BAA9-V

At the white memorial, they prayed for God to deliver the town and for the tragedy to pass. “They’re not callous about it, but they want to see it as a way to bury the past and move on. Implicitly, there is a notion that you can do that, which suggested this was an anomaly,” Simon says. “But for African Americans who are literally living near the plant that is still up with the yellow police tape around it, they understand that there is something much bigger going on here, and it is about their worth in this place.”

Ref. 77BE-W

It is the difference between saying this is a tragic anomaly and this is a tragic inevitability. If it was just an accident, if the dead are just chicken thieves or the factory owners are just greedy, it is far easier to ignore the conditions that drove the owners of that factory to set up shop in that place, far easier to mourn and move on.

Ref. AD4D-X

Chalking this accident up to error was helped, too, by the fact that hardly anyone saw it. Imperial Foods was a small uninspected factory in a rural town, not a multistory building in New York City. And in accidents, this matters. The more people who see an accident, the more difficult it is to distract from how it happened. It is harder to hide dangerous conditions when everyone is watching, which is why, by the time millions of Americans owned a car, it was a lot more difficult for automakers to pretend that accidents only happened because of a few bad apples. They still pulled it off, but this time, instead of a covert campaign to villainize the pedestrian, the refusal to sacrifice profits to prevent accidents played out in plain sight.

Ref. 560D-Y

And this practice continues today. While driverless cars are a ways off, the RAND Corporation estimates that currently available autonomous technologies, such as emergency braking, adaptive cruise control, blind spot detection, alcohol sensor interlocks, and lane control, are so effective that some 17,000 people a year would survive accidents that today kill them, were every vehicle in the United States so equipped.

Ref. 41CC-Z

Hearing this story, you might have questions: How could we be so reckless? What was my mother thinking? Why did I not play somewhere else? These are questions of human error and personal responsibility. Psychologists call your desire to ask these questions—and especially to ask these questions about other people’s accidents—the fundamental attribution error. There is significant evidence that the vast majority of people tend to see their own accidents as a product of the environment they were in at the time, and other people’s accidents as a problem of human error and personal responsibility, even in the face of evidence to the contrary.

Ref. A837-A

However, no matter the environment, you can always find errors; like I said, I shouldn’t have been doing ballerina twirls next to a cliff. You can find a person like me in most accidents—a person who made a mistake, who made a bad decision, or who somehow erred. Errors are present everywhere in this book, and if you are like most people, you are going to have the urge to steer toward these errors—asking questions of personal responsibility and noting the poor judgment you see in other people’s behavior along the way. I warn you now: I will steer you away from this line of inquiry again and again.

Ref. D750-B

This is not because I believe that poor judgment is absent from accidents or that personal responsibility has never saved a life. Rather, I will steer you away because this line of inquiry—blaming human error—is at best an inconsequential and largely ineffective answer to the accident problem. At worst, it actually sets the stage for the same accident to happen again. Of course I should not have been playing by the cliff, but talking about that will do nothing to prevent the next person from tumbling over. Fixing the guardrail will.

Ref. 0EED-C

From the Industrial Revolution onward, powerful corporate interests insisted that fallible people were the source of all accidents. This chapter is about the ones who proved them wrong. A better guardrail, for example, could have prevented my cliff fall entirely, but not my imperfect twirling. The severity of my injuries was not relative to my mistake but to what I collided with—by chance, I fell in the least sharp, hard place. Today we know that by controlling what we collide with and what collides with us, we can control whether we live or die.

Ref. 63F3-D

Crystal Eastman was the first person to back up this point with data. Between the summer of 1906 and the summer of 1907, she cataloged how workers died over the course of a year in Pittsburgh, Pennsylvania, and effectively debunked an idea that corporations had been pushing for decades—that careless workers caused work accidents. It was the first-ever sociological investigation into the accidental deaths of American laborers.

Ref. C60F-E

“Such catastrophes rouse the attention of the public by their magnitude,” she wrote. “But suppose one man shoveling coal in some small ‘room’ far within a mine suddenly lies buried under a ton or two of slate—this causes no comment in a mining community. The sound of such stories is dulled to the ears of the public by monotonous repetition. Indeed, few of these common mining accidents reach the newspapers.”

Ref. C7D4-F

Catastrophes made front-page headlines, but their sensation took up space that distracted from the true nature of the problem—which was that most workers who died did not die a hundred at once but by monotonous repetitions written off as accidents. The deaths of one year of Pittsburgh work accidents reiterated the point. Men died at work in ones and twos, day after day. Eastman wanted people to pay attention to these accidents that lacked the scale to spark the zeitgeist. What of the deaths that disappear? she asked.

Ref. DE5F-G

“Most of the men in the community whose opinions count, have made up their minds about this accident question from what they have heard employers, superintendents, and casualty managers say. In other words, they believe that ‘95 percent of the accidents are due to the carelessness of the men.’ Those emphatic, reiterated assertions, those tales of recklessness often repeated, have grown into a solid inert mass of opinion among business and professional men in the community, a heap of unreasoned conviction,” she wrote.

Ref. 94F0-H

As she cataloged what caused the casualties she investigated, she documented, too, the “chief preventable conditions from which work accidents result.” Failing to create a safe workplace was number one; requiring long hours on the job was number two; three was work made to happen at too great a speed. It was the first time anyone had ever demonstrated that the conditions faced by workers—and not their personal failings—were the cause of accidents at work.

Ref. C3A1-I

The commission came back with much data and one major proposal: workers should no longer need to sue their employer to receive compensation for a work accident. Instead, the commission recommended, there should be a system of guaranteed recompense based on the scale of injury and the potential income lost. After a work accident, employers would be responsible for the wreckage left behind—medical bills, the economic livelihood of a family. It would mean that for employers, accidents would finally have a cost.

Ref. 2289-J

The Triangle fire was not just an accident of magnitude; it was an enormous accident watched firsthand in real time by America’s elite and one whose aftermath played out at length in every American newspaper. Two months later, in Wisconsin, legislators would successfully pass the nation’s first workers’ compensation law, and nine other states would follow that year. New York State amended its constitution to again pass a workers’ compensation law in 1913. Forty-two of forty-eight states would have the law by 1920. In 1925, the five holdouts were all southern states. Mississippi came last, in 1948.

Ref. 452B-K

For employers, this was a massive shift in their economic calculus. Work accidents once cost only as much as replacing a worker. Now the only way for an employer to reduce costs was to reduce accidents. The decline in work accidents was dramatic. Over the next two decades, deaths per hour worked would fall by two-thirds. At U.S. Steel, in the first decade of the 1900s, one in four workers suffered significant injuries every year. By the late 1930s, that number was one in three hundred.

Ref. 30C5-L

On the other hand, maybe there was no such thing as accidents or luck. Out of the hospital, DeHaven examined the wreckage. He realized it was the sharp latch on his seat belt that caused his injuries and that his survival was thanks to being strapped to a cockpit that stood intact through the fall—the other plane’s had disintegrated. His survival, he realized, meant that the human body, properly packaged, could withstand extreme deceleration. DeHaven spent the remainder of his tour on ambulance duty, picking up bodies after car accidents and airplane accidents. He found patterns among the injured and dead, not unlike the ones he discovered in his own survival. The people who crashed while nestled in a safe space were more likely to survive.

Ref. 9500-M

His thinking was that eggs were shipped all around the country. Millions of these fragile things rode trains and trucks, moved from boxes to shelves to carts, and largely remained intact. He wanted to know why a carton of eggs could survive an impact but a human head could not, so he climbed up on his countertop and started to drop his breakfast.

Ref. 2CDF-N

The ten-foot fall from his kitchen counter required far less cushioning. It was not the fall that cracked the egg—it was the floor. The idea was a simple answer to the human error–dangerous conditions debate: Control how the egg hits the floor, and it doesn’t matter if you accidentally drop it. It was all about packaging. A car could be a package. So could a plane, and an elevator, and a miner’s helmet. A skull is a package for a brain.
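DeHaven’s packaging insight reduces to basic kinematics: the deceleration an impact imposes depends on stopping distance, not the fall alone. A sketch of that relationship (the speeds and stopping distances here are illustrative assumptions, not figures from the book):

```python
# Average impact deceleration from constant-deceleration kinematics: a = v^2 / (2*d).
# A longer stopping distance (a crumpling package, rubble, foam) means gentler g's.
G = 9.81  # standard gravity, m/s^2

def impact_g(speed_mph: float, stop_distance_m: float) -> float:
    """Average deceleration, in g's, for a given impact speed and stopping distance."""
    v = speed_mph * 0.44704  # convert mph to m/s
    return v * v / (2 * stop_distance_m) / G

# Illustrative: a 60 mph impact stopped over 2 cm (rigid) vs. 50 cm (yielding).
print(f"rigid surface:    {impact_g(60, 0.02):,.0f} g")
print(f"yielding surface: {impact_g(60, 0.50):,.0f} g")
```

The ratio is exactly the ratio of stopping distances: give the body twenty-five times the distance to stop, and the forces fall twenty-five-fold—the egg, the floor, the package.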

Ref. 4CD3-O

It was, of course, not luck. DeHaven determined that Zito hit the ground at around 60 miles per hour. The secret to his survival was that he fell not on solid concrete but rubble that absorbed and redistributed the energy of his velocity. (Remember when I fell off a cliff? Same thing. The difference between life and death was the give of the surfaces we hit.)

Ref. D24A-P

Fascinated by what seemed like clear causality, DeHaven began to apply his findings about falling airplanes to moving cars. He found like-minded officials at the Indiana State Police and convinced the force to take him in for a year, sharing photos of traffic accidents and coroners’ reports like Crystal Eastman before him. By the end of 1953, DeHaven had compiled a list of the most dangerous parts in any automobile. The killers were pointed knobs, dashboards without padding, steering columns that could not collapse on impact—and the lack of seat belts to prevent people from smashing into all of that. His list was proof that accidents were in our control, regardless of the nut behind the wheel. All that mattered was how automakers built the cars.

Ref. 183F-Q

What happened in the fourteen-year gap was not an accident. If you died impaled on a steering column after 1953, when DeHaven laid it all on the table, but before 1967, when carmakers were forced to finally do something about it, you died because it was cheaper and easier to let you die than to help you live.

Ref. 5435-R

The accident-prone worker is a myth and the nut behind the wheel a clever distraction from the true cause of accidental death and injury—and we miss a wealth of information that could prevent accidents when we fix our attention on these caricatures of personal responsibility. Eastman and DeHaven mined the relevant information, found common causes across accidents of wildly different types, and discovered how to prevent them by controlling the built environment.

Ref. 69C1-S

Denied the power to communicate individual hazards, workers could no longer tell their stories before accidents happened—and in time to prevent future ones. It was a loss of control that resulted in more of the accidents that came in ones and twos—the little deaths no one notices until someone like Crystal Eastman or Hugh DeHaven gathers them up.

Ref. 99ED-T

A single plane, seating far more people, could net a much larger sum with every flight. And a single mistake at any stage of the process, from the drawing board to the assembly line to the cockpit, could produce massive loss of life. These factors grew in tandem. The jumbo jet, more complicated to fly and to build than earlier generations of aircraft, multiplied the dangerous conditions that could lead to an accident and in turn multiplied the number of people who would die when Murphy’s Law inevitably kicked in.

Ref. D8D3-U

Advances in technology can create built environments that are larger in scale and more complex in operation. The accidents that happen in these environments tend to grow to match. The Triangle Shirtwaist Factory fire was one of the worst accidents in its day; it destroyed one building and killed 146 workers. The explosion of the Monongah mine in 1907—known as the worst mining disaster in U.S. history—killed more than 300. The Chernobyl nuclear power plant meltdown occurred seventy-nine years later, bringing an almost unimaginable increase in the scope of destruction. The range of radioactive contamination includes 40 percent of Europe and parts of Asia, Africa, and North America. Researchers estimate that 41,000 people will develop cancer related to the accident by 2065; it will kill more than a third of them, some 16,000 dead. When we talk about large-scale accidents today, we mean mass death and destruction—commercial plane accidents, oil spills, nuclear accidents—ecosystems decimated and hundreds or thousands dead in one fell swoop. These aren’t just accidents but accidental disasters.

Ref. 07F6-V

“It’s not just nuclear. It’s the Deepwater Horizon. It’s any of these high-technology systems that we take for granted. When they go wrong, they really go wrong big-time,” Gundersen tells me. “Sooner or later, in any foolproof system, the fools are going to exceed the proofs. It’s inevitable as you build more proofs into the system—it delays, but cannot prevent, an ultimate catastrophe.”

Ref. AD1F-W

Accidents in America come in two sizes, inversely related to their likelihood. Most are small and frequent—a drug overdose, a traffic crash. Some are big and rare—an oil spill, a plane crash. Crystal Eastman’s message from Pittsburgh was that the minor accidents, being unsensational, were ignored and thus added up to a massive toll.

Ref. 62AD-X

“The industry, and by industry I do not just mean the nuclear industry—oil and gas, and some chemical industries, all use the term ‘accident,’ because that gives them a path: Oh, it was an accident; we couldn’t help it,” Maggie says.

Ref. 54DF-Y

And they can definitely help it. “At Fukushima,” Arnie explains, “there were warnings for decades that a forty-five- or fifty-foot tsunami could hit the coast, but they built a fifteen-foot wall because building a higher wall costs a boatload of money. The same thing with the Deepwater Horizon—they knew that the shut-off valve was not working, but they didn’t want to spend the money to do it right. When you’re looking at large accidents, it’s a matter of management. In their minds, they discount the probability of failure and focus on the short-term gain.”

Ref. 4802-Z

What the governor meant, I think, was that he was no Chicken Little. Three Mile Island was a large and unlikely accident—many individual things went wrong, adding up to a meltdown—and for the governor, unlikely mattered more than large. He did not feel at risk, so he decided that the city was not at risk.

Ref. 4659-A

Imagine a stack of Swiss cheese. The cheese is full of holes, right? But the holes are unique to each slice, so those holes rarely line up when you stack slices on top of one another. This stack of cheese was James Reason’s analogy for how, in complex technologies, large-scale accidents occur. For example, in a nuclear power plant, each safety system is a slice of Swiss cheese. An alarm is a slice, a cooling system is a slice, and each indicator light is a slice, too. A well-trained operator is a slice of cheese, and so is their boss, who listens when the operator warns of danger, and their union, which makes the boss listen. The holes in all the slices rarely align, but when they do, catastrophe results.

Ref. 341D-B

One hole in one slice won’t bring down the whole house. Ideally, in a safe system, disaster will be prevented by the other slices. When a failure slips through a hole in one slice, the malfunction should smack firmly into the next slice, halting the progress of disaster. Today, we build the most technologically complex parts of our built environment so that if one system fails, it doesn’t cause all other systems to fail—a faulty indicator light will not by itself cause a nuclear meltdown. But in Reason’s model, accidents happen when all the holes, by chance, line up—the failures cascade, and the nuclear power plant melts down. Safety systems fail individually and in failure are discovered to be connected in unpredictable and surprising ways. Reason called this alignment “a trajectory of accident opportunity.”
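Reason’s stacked-slices intuition can be sketched as a toy probability model (the per-layer failure rate and layer counts below are arbitrary assumptions for illustration, not figures from the book): independent layers rarely fail on the same occasion, but the aligned-holes case never reaches zero.

```python
import random

# Toy Swiss cheese model: an accident propagates only if every safety layer
# independently fails -- its "hole" lines up -- on the same occasion.
def accident_probability(p_hole: float, layers: int,
                         trials: int = 100_000, seed: int = 0) -> float:
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < p_hole for _ in range(layers))
        for _ in range(trials)
    )
    return accidents / trials

# Assume each layer fails 10% of the time. Stacking layers drives the joint
# failure rate down fast, but never to zero -- Reason's "trajectory of
# accident opportunity" remains open.
for n in (1, 2, 4):
    print(f"{n} layers: ~{accident_probability(0.10, n):.4f}")
```

The model also shows the flip side Reason warned about: because independence is assumed here, it understates risk in real systems, where one bad management decision can punch holes through several slices at once.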

Ref. 2564-C

First, multiple things went wrong, and second, those various things interacted in unexpected ways. The designers of a nuclear power plant would consider the cooling system, the relief valve, and the control room indicator lights each a layer of safety if functioning correctly. And at Three Mile Island, each protective layer failed in a way that affected all the others.

Ref. F074-D

The role of human error in all this, Reason wrote, is “that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking.” The worker who supposedly dropped the cigarette at the Triangle Factory, or the nuclear engineer mistaking the meaning of a green light on the control board at Three Mile Island, are not instigators of accidents. Instead, they are “the inheritors of system defects created by poor design, incorrect installation, faulty maintenance and bad management decisions.”

Ref. 981E-E

As each failure passed through hole after hole, the failures interacted in surprising ways.

Ref. 4B98-F

Seeing the Titanic’s demise as the result of a single failure—one hole in one slice of cheese—actually caused more fatal accidents. Similar to nuclear reactors, giant ships are complex systems. To understand why complex systems fail, we need to track backward through systems that should have kept people safe—lifeboats, hull strength, navigational procedures, sharing doors as flotation devices—and see how the failures interacted to lead to a catastrophe.

Ref. C25C-G

Reason was using a stack of Swiss cheese to explain what is known as a complex system accident. A sociologist named Charles Perrow first conceived this concept—that accidents were not chance mishaps but systemic inevitabilities—as the dust settled on Three Mile Island. Ten years before James Reason pitched his Swiss cheese analogy, Perrow investigated the organizational issues that had led to the meltdown. While the commission formed by President Carter to investigate the accident chalked it up to “operator error,” Perrow’s analysis would birth a new understanding of the accidental by rejecting the idea that a single human error could cause a large-scale accident.

Ref. 5398-H

In analyzing Three Mile Island, Perrow wished to avoid this oversimplification. What went wrong at the nuclear facility was not a problem of human error, he concluded, but of the complex interactions of dangerous conditions and fragile safety systems. From his analysis of Three Mile Island, he developed a framework for understanding large-scale accidents called Normal Accident Theory. It said, in short, that accidents were inevitable in systems of a certain complexity.

Ref. 4DE5-I

Perrow defined complexity in terms of coupling. In a “tightly coupled” system, it is hard to separate one failure from the next. In a “loosely coupled” system, this is easier. A factory, for example, is a loosely coupled system—if rags catch fire and you put out the fire, disaster is averted. A nuclear reactor is a tightly coupled system; its workings are so hidden and complex that one mistake is likely to trigger another unanticipated error. The more complex and tightly coupled the system, the more likely that one problem would connect to another problem in a surprising way that may result in catastrophe.

Ref. B67B-J

He found patterns and published a book about them all, Normal Accidents: Living with High-Risk Technologies—applying his theory about Three Mile Island to large-scale accidental disasters of all stripes. It held every time. There was a connection between chemical plants, dams, big ships, and airplanes—a complexity of interactions that made accidents inevitable, or “normal,” as Perrow put it. No single human error was to blame for a large-scale accident. To meet growing needs for goods, transportation, technology, and energy, we build systems so complex that accidents must be expected. All these accidents were born of systemic conditions. Since Crystal Eastman went to Pittsburgh, no one in the United States had so comprehensively demonstrated that so many very different types of accidents were in fact rather alike.

Ref. 2961-K

What We Mean When We Say Large

Ref. 6BE8-L

record. “We have reduced the frequency, but we have increased the consequences,” Arnie points out. “That seems to be what happens with these high-risk endeavors, like space shuttles and oil rigs. They’re farther apart but the consequences become more severe.”

Ref. 1705-M

We are solving problems over time in all the stacked safety systems that make up our most complicated technologies—better training, better control room designs, all of it. But all those layers of safety are being added less quickly than our energy demands are growing. While we invent individual survival mechanisms for extremely unlikely accidents—such as a life vest that can fit under the seat on an airplane—we still accept potential group risk so significant that mere exposure is not survivable. We do this, allowing for accidental horrors that are both massive and inevitable, in part because our moods and emotions distort our perception, and we cannot grasp what “likely” really means.

Ref. EC59-N

“As the magnitude of the tragedy of an event goes up, the level of compassion people express for it doesn’t track,” DeSteno tells me. “What’s happening is people begin to feel overwhelmed by the size of the tragedy, and they want to turn away. They want to ignore it.”

Ref. A4D6-O

When we lack the information to explain a complex accident, DeSteno explains, our brains estimate based on what we do know. And without clear memory or factual information, we lean on feeling. This is evolutionary—if you learn to fear when you see a tiger, you are less likely to be killed by a tiger. Emotions shape predictions about what comes next in hopes of steering us toward survival. But the rarity of large-scale accidents means that we likely will not have a memory to cite, and our predictions will veer off course.

Ref. A8D1-P

Like the employer blaming the accident-prone worker in the small accident, the politician or corporate CEO insisting this is no big deal is a fixture of the large-scale accident. You could say, of course, that it’s essential to calm the public. Except that defusing our panic can also diminish how much we care about a disaster. Our empathy fatigue is a gateway, and powerful people walk right through. When we listen to a politician or an industry official explain why an accidental catastrophe is actually minor or how volunteers are standing by for the cleanup, we are relieved of upsetting feelings about the disaster. As a result, we’re less concerned with how and why the accident began.

Ref. EB67-Q

Still, all this scrubbing serves a purpose. It is oil spill response theater, with the message that these accidents are fine because they can be cleaned up. Pretending that we can clean up an oil spill is one way that oil companies make the risk of an oil spill feel less dire. But what that risk encompasses may also be a mystery, even to the experts. Large-scale accidents come with an unknown number of small ones. To look at the “little” in big accidents, we need to talk to the person with the best job title in this book: Prosanta Chakrabarty, curator of fishes.

Ref. B6FE-R

Planktonic marine organisms are perhaps the smallest death to result from the spill. The discovery of their potential mass extermination a decade later shows that Charles Perrow and James Reason were right. Accidents are layered, and when things go wrong, the failures intersect in complex, surprising, and often unseen ways. This is true of large-scale accidents such as nuclear meltdowns and oil spills, but it is also true of accidents that, at first, might not appear

Ref. 807D-S

physically confined systems. But the more accident stories I read, the more I became convinced that all accidents are systemic and normal—routine failures of design, management, and organization in complex settings—and that a system could be far less tangible than a single factory or reactor. A network of highways could be a system, or an ocean, or an energy market. To really understand accidents, I realized, I needed to apply Perrow’s and Reason’s theories to systems beyond just physical, confined spaces—to social systems, economic systems, and the entire built environment.

Ref. 009E-T

These are normal accidents that grow out of complex and surprising interactions of large-scale systems—the energy market, the drug market, the rural highway system unprepared for an influx of people. The four walls of a factory do not confine these systems. Instead, they are social systems, economic systems, and organizational systems. All accidents are systemic, but to understand some of the systems, we will need to zoom far out from one man falling off an oil rig, or another, flush with cash, taking too much of a drug. Racism is a system, and so is stigmatization, and so is the federal infrastructure budget. The accidents themselves will not get larger in scale than the accidents in this chapter, but the systems that can cause accidents will get more encompassing, and the risks will get harder to control.

Ref. 33D4-U

One way to measure the risk of a person being killed or injured in an accident is to estimate crashworthiness—the degree to which any container you may be strapped inside, be it a car or boat or plane, resists allowing injury to occur. This can be measured by the presence of protections, such as airbags and cushioned seat backs, and by the absence of dangerous conditions, such as sharp knobs or cockpits that disintegrate on impact. Hugh DeHaven perfected a test of this in the 1950s as part of the Automotive Crash Injury Research project he founded at Cornell University. First, he would put crash test dummies inside a car, and then he would crash the car. How those dummies fared in the crash told him how at-risk you or I would be in a similar car at a similar impact. This sounds straightforward, but it’s not.

Ref. 70A7-V

Risk perception experts like Fischhoff and Slovic would argue that we as individuals are pretty good at perceiving and avoiding risk. More often than not, however, when accidents happen, the risk is taken out of our control. Consider the passengers on the DC-10, who didn’t know about the whistleblowing engineer who tried to ground their plane. Or the chicken processing plant workers who had nowhere else to work if they were unwilling to work behind a locked door in a factory never inspected by OSHA.

Ref. 32F6-W

accident. The risk was not just taken out of these people’s control—these people didn’t know the danger they were exposed to or the resources available to avoid it.

Ref. CC70-X

We can understand these two factors as actual risk and risk perception—the dangers we face and how we understand those dangers—and the way that these two factors interact is very important. Risk perception experts have found that feeling in control of a situation makes us feel less at risk—regardless of actual risk. Because the DC-10 passenger or the chicken plant worker or the Ford driver did not know about the dangers they faced, risk was both present and imperceptible, doubly unavoidable and out of their control.

Ref. A8A4-Y

The Benefit of Feeling in Control

Ref. ECB0-Z

“You feel if you own a gun that you are in control of your safety,” Paul Slovic tells me. “There are a lot of elements that are out of your control, but that sense of control is very powerful in both diminishing your perception of risk and increasing your acceptance of risk.”

Ref. FF90-A

People are also more willing to tolerate a risk if it comes with benefits attached—another reason we underestimate the risk of a car accident. Consider our strong and opposing reactions to two chemicals: pesticides and medicine. Surveys conducted by Slovic and Fischhoff found that most people considered pesticides to have little benefit and high risk. People thought of medicine in the opposite way—little risk and high benefit. In reality, both are chemicals. Both are beneficial, and both are dangerous. Arguably, in terms of our individual survival, pesticides are statistically less risky than medicine—an average of twenty-three deaths a year are directly attributable to pesticides in the United States, while around thirty-eight people die every day from prescription opioid medication alone. But because we have bad feelings about pesticides, including the fact that exposure to them is out of our control and we don’t directly observe their benefit (unless we are farmers), we judge them as riskier. With medicine, we get the direct benefit of feeling better, and we are entirely in control of when we take it, so it feels like less of a risk. In this light, it’s easy to see why the risk of a gun, which conveys a sense of security, and the risk of driving, which offers the benefit of convenience,
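The pesticide-versus-medicine comparison above is easy to check with back-of-envelope arithmetic, using only the figures the passage quotes (twenty-three pesticide deaths a year versus roughly thirty-eight opioid deaths a day):

```python
# Back-of-envelope check on the figures quoted above; the inputs are
# the book's numbers, not independently sourced data.
pesticide_deaths_per_year = 23
opioid_deaths_per_day = 38

opioid_deaths_per_year = opioid_deaths_per_day * 365
print(opioid_deaths_per_year)  # 13870

# The "safer-feeling" chemical kills roughly 600 times more people.
print(round(opioid_deaths_per_year / pesticide_deaths_per_year))  # 603
```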

Ref. B295-B

Of course, these are only perceptions. Whether we are purchasing firearms or taking a pill, we feel more in control and less at risk, but that does not mean we are more in control and less at risk. The people who design the road and manufacture the gun and package the medicine primarily determine our actual risk—but those people have the same problems perceiving risk as the rest of us do.

Ref. 37D5-C

Knowing what we know about systemic accidents, the mismatch of risk and perception in the hands of experts is a frightening premise. Applied to a whole system—whether that system is how engineers design roads or weapon companies design guns—the consequences of misperceiving risk get multiplied across the population. If you drive too fast, for example, because you feel in control and stand to benefit from it, you minimize your perception of the risk and then expose yourself and maybe a few people around you to increased risk. But if you’re a traffic engineer designing a road, your risk perception could affect far more people and result in countless accidents over the decades-long life cycle of a road. The risk perception of a person with power can create dangerous conditions for us all.

Ref. A535-D

There is loads of evidence that traffic accidents repeat in similar places and similar ways until conditions change. When a traffic engineer changes the design of a street—say from one that makes drivers feel comfortable speeding to one that makes drivers feel at risk unless they slow down—accidental deaths and injuries on that street decline in dramatic ways. This tells us that a traffic engineer can prevent accidents. Still,

Ref. C716-E

The problem is that most U.S. road engineering guidelines were written in the 1950s and ’60s, when the U.S. government was in the process of building 41,000 miles of interstate highways as part of the Federal-Aid Highway Act of 1956. At the same time, Ralph Nader was rallying a case against automakers. One tactic automakers used to redirect the heat they faced then was to blame the nut behind the wheel, as we’ve seen. The other tactic was to insist that cars were not inherently dangerous; road design was. Automakers elevated the importance of traffic engineering. But the authors of America’s earliest traffic engineering rule books had limited ability to measure the risk of the roads they were building. The guidelines used to design roads today were written at this intersection: limited ability to measure risk, an elevated focus on road design as the secret sauce of traffic safety, and a well-funded and myopic focus on interstate highways.

Ref. 4499-F

today. “Most urban roadside crashes are not the result of random error,” Dumbaugh tells me, “but are instead systematically encoded into the design of the roadway.”

Ref. B144-G

too fast, people died in traffic accidents. The design of the road induced the errors.

Ref. A5FB-H

Dumbaugh had been taught that roadside hazards were the risk, but the actual hazard is the shape of the street and the speed it encourages. The risk that engineers build into a new roadway only increases over time as development fills in the area around the road.

Ref. 816F-I

They decide the new speed limits by conducting a study that looks at how fast everyone is driving on the street. Then, they chart those speeds by frequency. Almost always in these studies, the majority of people are found to be driving at a similar pace, but around 15 percent are found to be driving much faster than everyone else. Traffic engineers use this latter group as their limit—setting the speed limit at the low end of how fast the fastest 15 percent drive, which is the high end of how quickly the other 85 percent drive. They call this the 85th percentile speed. It is how engineers set speed limits on major roads nationwide.
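The 85th-percentile procedure described above is, at bottom, a percentile calculation over a spot-speed survey. A minimal sketch using a nearest-rank percentile (the survey data and the `percentile_speed` helper are illustrative assumptions, not an official traffic-engineering procedure):

```python
def percentile_speed(observed_speeds, pct=85):
    """Nearest-rank percentile: the speed at or below which pct percent
    of surveyed drivers travel (integer arithmetic avoids float rounding)."""
    speeds = sorted(observed_speeds)
    n = len(speeds)
    k = (pct * n + 99) // 100 - 1  # ceil(pct * n / 100) - 1, 0-based rank
    return speeds[k]

# Hypothetical spot-speed survey in mph: most drivers cluster in the
# low 40s, but a fast minority drags the 85th percentile up to 58.
survey = [38, 40, 40, 41, 42, 42, 43, 43, 44, 44,
          45, 45, 46, 48, 52, 55, 58, 60, 62, 65]
print(percentile_speed(survey))  # 58
```

Under this method, the handful of fastest drivers—the top 15 percent—are exactly the ones who pull the measured percentile, and therefore the posted limit, upward.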

Ref. 3E65-J

“We look at how fast cars are going and we assume that is the safe speed of the roadway,” says Dumbaugh. “Note that this has no safety basis: it’s simply assumed that most people don’t want to get into a crash and are thus doing what it is safe for them to do.”

Ref. D0AC-K

Most speed limits are not based on physics or crash test expertise but simply the upper limit of what most amateur drivers feel is safe. A speed limit is the perceived safe speed of a road, not the actual risk of traveling that speed on that road. The experts, the engineers, those in control, do not set speed limits to limit the risk of a road or design the road to limit speed. Instead, the experts design a wide, straight road that encourages speeding and decide the speed limit based on how fast nonexperts drive there. One reason that this problem goes unnoticed is that it is not classified as a problem. For example, in 2018, the National Highway Traffic Safety Administration attributed the deaths of about 9,000 people in traffic crashes to speeding. More than 36,000 were killed that year, but if any of those were attributable to a speed limit set too high, those deaths would not be attributed to speeding—those drivers, after all, were just following the rules of the road.

Ref. 5240-L

Engineering schools across the country still teach these rules, and the graduates of those schools still believe that following them will reduce risk for drivers. But these guidelines also give engineers protection. By following the rules, however outdated or unproven they may be, engineers shield themselves from lawsuits. Engineers build roads that put us all at risk of an accident and thus protect themselves from the risk of legal action. When a traffic accident kills someone on an unsafe street, the engineer can claim, accurately but dangerously, that they were just following the rules.

Ref. 7B91-M

“I say this as somebody with a PhD in civil and environmental engineering from a wonderful university. Traffic engineering is a fraud discipline,” Dumbaugh declares. “It presumes knowledge on road safety that it doesn’t have and it educates people generation after generation on information that is incorrect. So it allows people to graduate with this degree, with a designation that they are an expert in the field and which they have no substantive knowledge of whatsoever. Most traffic engineering programs in the U.S. do not have a single course that covers the issue of road safety. If they get it at all, they get it, at best, as a lecture—it is part of a class on traffic engineering, on building streets for high speeds.”

Ref. CED9-N

The gun is a SIG Sauer P320. It is the subject of YouTube experiments not only because it can fire at random but because it is a widely owned weapon. Or at least it was. After SIG Sauer signed a half-billion-dollar contract with the U.S. Army to make the P320 a service weapon, police forces around the country adopted it, too. Gun enthusiasts followed suit. The problem was that in the rigorous testing done before the purchase, army officials found that when you drop the P320 so that it hits the ground right around its rear sight—that outer corner of the L of a handgun—it can fire a bullet without anyone pulling the trigger.

Ref. 86EF-O

As lawsuits mounted, SIG Sauer launched the P320 Voluntary Upgrade Program. On their website, the FAQ began with the most crucial question:

Q: Is my P320 safe in its current configuration?

A: Yes. The P320 meets and exceeds all US safety standards. However, mechanical safeties are designed to augment, not replace safe handling practices. Careless and improper handling of any firearm can result in an unintentional discharge.

It is a bit

Ref. FBE2-P

Semiautomatic pistols are classified by what makes them fire. Unlike a pistol with an external hammer, a striker-fired weapon like the P320 is under spring pressure to fire. Once a striker-fired weapon is cocked and loaded, Bagnell tells me, it is a high-risk situation, very much like a bow pulled back with an arrow in it, and any mechanical or design defect is an accident waiting to happen. Then he interrupts himself. He doesn’t want to call it an accident. “Not when it happens this many times. The word ‘accident’ is gun industry jargon that serves to disguise negligence in design and manufacturing—negligence on the part of the manufacturer, the corporation, not the law enforcement officer, not Marcie, not any civilian,” he tells me. “It has happened so many times that this weapon should have been recalled years ago.”

Ref. DF52-Q

Corporations recall products, from Tylenol to Cheerios, all the time—by government force, or voluntarily, in anticipation of government force. Yet guns never are recalled, because they have a unique privilege: no government agency polices their safety. There are no federal standards for gun design. In 1972, the National Rifle Association and gun manufacturers convinced Congress to forbid the Consumer Product Safety Commission—which sets safety standards for commercial products and issues bans and recalls for hazardous ones—from including gun safety in their remit. Guns are permitted to go off by accident. This completely upends a gun owner’s perception of risk. A trained gun owner understands risk as a matter of how they handle their weapon, but training made little difference for someone like Officer Vadnais. SIG Sauer took the risk out of her control because her weapon fired when she wasn’t even handling

Ref. 12E7-R

Remember what Slovic and Fischhoff uncovered about risk and feeling in control? Owning a gun or driving a car makes us feel in control. Those are our hands at two and ten. We decide when to pull out into traffic. We buy the bullets. We pull the trigger. With guns, this feeling extends beyond the physical aspects of controlling the weapon. Owning a gun also makes people feel in control of more intangible things, such as their own safety. In fact, most gun owners cite protection as their prime reason for owning a gun, and this has increased over time in inverse proportion to the crime rate—people are more likely to buy guns for safety the safer our country gets.

Ref. F22B-S

Psychologists researching why people buy guns have identified two common feelings among the purchasers: a specific perceived risk of being attacked and a diffuse perceived risk of a dangerous world. These are not actual risks but perceptions—which is why gun ownership increases in times of well-publicized civil unrest, like the sustained period of Black Lives Matter protests in the summer of 2020, which brought about an all-time high in background checks, the main metric for gun sales. And even though gun owners may be deciding to purchase guns to increase a sense of control, gun owners are more at risk of accidentally shooting someone or being accidentally shot. Two researchers in Massachusetts tracked a spike in background checks for gun purchases and accidental shooting incidents five months after the Sandy Hook massacre. Gun shops sold 3 million more weapons in that period compared to the same period in previous years, and accidental shootings killed sixty more people than usual. Of the dead, around twenty were

Ref. 9C81-T

Guns purchased in response to perceptions of risk and a desire to be in control left people at greater risk.

Ref. 40F5-U

When we measure risk in our brains, we measure our perception of control as much as anything. But not only are our risk perceptions sometimes wrong, we also may not actually be in control. With the SIG Sauer P320, the risk was out of the control of owners of that one gun because gun lobbyists successfully exempted all firearms from the U.S. regulatory system. Medicine is the opposite—comprehensively regulated in a way that makes people generally feel safe about having medications in the house. But regulation can’t make medicine safe if regulators ignore known risks.

Ref. 299E-V

We also live in a society, and part of that arrangement is trusting that people with power over risks that affect all of society will guard us from hazards that are known. Where protective solutions against risks exist, we expect government officials and corporate executives to implement those solutions. That is not what happened for Maisie Gillan.

Ref. 4D3B-W

When we spoke on the phone, Adam and MaryBeth were both home on parental leave. They recently had a baby. With a newborn in the house, after losing Maisie, they explain that risk feels different now. If before they were cautious, now they parent under conditions of post-traumatic stress. Adam talks about the stories you hear of parents accidentally leaving kids in their car seat—forgetting, somehow—and these haunt him.

Ref. DCDF-X

much more present for me.” Rare as it was, what happened to baby Maisie was not providence or coincidence. Her parents took no risks, but others did: the older woman who dropped the pill but failed to pick it up, the neighbors who owned the home. But this risk, too, is baked into the system in the way that high-risk drugs such as methadone are distributed and packaged.

Ref. 971B-Y

The phrase “parents know what’s best for their children” is often invoked to drive home the point of who is responsible when children get hurt, but in an era when prescription opioids are aggressively marketed to all Americans, children are killed…

Ref. 9913-Z

“The term ‘accident’ is a four-letter word for those of us that practice the discipline of injury prevention,” he tells me. “Whenever we are meeting or writing policy or advocating for improvement, that word will never…

Ref. AA5E-A

For four decades, Tenenbein has been trying to change policies such as the one that allowed Maisie Gillan to come across a methadone pill. He tells me about a law passed almost fifty years before Maisie was born, in 1970—the Poison Prevention Packaging Act. Under the law, medicine would be under the jurisdiction of the Consumer Product Safety Commission. The law required the…

Ref. 4D52-B

Before this, when a child died because they got hold of medication, the medication’s manufacturers would have just chalked the accident up to parents who weren’t careful enough—another human error story. But the Poison Prevention Packaging Act created a system for designing and testing pill containers. The designers brought in little children and older adults to see if they could open the pill bottles—to pass, a design would need to be impossible for the first group but feasible for the second. These tests…

Ref. 3920-C

This legislation was a huge success—proof that this was not a problem of parent error. In the first twenty years, the rate of children poisoned by accidentally eating medication fell by half; the number of children accidentally killed declined by 75 percent. Researchers would later estimate that the law prevented 200,000 accidental child poisonings in the first decade. But from the 1980s to today, the law accounted for no more major decreases. The law didn’t become less effective; high-risk pharmaceuticals…

Ref. DB65-D

More than six hundred children under the age of five have died this way in the last twenty years, poisoned by…

Ref. 402C-E

In 2005, Tenenbein set out to find a more effective approach. He found it in iron pills. After a rash of iron poisoning deaths in young children in the 1990s, the Food and Drug Administration (FDA) mandated blister packs for iron tablets—the ones that only pop out one pill at a time. Tenenbein notes that this is a form of passive safety—like an airbag, compared to a seat belt that works only if you buckle it. A person doesn’t need to do anything to be made safer by an airbag or a blister pack. Of course, we cannot know what would have saved Maisie, but if a known dangerous pill is popped out one at a time, we can safely say that the risk of a misplaced pill would decline quite a bit. The evidence shows a direct risk reduction. Tenenbein examined iron poisoning deaths in young children before and after blister packs were introduced and found…

Ref. 6F9C-F

For Tenenbein, this regulation is the key. Drug companies will not volunteer to change anything, he tells me, without the government forcing change upon the entire industry. No corporation will risk spending money where others do not, preventing accidents but losing their market share, not even to save lives.

Ref. 5412-G

But it was only the year prior, in 2018, after the opioid epidemic had been growing for two decades, that Congress passed the SUPPORT Act. The U.S. government only declared the opioid epidemic a public health emergency the year before that, in 2017. Congress passed the very first piece of national legislation in response to the crisis in 2016. In the seventeen years before that, close to half a million people were killed in accidental drug overdoses, and drug poisoning went from being the third leading cause of accidental death to being the first.

Ref. C889-H

people from an accident differs depending on the person to whom it happens—an innocent child accidentally ingesting a pill merits a response different from that accorded an adult who uses a drug and who accidentally overdoses. In this way, risk exposure can be a moral judgment. The difference between a rapid response to one accident and a seventeen-year wait before responding to accidents that were killing tens of thousands a year represents how we feel about the people having those accidents. We are especially willing to let accidents happen to some people.

Ref. 0CE0-I

In 2021, one of those efforts became law in Monroe County, New York, where they live: Maisie’s Law would require pharmacists to offer a dose of naloxone, the opioid overdose reversal drug, with every opioid prescription.

Ref. 0CA0-J

The reason that leaving unused naloxone around is problematic is not that it can cause harm—there is no risk; unlike an opioid, accidental naloxone ingestion is harmless. Instead, government officials argue in favor of restrictions on naloxone because, they say, it may encourage drug use. This is similar to the debunked idea that giving teenagers condoms will encourage them to have sex. What is behind both of those arguments is stigma, in this case in the form of government officials who believe that teenagers and drug users are insatiable and without willpower. At least when it comes to drugs, most Americans believe the same—that access to naloxone encourages drug use. Unspoken in this belief is the idea that some accidental overdoses might actually be a bit intentional.

Ref. 9403-K

Stigma is slightly different from the human error we’ve discussed until now. In previous chapters, when we saw automakers and factory owners blame human error for accidents, it was related to the task at hand—the nut behind the wheel is only a nut while driving, and “accident-prone worker” is only a relevant insult at work. Stigma is less conditional, an intrinsic flaw that extends beyond the task at hand to define who you are in all circumstances. Race, class, and gender are often stigmatized, as is drug use. Unlike the accident-prone worker or the nut behind the wheel, we may consider a person who uses drugs to be “a drug addict” even when not using drugs—this is why you may feel that a car crash is an accident but that an overdose is not. We have covered how accidents happen. Stigma is one reason why we let them happen.

Ref. 9BE3-L

Stigma is what doctors call a “fundamental cause” of health disparities—an inescapable reason why some people die by accident and others do not. Drug-related accidents can occur in different ways and have different outcomes, such as accidental overdose or accidental disease transmission from shared equipment. There is also “accidental addiction,” a term used to describe some less stigmatized people who use drugs (namely white people whose addictions began with a prescription) to absolve them of the stigma of drug use. The “accidental addict” is an example of how powerful the stigma of drug use is—so powerful that there’s a special term to absolve certain people of it.

Ref. 66E5-M

Erving Goffman is the sociologist who coined the modern understanding of the term “stigma.” He defined it as a “spoiled identity.” Society ostracizes stigmatized people for a characteristic that becomes all-encompassing—a single trait dictates how we judge a whole person. And, importantly, when something goes wrong, the stigmatized, because of their flawed character, are blamed.

Ref. 3A28-N

Goffman also defined two types of people who interact with the stigmatized: the normal and the wise. By “wise,” Goffman meant people who see and understand the reality of the stigmatized; the wise empathize with their pain. The “normal” are people who see the stigmatized by the word’s definition—as tainted and at fault. Goffman considered most people to be “normal.” Research shows that he was right.

Ref. 8368-O

But as Goffman said, this is “normal.” Even twenty years into the opioid epidemic, when more than one in ten Americans knows someone killed by the drug, these stigmas are common. Only a slim majority of Americans see drug addiction as a disease. Less than 25 percent would accept a drug-addicted colleague. Less than one in five would associate with a friend or neighbor who was addicted to prescription drugs. A sizable majority believe that people who are addicted to drugs do not deserve the right to equal employment. Four in ten say that health insurers should deny the addicted equal coverage. One in three people say opioid addiction is a character defect or the by-product of shoddy parenting. Two in five say that the addicted simply lack willpower.

Ref. 137E-P

fentanyl. As a drug user, he had an understanding of the risk that did not match the actual risk. But Allen did know. Her story is interesting because she incriminated herself in an effort to prevent an accidental overdose. She revealed her role as a drug dealer in hopes of avoiding an accident. This act appears to be a marker of goodness and redemption that should be powerful enough to erase some of the stigma of being a drug user and a drug seller, but it was not. A judge sentenced Allen to ten years for involuntary manslaughter and drug sales.

Ref. 0D89-Q

“There seems to be some basic need at the core of human beings to (1) assign blame and (2) persecute others,” she writes. “It’s how we cope with our problems, relate to one another, feel superior, forgive ourselves, protect our beliefs, and accept our situations.”

Ref. EA45-R

Allen’s prosecution is part of a trend that has risen with the opioid epidemic. Prosecutors treat accidental drug overdoses as murders and charge the friends, partners, and dealers who sold or shared the drugs with that crime. The simple fact that we stigmatize Allen, that we are less likely to see or empathize with her pain, also supports sending her to prison.

Ref. 38AE-S

The company would continue to advertise the drug explicitly as non-habit-forming for years until the federal government prosecuted them for that lie. But Richard Sackler understood that stigma was strong enough to let him continue to profit from an addictive drug. His “criminal addict” is not so different from the jaywalker, the accident-prone worker, or the nut behind the wheel. Each is a way to blame human error to distract from dangerous conditions that lead to accidents.

Ref. AC9C-T

Tools needed to use drugs safely, such as clean syringes, are often illegal to own, making accidental disease transmission more likely. Medication to stop an overdose, such as naloxone, is often impossible to buy. Drug illegality makes people feel unsafe calling for help, making accidental overdose more likely to be fatal. And medication that helps control addiction, such as methadone, is extremely restricted in who can administer it, and when and where it is available, so assistance may require two hours of driving a day, making continued drug use simply a more plausible choice for an addicted person.

Ref. 8CBF-U

Researchers estimate that, at most, 4 percent of fatal drug overdoses are intentional. And researchers have also found that the word “accident” can act as a kindness in mourning. In a comparative survey, parents who lost a child to overdose suffered more grief and worse mental health outcomes than parents of children who died of other accidents. The study authors concluded that these…

Ref. AD36-V

When I say that there are no accidents, I mean that everything we call an accident is predictable and preventable. When a person who uses drugs says that an overdose is an accident, they mean that what happened was not intentional and was regrettable, and that no one wanted anyone to die. If anyone should be allowed to say “it was an…

Ref. 451A-W

When we protest the word “accident,” it should matter who has been hurt and who is telling the story. This book is full of powerful people who say “it was an accident” to keep making money, to avoid admitting fault, and to not have to be accountable for the people they kill and injure. In so many cases, “it was an accident” is a phrase that absolves powerful people of…

Ref. 2761-X

But when a powerless person says “it was an accident,” the phrase can take on other meanings. It can mean that an overdose was unintentional or that any consequence was regrettable—it can be a way to say: I didn’t mean it. And if “accident” can offer that person some…

Ref. 816B-Y

Systemically, however, there is nothing unpredictable or unpreventable here. Addiction is not an accident when a drug company markets a drug known to be addictive as nonaddictive. Overdose is not an accident when overdose reversal and addiction mitigation medication are made intentionally unavailable. Disease transmission is not an accident when clean equipment to avoid transmission is illegal. People who use drugs face impossibly dangerous conditions. These conditions are also not the fault of a person who uses…

Ref. 683E-Z

What’s at stake is whose story we hear when we talk about addiction and drug use—what drug companies, police, prosecutors, and government officials want us to hear or what people who use drugs are saying. One reason the modern opioid crisis was allowed to get so bad was that we all spent too long listening to these powerful people, and not people who use drugs, about what was going…

Ref. A26D-A

David Herzberg has the second-best job title in this book: historian of drugs. He is also a history professor at the University at Buffalo and the author of White Market Drugs: Big Pharma and the Hidden History of Addiction in America. He notes that one of the reasons that the modern opioid epidemic has gotten so bad—bad enough to decrease the overall U.S. life…

Ref. 6460-B

2021, the vast majority of accidental overdose deaths are still due to opioids. Between 1999 and 2020, well over 840,000 people died of an opioid overdose. As the death toll rose too precipitously to be ignored, the drug companies leaned into this idea: There are no accidental overdoses. There are only reckless criminal addicts. “Once it became clear that addiction was a problem, the drug companies’ first line of defense, and an incredibly successful one, was to say: Look, our products are good, the doctors are good, the patients are good, but there are these evil abusers,” Herzberg tells me. “They are becoming addicted and giving our drug a bad name. So, we should respond to this, not as if this is a crisis of accidents, we should respond to this as if it is a crisis of bad people.”

Ref. A04A-C

What distinguishes a good person from a bad person in this logic is legitimacy—a factor that stigmas around legality, race, and economic class can dictate. Let’s imagine a scenario in which prosecutors would not have charged Allen with Bullabough’s murder. It would require that she be Dr. Allen and that Bullabough buy his opioids at a pharmacy with a prescription and accidentally overdose that way. Looked at broadly, in taking the same actions, some people have accidents and some commit crimes—stigma is the decider. In this way, stigma is a weaponized version of the human error stance. It is the same old story of finding fault with human error, but the stigma of crime ratchets up the stakes. The victim is labeled not just an addict but a criminal addict.

Ref. 71CC-D

As one essay in the New York Times put it, “When you scratch the surface of someone who is addicted to painkillers, you usually find a seasoned drug abuser with a previous habit involving pills, alcohol, heroin or cocaine.” A doctor who worked for a think tank funded by Purdue wrote that piece, wherein she cited studies funded by Purdue, and quoted doctors who worked for Purdue. As long as the criminal addict message prevailed,…

Ref. EBBE-E

accidental addicts. “Some of the work that gets done when you call something an accident is about the innocence or guilt of the person who suffers the problem,” Herzberg points out. “We consider some people to be blame-free and others to be…

Ref. 9606-F

This is stigma exactly—the exact same human errors defined in different terms. One is criminal, one is accidental, and which is which is determined by other stigmatized characteristics, such as race and income, and one you might not have heard of: the doctor-visiting class. Using crime to stigmatize drug use is a tradition as old as the mass production of drugs, and that story gets interesting…

Ref. A818-G

But the response to the concurrent addiction crisis instead divided drug users into two classes of people. Some overdoses were accidents, some were crimes, and there was no overlap between the two—race, money,…

Ref. F8A8-H

“Long before smoking opium was illegal, you had oceans of ink spilled on talking about the ‘yellow peril’ and the threat to America from these Chinese men lured into opium slavery—all these prurient tales about them selling their bodies and souls to the yellow devils,” Herzberg explains. “You can look in the New York Times in the same day of the same year of 1874 and you’ll see this vicious takedown of the horrible people that are going to the opium den, then at the same time, these sensitive pieces about the tragic miseries of the unfortunate innocents who had become addicted when they just innocently trusted their doctor.”

Ref. 2B02-I

These laws followed the human error–dangerous conditions divide exactly—fixing dangerous conditions to prevent accidents in some cases and blaming human error, which disregarded and exacerbated the same conditions, to increase accidents in others.

Ref. BCD7-J

By these laws, a person who could visit a doctor could legally obtain a prescription for an opioid—say, morphine—and be legally addicted to morphine. The government would consider both their addiction and their overdose an accident. And both accidents would be less likely to occur, because their drug now had an honest label and a doctor’s observation—and we know that overdoses are more likely when drug potency is unknown and use is unmonitored. On the other side of the coin, a person who could not visit a doctor could be thrown in jail for possessing the same drug.

Ref. 1559-K

the government first criminalized drug use. Even as the drug war intensified through to the 1950s, doctors continued to prescribe morphine to people who had become addicted to it—because it was an “accidental addiction,” thus innocent and deserving treatment. It was not their fault, the story went, and they should continue to get morphine.

Ref. 9439-L

Take a person who is in treatment for opioid use but then relapses. Anticipated and internalized stigmas may drive that person to use drugs in riskier ways, such as not informing loved ones of the relapse, using drugs alone, and hiding their use. In this way, stigmas create holes in that person’s layers of safety. That person is at very high risk of dying of an overdose—they would not have naloxone, or anyone to get help, or anyone to revive them, or anyone who knows to check on them.

Ref. F2E7-M

But even if people are using together, the way stigma appears in laws and policies can increase the risk of accidental death. A person might not call 911 during someone else’s accidental overdose for fear the police will arrest them under drug-induced homicide laws. A law restricting drug paraphernalia such as syringes could make it illegal to use drugs without risking accidental disease transmission.

Ref. 9A8B-N

OxyContin, doctors require no special training or certification beyond their medical degree. To prescribe buprenorphine—an opioid substitute treatment that takes the edge off for a person addicted to opioids, allowing safe recovery in the privacy of their own home—doctors must fill out a pile of paperwork, get a special waiver from the Drug Enforcement Administration, and undergo an eight-hour training session. After all that, they’re only permitted to prescribe to a limited number of patients.

Ref. 2663-O

California pharmacies, fewer than one in four pharmacists were willing to distribute the drug without a prescription even though that is perfectly legal there. In a Texas study, 31 percent of pharmacies did not have naloxone in stock, and half the pharmacists refused to bill insurance for the medication. Three years after New York State made naloxone legal without a prescription, fewer than 38 percent of New York City pharmacists both stocked the drug and were willing to sell it. And, because these structural stigmas overlap with our own personal stigmas, it is essential to point out that well over half of Americans approve of these tight restrictions on access to a medication that can stop an overdose.

Ref. FB38-P

Government officials can also stigmatize drug use in a way that causes accidents, as in 2015, when the state of Indiana had an HIV outbreak. There is no reason that any place in the United States should be having an HIV outbreak in 2015—we know what causes HIV and how to prevent it. But in Indiana, it was illegal to have a syringe. Thus syringes were in short supply, so people who used drugs were reusing and sharing syringes, leading to accidental HIV transmission.

Ref. 7D90-Q

We also find stigmas baked into the federal budget, which devotes billions more to the enforcement and prosecution of drug laws than it does to research and to treatment of people suffering addictions. Researchers have found that these policy decisions increase the likelihood of accidental death—increased spending on drug law enforcement increases the number of people who die of an overdose.

Ref. 5DAA-R

The “yellow peril” that Herzberg explained is a prime example. That was a stigma about being an opioid user and a stigma about being a Chinese immigrant. The yellow peril wasn’t just people who used opioids but poor Asian immigrants who used opioids. Drug laws did not develop to address drugs alone; otherwise, they would have applied equally to everyone. Rather, the government made and enforced the laws with the goal of creating two classes of drug use.

Ref. 7E34-S

“Those socially marginalized people, those communities were already ones that authorities wanted to police, and were policing in a bunch of other ways already,” Herzberg says, “so a drug law just added a new tool, new money to hire cops, and new ways to control those neighborhoods and the people living in them.”

Ref. 226A-T

police, prosecutors, and judges doubled the prison population. Between 1954 and 1982, they doubled it again. Then, they picked up the pace. By 1992, they doubled the population again, and then again by 2008. The government holds more people in prison for drugs today than all people incarcerated for any crime in 1980. Judges and prosecutors increased the length of federal prison sentences for drug offenders by 36 percent between 1980 and 2011. Between 1988 and 2012, the average time that a person sentenced to federal prison for drug offenses spent incarcerated rose by 153 percent. And considerably more Black people than white people are imprisoned for drugs. “It is no accident that eras of drug and pharmaceutical reform have so often paralleled each other, nor is it an accident that they map so well onto civil rights history,” Herzberg points out.

Ref. A3E2-U

Americans use illegal drugs at about the same rate. Doctors are significantly more likely to test Black women and their infants for drugs during pregnancy or delivery than they are to test white women and infants. And the results of these tests prove that the choice to test is a racist stigma: Black women are not more likely to receive a positive drug test during labor; they are just more likely to be tested. Dr. Kimberly Sue sees this as drug-use stigma exacerbated by racial stigma.

Ref. 55CA-V

Researchers have found that some doctors and medical students believe racist mythology about Black patients to a significant degree—thinking Black people have thicker skin and less sensitive nerves than others, are less likely to follow doctors’ orders, and feel less pain. As a result, doctors and medical students disregard the pain of Black patients twice as often as they do that of patients of other races and are significantly less likely to treat it. Had white and Black prescription opioid rates been similar between 1999 and 2017, researchers estimate that more than 14,000 Black Americans would be dead today—killed by accidental opioid overdoses.

Ref. B2CE-W

It was not the pile of bodies that made the public see the opioid epidemic as an epidemic, but that suddenly the bodies were white.

Ref. ECD0-X

Today, that gap is shrinking. White opioid overdoses are leveling off, and Black overdoses are rising sharply. In 2019, accidental drug overdoses returned to their old pattern, with Black people dying at a rising rate, and at the same rate as white people, for the first time since 2002, because with OxyContin finally regulated, most accidental overdoses now come from illegal, not prescription, opioids. But stigma has not changed as an arbiter of accidents. Doctors prescribe buprenorphine, which lowers the risk and rate of accidental overdose by managing the urge for the drug, almost exclusively to white people. Researchers investigated some 13 million doctor visits between 2012 and 2015 and found a surge of physicians prescribing the medication to white people and no change in the number of prescriptions for Black people. Even while Black overdoses are rising faster than white ones, doctors are thirty-five times more likely to prescribe buprenorphine to white people.

Ref. EE59-Y

When we tell a “human error” story—when we blame the jaywalker, the accident-prone worker, the nut behind the wheel, or the criminal addict—we are being duped into distraction from the ways that we can prevent accidents. And this allows the same accidents to happen again.

Ref. 7672-Z

Finding fault in a person smells like justice and feels like a book being closed. It makes sense that we seek it. But failing to prevent the preventable results in a vast and deadly unfairness—and one outcome is the wildly unequal rate at which people are killed by accident because of racism.

Ref. 75D5-A

Picture a tunnel as seen from the outside. On one end is a person just before an accident. On the other end is the accident’s aftermath. From your perspective, everything about the accident is hidden inside the tunnel, except for the person it began with, their possible error, and the outcome. This is the common perspective from which an accident is interrogated, far away or high above, looking at the beginning and the end…

Ref. 1C4C-B

For one, we could assume that the bad outcome must have started with a bad action—if something went wrong, then someone must have done something wrong. If that’s our conclusion, then the way to fix the accident is to stand outside the tunnel, holding up a safety manual, such as the managers at the Georgia-Pacific paper plant did in chapter 1, imposing those…

Ref. 0BDD-C

Or we could detail all the ways a person could have avoided an accident, as though each detail were a fact—for instance, if we were to say that the cause of the accident that killed Irwin Ouser was that he could have played somewhere else and his parents could have been more responsible. We’d work backward through what happened to ask why a person zigged when they should have zagged. The issue with this…

Ref. 9691-D

“The problem about taking the position of retrospective outsider is that it does not allow you to explain anything,” Dekker wrote. “From that position, all you can do is judge people for not noticing what you find…

Ref. FE7C-E

Dekker encourages us instead to get in the tunnel and see an accident from the perspective of the person inside it. If we can see an accident unfolding from inside the tunnel, not from outside or in hindsight, we can more easily…

Ref. 90B7-F

Let’s be clear: to err is human. People make mistakes. But we have avoided focusing on that fact, because it doesn’t help prevent accidents and instead encourages the sort of sham conclusions that Dekker outlines with his tunnel metaphor. We have also avoided focusing on human errors because many accident experts believe that they don’t really exist—rather, every human action in a built environment is a product of that environment. Dangerous conditions cause mistakes, the thinking goes, so if…

Ref. 5FD0-G

People do not die by accident equally in America. Racism and stigma, for example, both excuse and cause accidents. For this reason, I would take Dekker’s idea a step further: We need to see accidents from the perspective of those involved, and we especially need to see accidents from the perspective of those harmed. By doing this, we can trace an accident beyond the built environment—the fragile nuclear reactor and the wrong color of the indicator light—and see the intangible systems that lead to accidents, too. Racism defines who feels at risk from…

Ref. 4F06-H

If, for example, we know that Black people are more likely to be struck by a car, we could think that Black people make more errors when walking—an assumption of human error based on racism—and issue them summonses for jaywalking. (As we will learn, this is what actually happens.) But the truth is that Black people are more likely to be struck by a car for a stack of reasons that intersect, including drivers who make split-second racist decisions and racist planning policies that leave streets more dangerous where Black…

Ref. 6A2B-I

White people are more likely to have their mistakes absolved as “accidents,” and Black people are more likely to be blamed, often criminally, for the same mistakes. Racism defines which outcome we allocate to which people. This cycle creates stigma and an illusion that racial differences are real. The sociologist-historian team Karen Fields and Barbara…

Ref. CEA9-J

Risk perception researchers have found similar results so often that they’ve branded this the “white male effect.” White men feel significantly more comfortable with risk—or are just acutely aware that their risk exposure is relatively minimal. They aren’t simply more willing to accept risk; they accurately perceive that they alone are at less risk. Slovic, writing in 1997, makes a similar point: “Perhaps White males see less risk in the world because they create, manage, control and benefit from many of the major technologies and activities. Perhaps women and non-White men see the world as more dangerous because in many ways they are more vulnerable, because they benefit less from many of the technologies and institutions, and because they have less power and control over what happens in their communities and their lives.”

Ref. 82C1-K

The lead researcher on the Sanford School of Medicine study hypothesized that whoever is diagnosing the cause of death in these infants might at once be less willing to call the death of a non-white child SIDS and less willing to diagnose asphyxia deaths in white babies. It appears that—because SIDS is the more innocuous diagnosis, free of human error—doctors are less likely to diagnose non-white babies with the syndrome. And because ASSB is the more stigmatized diagnosis, bordering on neglect, with definite implications of human error, doctors are less willing to give that explanation for white babies’ deaths.

Ref. A160-L

One reason for this disparity may be that the medical professionals marking these deaths are more willing to tell white parents that their child died of an unpreventable syndrome and more inclined to blame the parents of non-white babies for their children dying of you-should-have-known-better accidental strangulation. By this count, non-white babies appear to die more often by accident because medical professionals absolve white parents of the shame of a child’s accidental death.

Ref. 23C3-M

In Jacksonville, Florida, one of the top ten U.S. cities where pedestrians are most likely to be killed by drivers, police disproportionately issue jaywalking citations to Black people. Researchers at ProPublica found that in a five-year period ending in 2016, police issued 55 percent of the city’s jaywalking citations to Black people, who make up only 29 percent of the population. Black people were three times more likely to be cited for crossing the street than were white people. Similar studies have found evidence of racist enforcement of this violation in cities much safer for pedestrians, such as New York. In 2019, police issued almost 90 percent of jaywalking summonses to Black and Latino pedestrians, who make up about half of New Yorkers. In the first three months of 2020, New York City police issued 99 percent of the jaywalking summonses they wrote to Black and Latino people.

Ref. A7AD-N

Racism is prevalent, too, in sentencing for vehicular homicide. But where being Black means that you are more likely to have to pay a fine for jaywalking, if a driver kills you while you are crossing the street, being Black means that your killer pays less of a debt to society than if the driver had killed a white person. Researchers have found that vehicular homicide, essentially the unintentional but negligent killing of a person with a car, is a crime with a shorter sentence when the victim is Black.

Ref. F4C7-O

If a prison sentence measures the value of the victim’s life, Black life is worth less than white life. The authors of the study concluded that there was something more sinister to sentence length than a measure of justice—they called this “the taste for vengeance.”

Ref. 9A7F-P

Even the moment after stepping into the crosswalk but before being struck and killed in the crosswalk is determined by racism. Researchers at Portland State University have found that drivers yield the right of way significantly less often to Black pedestrians.

Ref. 2186-Q

Not seeing someone is a classic and false explanation for a car accident. That racism can be the actual explanation is reinforced—beyond these statistics—by changing the context in which a person is or is not “seen.” To a person in a car, a Black pedestrian disappears. To a person with a gun, a Black pedestrian is especially visible—so much so that they get shot more often, no matter the context.

Ref. B78A-R

This trigger-happiness is also present in real life. Black, Latino, and Indigenous people are more likely than white people to be killed by police. While these shootings may come with an excuse, some threat claimed by the officers, one study found that around 6 percent of police shootings are described as accidents by the officers involved. In these, too, the people shot were disproportionately Black. Consider Breonna Taylor in Kentucky in 2020; her killing was called a “miscalculation.” Or Daunte Wright, killed in Minnesota in 2021 by a police officer who said she accidentally pulled a gun instead of a Taser. Or Eurie Stamps, a sixty-eight-year-old grandfather, killed in Massachusetts in 2011, unarmed and lying on his stomach, by police officers who broke into his home with a no-knock warrant for someone else. Or Aiyana Mo’Nay Stanley-Jones, a seven-year-old child killed in Michigan in 2010 by police officers who had the wrong apartment number on their no-knock warrant. Or Iyanna Davis, who woke up to the sound of her door being smashed open in New York in 2010 and hid in a closet. A police officer with an assault rifle said he accidentally tripped before he shot her. He, too, had accidentally gone to the wrong apartment with his no-knock warrant. Or Alberta Spruill, a fifty-seven-year-old employee of the City of New York who died of cardiac arrest in 2003 after police officers threw a concussion grenade into her apartment while accidentally executing a no-knock warrant on the wrong address. Racism influences almost every way to die by accident in America, and has for a long, long time. Black people died by accident at a higher rate than white people every single year from the earliest counts of accidental death in America, as far back as 1900, through 2002, when prescription opioid drug overdoses shifted the dominant paradigm.

Ref. 5329-S

No one in America is more likely to die by accident than Indigenous people. This is true of every cause of unintentional injury tracked in the CDC’s Web-based Injury Statistics Query and Reporting System, except two: death by accidental exposure to smoke, fire, and flames, where Black people edge out Indigenous people by a small margin, and death by falling, a subcategory which is fatal mainly for older adults. To die by accidental fall, you need to live a long life, which is less likely if you’re Indigenous.

Ref. CD04-T

The way that racism produces unequal death is what epidemiologists call a social determinant of health. Determinants are tangible and intangible—social structures, built environments, and economic systems differentially distributed—and can essentially be understood as your personal stack of Swiss cheese, one that you carry with you into any risk. The holes in the Swiss cheese may line up for us all, based on how a medicine is packaged or a gun is designed, but they also may be more likely to line up precisely for you, based on who you are or how you are perceived.

Ref. 53BC-U

Racism stacks dangerous conditions against some people and drives holes in their layers of safety to such a degree that it would be useless to ask what any one person could have done differently to alter their fate. The inevitability of these accidents becomes even more apparent if we pay attention to which accidents are most starkly divided by race. We can see the racial divide in accidental death most prominently in the most common and preventable accidents, resulting when the safety of the built environment is full of holes.

Ref. 1422-V

Where a seat belt or a life vest is an obvious layer of safety that can make the difference between life and accidental death, these examples—a pool versus a lake, a house of bricks or a house of particle board—are less apparent. What we call accidents can be a matter of geography and resource allocation, and this, too, is racialized. To better understand how resources and locations affect accidental death and how racism decides who survives, let’s look at accidents on the job—where life and death are decided not just by which type of job you do but by who you are and where you do it.

Ref. CEA7-W

The layers of safety that protect us from an accident—and the holes in the Swiss cheese that allow accidents to slip through—can be far less concrete than a seat belt or a broken indicator light. The history of infrastructure financing, the design of the road and the design of the car, the race of the person driving, the race of the person walking, and the neighborhood where they meet—all of this affects whether or not an accident happens, who lives and who dies. When James Reason developed the Swiss cheese model, he believed that the holes were unintended lapses and failures, and that what led to what he called “a trajectory of accident opportunity” was a matter of random convergences. But accidents are a matter of being a certain person at a certain place at a certain time. Whiteness protects. And the one thing that can change the fate of the “accident-prone” is cash.

Ref. 2046-X

In 2012, researchers at Brown University analyzed fourteen years of accidental death in the United States—a massive study of 1.6 million victims. Both individual poverty and impoverished places tracked with the highest rates of accidental death every single year. Places with more poverty also had more people who died by accident. Worse, this gap between the accident-proneness of rich and poor places had widened over time.

Ref. A1EF-Y

These dangerous conditions—individual poverty overlapping with institutional poverty—exist outside cars, too. Accidental deaths by falling, drowning, and poisoning are all rising fastest in the poorest places. In counties where automakers have shut down domestic car factories, the number of people who accidentally overdose on opioids is 85 percent higher than the national average.

Ref. E5BA-Z

“It is the same way that a couple of decades ago people blamed AIDS on the actions of the AIDS patients—it just produces cognitive consistency. It makes the world cognitively, mentally stable,” Smith tells me. “Those with a very strong sense of belief in a ‘just world’ think the poor must have done something at some point that resulted in their poverty—that they do not work hard, that they are spendthrifts, that they are lazy, that they are promiscuous, that they abuse drugs—and in that way, they deserve the punishment of poverty.”

Ref. 5E9D-A

The poor are not bad, the rich are not good, and the victims of accidents are not accident-prone. Human error as a cause of accidents is a false construct. Blame, in accidents, reveals the psychology of the blamer and not much else.

Ref. D3D9-B

In 2017, then U.S. secretary of housing and urban development Ben Carson told a reporter that poor people just have a bad attitude: "Poverty, to a large extent, is also a state of mind. You take somebody who has the right mindset, you can take everything from them and put them on the street, and I guarantee in a little while they will be right back up there. And you take somebody with the wrong mindset, you can give them everything in the world, they will work their way right back down to the bottom."

Ref. E2B5-C

“Poverty is a state of mind” is Carson’s version of the just world fallacy, and because he was powerful, his belief could translate to more accidental death, especially for poor people. In 2020, Carson endorsed a federal budget that would cut, by $8.6 billion, his ability to do his job—build and maintain housing for people in poverty. It’s a fine decision if you believe poverty is merely a state of mind. But if you live in poverty, that missing $8.6 billion could result in a lot more holes in the Swiss cheese between you and an accident.

Ref. 47A3-D

As individuals, we blame to insulate ourselves from pain and to pretend that justice prevails.

Ref. 458C-E

It’s a belief in justice that leads us to defend and maintain the underlying structures of injustice and then blame those accidentally killed.

Ref. D49D-F

Blame is how we control the terror stirred up by the seeming randomness of accidental tragedy. There is nothing productive in this process. As the shame and vulnerability researcher Brené Brown describes it, “Blame is simply the discharging of pain and discomfort.” To prevent accidents, we need to sit with the discomfort.

Ref. B013-G

This is Smith’s area of expertise. He knows the psychological comfort that every prying person is searching for when they seek a place to lay blame for the tragedy. But still, his voice cracks when telling me this story.

Ref. ADD2-H

“Like everyone else, I have searched and searched and searched for my cognitively satisfying explanations for why the accident occurred,” he says. “These aren’t mean-spirited people trying to slap you when you are down. They are just trying to make sense of the horror of it. I do not know what happened in that particular horrific moment, and we never will, but people want to know, and they want to make sense of it.”

Ref. F2BF-I

Laying blame makes a terrifying, unfamiliar event less frightening and more familiar. Blame produces not just a sense of relief but also a sense of power. When we blame someone for an accident, we condense all the world’s complexity, all those layers of Swiss cheese, to one kernel of cause: a single villain.

Ref. 12C4-J

The ability to keep the uncontrollable world at bay is quite a power—which is one reason we blame victims most of all: because the dead cannot contest that power.

Ref. 9C9D-K

What happened after Allison Liao’s death is worth examining because it is egregious and there is proof. The collective behavior of bystanders, reporters, and police officers is shocking. A group of eyewitnesses and professional investigators created a completely false narrative—torturing a family already in mourning. One way we can understand their behavior is in how these various parties arrived at the scene.

Ref. A71D-L

Psychologists call this “defensive attribution”—when your blame for an accident is biased against people unlike yourself because you feel threatened by the pain and discomfort that accidents stir up. Rather than serving as an arbiter of justice, blame is just a mechanism to protect ourselves, revealing little other than what team a person is already on.

Ref. E8A6-M

He found that how people felt about what happened before an accident decided everything about how they laid blame after.

Ref. 1CA4-N

The result is a tribalism of blame. Subjects judged the young driver based on his motives, hewing to opinions they would have had about drugs or speeding long before hearing this story. In other versions of the experiment, Alicke and other researchers have found that such judgments can also spring from personal experience (whether or not we ourselves ever drove too fast or hid cocaine, for instance) and personal characteristics (such as reputation, attractiveness, race, class, or gender).

Ref. D6A7-O

“The process has a big corruption component built in,” Alicke tells me. “We’re not born to be Solomon; we’re born to make quick judgments about who’s going to help us and who’s going to harm us, and that can lead to some big mistakes.”

Ref. C768-P

Bias, fear, solidarity, and outcomes can all shape a blame judgment. In this way, how you evaluate an accident is dictated by where you ultimately want to locate blame. If two people have a fight and you prefer one of them to the other, that will define how you see the first insult, the first punch, and your assumptions about their intentions. You have a gut blame reaction first and work backward from that. But if someone wants to, they can ensure you have one gut reaction or the other.

Ref. 1CFD-Q

At the beginning of this book, a mob chased down a truck driver who accidentally killed a boy in the street. It is unlikely that the entire mob witnessed the accident, but the entire mob did agree on whom to blame. Amid the rage and fear that drives a mob, these people likely experienced what social psychologists call blame conformity, or blame contagion—when everyone’s feelings fall in line behind blame for a single villain.

Ref. E864-R