# The Normalization of Deviance: How Acceptable Risk Creeps Upward
2026-02-28
admin
On January 28, 1986, the Space Shuttle Challenger broke apart 73 seconds after launch, killing all seven crew members. The immediate cause was a failed O-ring seal in a solid rocket booster. But the deeper cause was far more troubling: NASA engineers had known about O-ring problems for years. They had seen evidence of erosion and blow-by on previous flights. Each time, when nothing catastrophic happened, the deviation from safety standards became a little more acceptable.

Sociologist Diane Vaughan studied the Challenger disaster and coined the term "normalization of deviance" to describe this phenomenon. It is one of the most important mental models for anyone who makes decisions in complex environments -- which is to say, all of us.

## What Is Normalization of Deviance?

Normalization of deviance occurs when people within an organization become so accustomed to a deviation from a standard that they no longer consider it deviant, even though it exceeds their own rules for acceptable risk. The process is gradual: each deviation that goes unpunished quietly becomes the new baseline, and further deviations then compound from there.

This pattern shows up everywhere -- not just in aerospace engineering but in business decisions, personal finance, health habits, and software development. Understanding it is essential for navigating real-world decision scenarios where risk accumulates invisibly.

## Examples Across Domains

### Software Development

A team has a rule: all code must have unit tests before merging. Under deadline pressure, one pull request goes through without tests. Nothing breaks. Then another.
Then it becomes normal to skip tests for "simple changes." Six months later, test coverage has dropped from 90% to 40%, and bugs are appearing in production that the missing tests would have caught.

### Personal Finance

You set a budget of $200 per month for dining out. One month you spend $250 -- no big deal, it was a birthday celebration. The next month, $280. Then $300 becomes normal. Within a year, you are spending $400 per month on dining out and wondering where your money goes.

### Healthcare

A hospital has a hand-washing protocol. A busy nurse skips it once and nothing happens. Then occasional skipping becomes common among the staff. Infection rates gradually climb, but each individual incident has many possible explanations, so the connection between the abandoned standard and the outcome stays invisible.

### Investment Decision-Making

An investor establishes strict criteria for buying stocks: strong balance sheet, consistent earnings growth, reasonable valuation. Over time, in a bull market, they start relaxing these criteria. "This company does not have consistent earnings, but the growth potential is enormous." Each relaxation seems justified in isolation. The portfolio gradually fills with speculative positions that violate the original investment principles that had served them well.

## Why We Normalize Deviance

Several psychological mechanisms drive this process:

Outcome Bias: We judge decisions by their outcomes rather than their quality. If a risky shortcut produces a good result, we conclude the shortcut was fine. But risk is probabilistic -- you can take a bad risk and get lucky many times before your luck runs out.

Confirmation Bias: Once we have accepted a deviation, we selectively notice evidence that supports the new standard and ignore evidence that suggests danger.

Social Proof: When everyone around you is deviating from the standard, it feels abnormal to insist on compliance. The person who raises concerns becomes the "problem."
Production Pressure: In most organizations there is constant pressure to do more, faster, cheaper. Standards and safety protocols are often the first things sacrificed, especially when the consequences of deviation are probabilistic and delayed.

These biases are well documented in the work of master investors and thinkers like Charlie Munger, who has long warned about the dangers of incremental rationalization.

## How to Resist Normalization of Deviance

### 1. Make Standards Explicit and Visible

Write down your standards and review them regularly. Whether they are coding standards, investment criteria, health habits, or safety protocols, having them in writing creates an anchor that resists drift.

### 2. Track Deviations Formally

Do not just note deviations informally -- record them. When you can see a pattern of accumulating deviations on paper, the drift becomes visible in a way it never is in memory.

### 3. Create "Circuit Breakers"

Establish hard limits that trigger automatic review. "If we skip tests on more than two PRs in a month, we stop and address the testing backlog." "If my dining budget exceeds $250, I review and recommit to the budget."

### 4. Conduct Pre-Mortems

Before starting a project or entering a new phase, ask: "If this fails spectacularly, what happened?" This exercise often reveals normalized deviations that people have stopped questioning. For more structured approaches, the KeepRule blog covers pre-mortem techniques and other anticipatory decision tools.

### 5. Welcome the Skeptic

Create psychological safety for people who point out deviations. The person who says "we are not following our own rules" is providing an invaluable service. Do not punish them for it.

### 6. Regular "Return to Baseline" Reviews

Schedule periodic reviews where you compare current practice against your original standards -- not to punish deviation, but to make conscious choices about which standards to keep, which to formally update, and which deviations to correct.

## The Deeper Lesson

The normalization of deviance teaches us something profound about human cognition: we are remarkably good at adapting to gradual change and remarkably bad at noticing it.
This is both a strength (we can habituate to difficult circumstances) and a dangerous weakness (we can drift into disaster without realizing it).

The antidote is not rigidity -- standards should evolve as circumstances change. The antidote is consciousness. Make your standards explicit, track deviations deliberately, and regularly ask: "Have we drifted from where we should be?" For a deeper exploration of how cognitive biases affect decision quality, check out the KeepRule FAQ, where common decision-making pitfalls are examined in practical detail.

## Conclusion

The Challenger did not explode because of a single bad decision. It exploded because hundreds of small normalizations accumulated over years until catastrophic failure became almost inevitable. The drift follows a predictable sequence:

- A rule or standard exists for good reason
- Someone deviates from the standard, and nothing bad happens
- The deviation is noted but rationalized ("it worked fine")
- The deviation becomes the new baseline
- Further deviations from the already-deviated standard occur
- Eventually, the gap between official standards and actual practice becomes enormous
- A failure occurs that was "unforeseeable" only because everyone had stopped seeing the drift

The same pattern threatens every individual and organization that does not actively guard against it. Stay conscious of your standards. Track your deviations. And remember: the most dangerous risks are the ones you have stopped noticing.
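As a practical coda, the "track deviations formally" and "circuit breaker" advice from earlier lends itself to a small sketch. Here is a minimal Python illustration -- the `DeviationLog` class and its two-per-month limit are hypothetical, invented for this example rather than taken from any real tool:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DeviationLog:
    """Formal record of deviations from a named standard, with a hard limit."""
    standard: str                      # e.g. "all PRs must include unit tests"
    limit_per_month: int               # circuit breaker: max tolerated deviations
    entries: list = field(default_factory=list)

    def record(self, day: date, reason: str) -> bool:
        """Log a deviation; return True if the circuit breaker has tripped."""
        self.entries.append((day, reason))
        this_month = [d for d, _ in self.entries
                      if (d.year, d.month) == (day.year, day.month)]
        return len(this_month) > self.limit_per_month

log = DeviationLog("all PRs must include unit tests", limit_per_month=2)
log.record(date(2026, 2, 3), "hotfix, deadline pressure")    # 1st: within limit
log.record(date(2026, 2, 10), "trivial one-line change")     # 2nd: within limit
tripped = log.record(date(2026, 2, 21), "demo prep")         # 3rd: breaker trips
# tripped is True: three deviations this month exceed the limit of 2,
# so the team stops and addresses the testing backlog.
```

The point is not the code itself but the discipline it encodes: each deviation is written down instead of rationalized away, and a hard numeric limit forces a conscious review before silent drift can become the new baseline.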