If you drop a frog in a pot of boiling water, it will, of course, frantically try to clamber out. But if you place it gently in a pot of tepid water and turn the heat on low, it will float there quite placidly. As the water gradually heats up, the frog will sink into a tranquil stupor, exactly like one of us in a hot bath, and before long, with a smile on its face, it will unresistingly allow itself to be boiled to death.

A version of the story told in Daniel Quinn’s 1996 novel The Story of B.

Coined by the sociologist Diane Vaughan, “normalization of deviance” describes a process in which deviations from accepted standards become the norm, accumulating over time and eventually leading to disaster. In other words, we define a behaviour that we consider correct, but for many reasons people start to cut corners and make a few compromises here and there. And since the “compromises” are minor, they get away with it 99% of the time; there are no immediate consequences. But soon others in the system notice this behaviour, and they too start to lower their standards. Why push yourself to a higher standard if others aren’t, and they’re doing fine? Over time, the lower standards become the norm across the whole system, and overall quality drops significantly. If nothing bad happens, the process continues until some disaster strikes and the system has to correct itself or die.

An infamous example of this is the Space Shuttle Challenger disaster of 1986. A failure of the primary and secondary O-ring seals in a joint of the right Solid Rocket Booster caused the shuttle to break apart soon after launch, killing all seven crew members on board. The joint design was not new: the seals had held on previous launches, despite known erosion, but on an unusually cold launch morning they failed. The disaster triggered numerous investigations into NASA’s engineering safety practices, communication, and whistleblower protection. The system was somewhat corrected.

Rationalization and Groupthink

How did this happen? It’s not that the engineers at NASA were idiots. Neither are the people at your company. People rarely come to work intending to do a poor job. But they still cut corners and let things slide, and, at least to themselves, they rationalise why. “We don’t have time for additional testing; we have a deadline to meet.” “Why bother implementing this security measure? I can save myself time, and the company money, by skipping it.” “Look at the other departments: they’re doing it, and the world didn’t end, so why should I?”

This becomes acceptable behaviour for the group. Of course, some people speak up, but if the new behaviour is common enough, they may feel pressured to stay quiet and hide their concerns. After all, nobody likes people who speak up against their colleagues or bosses. This creates a culture of “see no evil” and corporate omertà.

Self-Leadership and Communication

It takes integrity to speak up in such situations: to challenge your peers, your bosses, and the status quo in general. But that is what it takes to improve things. It’s the duty of any professional who values their craft to set high standards for their own work, to demand high standards from others, and to voice reasonable concerns about existing processes and practices. Practise self-leadership: don’t lower your standards, and don’t give in to groupthink that erodes them. Don’t be a passenger who lets others degrade the results of your work.

Work on being heard and understood. Technical professionals often raise concerns in language that management simply doesn’t understand: either too technical or not convincing enough to warrant attention. Management is in a rush, and its attention is limited, so a poorly communicated message can easily go unnoticed.

If you’re already in a position of leadership, it’s your responsibility to create a culture of open-mindedness, where anybody can speak up and share their thoughts. It doesn’t matter whether a person is new in their role or has 20 years of experience; anybody can make a meaningful contribution. Don’t penalise disagreement; create a culture of pragmatic conflict. Otherwise, people will at best conclude that speaking up is useless, and at worst be scared to do it.

Plan the Flight and Fly the Plan

“Plan the Flight and Fly the Plan” is an expression from aviation, where, as in space travel, mistakes are often written in blood. Most of ours are written only in wasted time and money, but they are painful nevertheless. Find the time and resources to plan your work, assess the risks, and work out a plan to execute safely and successfully. Then stick to the plan. Don’t let things slide just because, when you get to execution, you don’t feel like doing all the important things you wrote down. Any change to the plan should be a conscious decision, made with a clear understanding of its causes and consequences. Anything less is poor leadership and invites disaster.

For a more inspirational take on this topic, I suggest the presentation by astronaut Mike Mullane.