It was this Freakonomics podcast episode that persuaded me to read Allan McDonald’s book Truth, Lies and O-Rings, which I finished on my flight home from DC yesterday. I’m sure you can already guess what it’s about: the Space Shuttle Challenger disaster.
McDonald was in an unusual position and did a difficult thing. He knew more about the solid rocket booster’s O-rings than just about any other engineer on the planet. Because low temperatures degraded their ability to keep hot gas from escaping the booster’s joints, he did not approve of launching the Challenger in the extremely cold conditions on January 28, 1986. He was in the room when the decision was made to do so anyway, over his objection. And in the subsequent investigation into the cause of the explosion of the Challenger, he was one of the few who spoke the full truth without attempting to paper over the flawed decision-making process.
What gave NASA cover to launch—though in the full context of the situation it was inexcusable anyway—was a signed blessing of the O-rings’ adequacy from the solid rocket booster’s manufacturer, Morton Thiokol. Despite the low temperatures, McDonald’s superiors at Morton Thiokol decided the O-rings would be fine. This was what drew me to the story.
Why did Morton Thiokol executives make the go-for-launch call when their engineers and the data strongly argued the risk of failure was high?
I imagine it was a tough call. I sure hope it was! Try to put yourself in their position. On one side, you’ve got engineers telling you the O-rings can’t handle the conditions, and they have some data to support that position, though it’s not an airtight argument. There’s always room for some doubt, some probability things will be fine.
On the other side you have what? “System pressure,” as David Newman would call it, also known as “conflicts of interest.” The pressure to please the client, NASA, was high. NASA was, at that time, considering Morton Thiokol’s next contract. A lot depended on keeping the money flowing. Jobs were at stake. It’s no small thing to displease a client, lose a contract, and have to lay off hundreds of workers who are counting on you. NASA had its own form of system pressure, in wanting to maintain a tight schedule of launches to show Congress—which controls the purse strings after all—it could perform as promised.
System pressure should never have ratcheted up so high that it created strong incentives to launch on January 28, 1986. That was NASA’s fault. Overly politicized “oversight” by Congress may have played a role as well. (This is not unique to NASA and the Shuttle.)
But even in much less vital circumstances—ones even you and I face—there is some system pressure. We become invested in our positions, feeling our reputations ride on them. We have some responsibility to maintain our salaries and even grow them. Some of us are responsible for creating revenue that others and their families rely on. There are professional and cultural norms that we are loath to cross.
Sometimes, though by no means always, these forces push against doing the right thing. We are conflicted, at least somewhat. And sometimes they do so when there’s some ambiguity as to just what the right thing is. Here’s where it’s easy (or easier) to shade, to lean, to allow those system pressures to tip the scales so we can have it all. We find a way to justify doing the thing that doesn’t disrupt the status quo, even when without those system pressures we would not do that thing.
It’s very hard to be fully attuned to when this is happening. Most of the time it doesn’t matter much. Few decisions are anywhere near as important as whether or not to launch the Space Shuttle in temperatures below those at which its components have been tested. But sometimes a decision matters just enough that one is risking one’s integrity and credibility (if not worse) by succumbing to system pressure when it opposes what is empirically the (more) right call.
McDonald did a hard, brave thing by resisting system pressure. He paid a price for it, though his career seemed to have gone quite well anyway. It’s no small feat, what he did, and some of his colleagues couldn’t do it. Is it so clear you or I could in the same circumstance? In what ways do you or I allow system pressure to chip away at our credibility and integrity, if only imperceptibly? I find it disturbingly interesting to ponder these questions.
The book is long, both because it is so detailed and because it tells the history of the aftermath of the disaster linearly. It was investigated several ways: by the Presidential Commission, by congressional committees, and in various lawsuits. Each part of the story covers some of the same arguments and episodes. I found some parts of the book overly technical. But it’s easy enough to skim and skip. If you don’t read it, at least listen to the Freakonomics episode.