We are biased towards prevention...
…and this bias moves us towards “if something bad happens,” and away from “when something bad happens.”
We couple this bias towards prevention with an extreme focus on prediction – seeking out every minute shred of potential “safety data” to aid in these safety fortunetelling efforts.
So often we seem to place greater importance on safety data than safety reality – we lose touch with the reality of our work worlds.
We believe that – with enough focus on prevention – we eliminate the potential for catastrophic events in our workplaces. And that, if we can just get enough safety data flowing in, we can predict and prevent the next fatality or catastrophe.
Humans are notoriously bad at predicting the future – we are just flat-out terrible at it.
As people, we are wired to skew towards what we would like to see happen – the more desirable a future event is, the more likely we think it is to occur. The more we dread or fear a potential outcome, the less likely we think it is to happen.
As an example, in November of 2007, economists in the Philadelphia Federal Reserve’s Survey of Professional Forecasters predicted just a 20 percent chance of “negative growth” in the U.S. economy any time in 2008, despite visible signals of an impending recession (Beaton, 2017). What followed was the most severe economic recession in the United States since the Great Depression of the 1930s.
Relating this to our “safety predictions,” we often assume that our preventative efforts are robust, that our processes are good, that our organizations are inherently “safe,” and ultimately, that they cannot fail – we slant towards optimism and overconfidence while avoiding the fact that all of these preventative efforts will – at some point – degrade, break down, and fail.
Rather than wasting our time on safety fortunetelling – seeking to predict and prevent everything – our time is far better invested in designing systems that do not produce catastrophic outcomes when they fail.