Uncertainty, on principle

Science has a curious relationship with confidence and certainty. What goes up may not always come down on the side of the sensible.

 
Illustration by Sarah Nagorcka

This is an editorial for Issue 21 of Lateral by editor-in-chief Jack Scanlan, who knows he doesn't know everything, but isn't a weird asshole about it like Socrates was.

Last Friday, the United States fired over 50 cruise missiles at an air base in central Syria, its first military action against the Syrian government. Russia, an ally of the Syrian regime, wasn’t happy, and with an unknown quantity in Donald Trump at the head of the US war machine, there’s a lot of uncertainty over what will happen next; international politics appears to be on a knife-edge. It’s not fun.

Uncertainty comes in multiple flavours. You might not know where to eat for lunch, or how your footy team will go in Saturday’s game, or what your partner has gotten you for Christmas — that’s regular, everyday uncertainty. Then there are larger existential questions that are shrouded in doubt — how will our warming planet look in 50 years? Will the internet birth a super-intelligent mind that will enslave humanity? Is Donald Trump about to start World War III? Not knowing the answers to these can feel excruciating.

And the more uncertain we are about these serious issues, the scarier they can seem, all the possible outcomes opening up like a multi-jawed bear trap — and we don’t know where we’re stepping. So it’s no surprise that many people like to lock down as much certainty in their lives as they can: we want stable jobs, reliable public transport and predictable weather. With the world potentially unravelling around us, maintaining a bubble of certainty is an understandable desire.

Unfortunately, it can also make a lot of science almost philosophically unpalatable.

This might seem strange. Ask people on the street about science and you’ll probably hear that it’s in the truth game, peeling back the layers of the universe to see what’s really going on. Thanks to science, we know real facts about the world, unshakeable truths. Scientists are some of the most certain people around, because they have the tools they need to come to firm conclusions! The rest of us are merely guessing.

But as nice as these sentiments are, they're also mostly lies. Nothing is truly certain in science, and scientists constantly battle with confidence. (Empirical and psychological — am I right, fellow PhD students?) To borrow a phrase from science journalist Christie Aschwanden, science is a process of uncertainty reduction, not uncertainty elimination. A good scientist is always second-guessing their work, at least at some level. Models, even useful ones, are made to be broken, and every idea can be questioned if the winds of data change. Assuming we’ve landed on absolute truth means we’ll never be able to move on when a better idea comes along.

Of course, acknowledging uncertainty isn’t the same as undermining science’s conclusions. Everyone has internal, personal thresholds at which ideas flip from “probably true” to “unassailably true for the sake of convenience.” We don’t walk around all tense, preparing for when it becomes clear that we were gravely misinformed about gravity. Likewise, biologists don’t seriously question the existence of DNA or cells — the likelihood of such things being fictions is so low as to be not worth considering.

But these are established ideas. What about when science explores the cutting edge of the knowable? Here uncertainty is crucial — and sometimes confusing. Predictive models of the future, like the ones created by climate scientists, always produce a range of outcomes, some more and some less plausible — knowing how small that range is allows us to gauge how confident we should be in the results. Medical studies often compare experimental results against what would be expected by chance alone, and the greater the difference, the less likely it is that the observed effect is a fluke. The uncertainty never goes away, it just gets overshadowed.

This is where uncertainty about uncertainty can trip up how science is perceived by the media and the public. Scientists showing even a sliver of doubt can signal to some that they shouldn’t trust the reported conclusion. If 97% of climate scientists favour the idea that humans are causing rising temperatures, doesn’t that mean there’s a 3% chance they’re wrong? (Not by a long shot.) On the other end of things, however, ignoring known uncertainties leads to overblown claims that certain foods cure or cause cancer, or that new theories in physics have overturned everything we thought we knew about the universe. And depending on your preexisting beliefs, uncertainty can be invoked or ignored to push whatever point of view you’d like.

This is all to say that we need a healthy understanding of scientific uncertainty; science writers must explain how confident we should be about new discoveries, and scientists shouldn’t let small margins of error outweigh important information they have for the rest of society. We must be confident about our unconfidence. After all, nothing is truly certain but death, taxes and uncertainty.

Of course, "uncertainty" is only one way to parse this month's theme, up in the air. Read the rest of this issue to find out how others interpreted it.

By Jack Scanlan

Jack is the Editor-in-Chief of Lateral.