What are some reasons I might be wrong?

What would I do if it turned out I was wrong?

Most of us spend a lot of time visualising scenarios we’d like to happen, thinking about reasons the things we believe (or the things we want to believe) are likely to be true. We very rarely do the opposite: really thinking through worst-case scenarios, or actively looking for reasons our most deeply held beliefs are false. Why would we want to do this? We might find out something we don’t want to know. But this is exactly why we should do it.

In this article on Overcoming Bias, Nick Bostrom writes:

“Let’s say you have been promoting some view (on some complex or fraught topic – e.g. politics, religion; or any “cause” or “-ism”) for some time. When somebody criticizes this view, you spring to its defense. You find that you can easily refute most objections, and this increases your confidence. The view might originally have represented your best understanding of the topic. Subsequently you have gained more evidence, experience, and insight; yet the original view is never seriously reconsidered. You tell yourself that you remain objective and open-minded, but in fact your brain has stopped looking and listening for alternatives.”

Applying this idea to career choice: suppose you’ve always assumed you’ll be a doctor. You’ve long believed that this is the best way for you to use your skills to do good in the world, and have found you can provide convincing reasons why it’s a better career path in this respect than any other options you’ve come across. A couple of people have questioned certain aspects of your reasoning, but you’ve always managed to provide convincing responses, which increases your confidence. Yet you’ve never really seriously reconsidered the idea that medicine is the best route for you. You tell yourself that you remain objective and open-minded, but in fact your brain has stopped looking and listening for alternatives.

I think this is pretty common when thinking about careers. I’m pretty sure I’ve been guilty of it myself in the past (and may even still be now!). There are a number of cognitive biases at play here which we’ve talked about before: anchoring, confirmation bias, and the sunk cost fallacy. To quote from Nick’s article again (he puts it much more eloquently than I could!), here’s a debiasing technique you might try: writing a “hypothetical apostasy.”

“Imagine, if you will, that the world’s destruction is at stake and the only way to save it is for you to write a one-pager that convinces a jury that your old cherished view is mistaken or at least seriously incomplete. The more inadequate the jury thinks your old cherished view is, the greater the chances that the world is saved. The catch is that the jury consists of earlier stages of yourself (such as yourself as you were one year ago). Moreover, the jury believes that you have been bribed to write your apostasy; so any assurances of the form “trust me, I am older and know better” will be ineffective. Your only hope of saving the world is by writing an apostasy that will make the jury recognize how flawed/partial/shallow/juvenile/crude/irresponsible/incomplete and generally inadequate your old cherished view is.”

So imagine the world will be destroyed unless you can write a paper that convinces a jury, composed of earlier stages of yourself, not to become a doctor (or embark on whatever career path you’re inclined to prefer). This technique may help by forcing you to focus attention on reasons against your belief that you likely wouldn’t otherwise have considered, rather than simply finding ways to defend against them when other people throw them your way. There’s experimental evidence that, in general, the method of “considering the opposite” – looking for reasons your initial judgement might be wrong – is effective at reducing the effects of an impressively broad range of biases.

Looking for arguments against something you believe strongly or have long held might be difficult, even painful. So as well as asking “Why might I be wrong about this career?” another useful technique might be to start by asking “What would I do if I were wrong?” Visualising scenarios in which the things we believe turn out to be false might be uncomfortable, but it can be really useful to have a backup plan – a line of retreat. You might find out the escape route isn’t as bad as you thought and actually opens up new opportunities. Continuing in the vein of quoting those more eloquent than myself, Eliezer Yudkowsky explains:

“As Sun Tzu advises you to do with your enemies, you must do with yourself—leave yourself a line of retreat, so that you will have less trouble retreating. The prospect of losing your job, say, may seem a lot more scary when you can’t even bear to think about it, than after you have calculated exactly how long your savings will last, and checked the job market in your area, and otherwise planned out exactly what to do next. Only then will you be ready to fairly assess the probability of keeping your job in the planned layoffs next month. Be a true coward, and plan out your retreat in detail—visualize every step—preferably before you first come to the battlefield.
The hope is that it takes less courage to visualize an uncomfortable state of affairs as a thought experiment, than to consider how likely it is to be true. But then after you do the former, it becomes easier to do the latter.”

The thought of realising you’re wrong and having to do something about it may be scary. But isn’t the thought of never realising you’re wrong, and so not being able to do anything about it, even scarier?