“Going with your gut” – part 2

Career choice is complex. There are a lot of uncertainties. This means we can’t necessarily just trust our intuitions on what job to choose. What can we do instead?

One suggestion might be to use some kind of decision rule to help us. A large body of research suggests that rules can improve decision making in other areas, so this idea is worth paying attention to. However, it’s unclear how reliable these studies are, or how similar the cases in question are to career choice.

Before we can draw any strong conclusions about the use of rules in career choice, therefore, we need either to test this directly or to do a more detailed survey of the literature (or both).

Decision rules in interviews

Earlier I talked about the problems with interviews. They encourage decisions based on global impressions, which can be overly influenced by factors such as likeability and attractiveness. But we probably don’t want to stop interviewing people altogether. Interviews are an efficient and flexible way of assessing candidates. If used more effectively, they can be a valuable part of the recruitment process.

Research into different interviewing methods suggests that we can improve the accuracy of interviews. The biggest and most consistent gains come from the use of structured interviews.1,2 The main idea is to design questions before the interview to test for certain characteristics, develop a way of scoring the answers, and treat all candidates identically.
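
To make that concrete, here’s a minimal sketch of what such a rubric might look like written down as code. The questions, traits and example scores are all invented for illustration; they aren’t taken from the research:

```python
# A minimal sketch of a structured interview rubric.
# The questions, traits and example scores are invented for illustration.

QUESTIONS = [
    # (question asked of every candidate, trait it is designed to test)
    ("Describe a time you resolved a conflict in a team.", "teamwork"),
    ("Walk me through how you would debug a failing deployment.", "problem_solving"),
    ("Tell me about a project you drove from start to finish.", "initiative"),
]

def score_candidate(answer_scores):
    """answer_scores: dict mapping trait -> score on a fixed 1-5 scale,
    assigned against a written rubric agreed before any interviews."""
    # Every candidate is scored on the same traits, on the same scale,
    # so results are comparable and less swayed by first impressions.
    return sum(answer_scores[trait] for _, trait in QUESTIONS)

alice = {"teamwork": 4, "problem_solving": 5, "initiative": 3}
bob = {"teamwork": 5, "problem_solving": 2, "initiative": 4}
print(score_candidate(alice), score_candidate(bob))  # 12 11
```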

Why are structured interviews so much better than unstructured ones? The explanation seems to be that they direct focus onto relevant factors and away from irrelevant (or less important) ones. So decisions are less likely to be swayed by something like how immediately likeable a candidate is.

Should we use formulas to make our decisions for us?

The recent psychology literature has paid a fair amount of attention to the idea that decision rules can make better predictions than humans, and the success of structured interviews seems to fit this pattern. A number of studies across different areas have compared the predictions made by experts with those made by simple formulas: examples include predicting parole outcomes, the future grades of students, the longevity of cancer patients and the chances of success for a new business. In a meta-analysis of over 100 such studies, around half show the formulas making significantly better predictions, and the remainder (bar a very small handful) show a tie.3 This doesn’t mean experts are completely useless. There’s a common thread in all these cases: a large amount of uncertainty, and poor feedback on whether a decision was a good one. Studies show that when feedback is good, expert intuitions can sometimes be very accurate: I’ll be talking about this more in my next post.

When there are a lot of different factors involved and the situation is very uncertain, a simple formula wins out by focusing on the important factors, whilst people may try to “think outside the box” and get swayed by irrelevant considerations. Formal approaches also have the virtue of consistency: give them the same input and they’ll give you the same output, every time. This is something people are surprisingly bad at: experienced radiologists rating x-rays as “normal” or “abnormal” contradict themselves a shocking 20% of the time!4
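
To give a sense of just how simple these formulas typically are, here’s a toy example: a weighted sum of a few pre-chosen cues. The cues and weights are made up, but the structural point stands: the same input always produces the same output.

```python
# A toy prediction formula: a weighted sum of a few pre-chosen cues.
# The cues and weights are invented for illustration; in the literature,
# even crude weightings of valid cues often hold their own against experts.

WEIGHTS = {"past_performance": 0.5, "test_score": 0.3, "attendance": 0.2}

def predict(case):
    # Deterministic by construction: identical inputs give identical
    # outputs, every time, unlike human judges rating repeat cases.
    return sum(WEIGHTS[cue] * case[cue] for cue in WEIGHTS)

case = {"past_performance": 7.0, "test_score": 8.0, "attendance": 6.0}
print(predict(case))  # always 7.1 for this input
```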

These results have, unsurprisingly, been met with a fair amount of hostility. They seem to completely undermine the expertise of professionals all over the place, from medicine to courts of law. However, if these studies are to be trusted, simple formulas could be incredibly useful. A well-thought-out formula might save us a great deal of time and effort in making decisions. If it means improving the decisions of doctors, in some cases it might even save lives.

A formula for the perfect job?

Here’s a thought: if decision rules can improve the accuracy of interviews, of predicting parole outcomes, of patient diagnosis, might they help us with our career decisions?

Intuitively this seems plausible: choosing the best career has a number of things in common with choosing the best candidate for a job. There are a lot of different factors to consider. It’s hard to predict the outcome of your decision. We don’t tend to get good, quick feedback on whether our choice was a good one. The success of structured interviews suggests that we can improve decisions by explicitly thinking about what the most relevant factors are, having some way of scoring these factors, and building this into our decision process. The studies comparing human judgements with formulas also seem to confirm this.

But we have to be careful about what we conclude here just because it “seems plausible”. There are a few things we need to think about:

1) If formulas are so great, why don’t we use them more?

Why aren’t structured interviews common practice if they’re so much better at predicting job success? When scientific evidence goes against common sense, we need to ask whether we’re interpreting the evidence correctly, or otherwise explain how we could have got it so wrong.

Kahneman suggests that we’re biased against formulas because they seem forced and superficial.5 Human judgement, on the other hand, is seen as much more holistic and natural. Malcolm Gladwell puts it nicely:

“For most of us, hiring someone is essentially a romantic process, in which the job interview functions as a desexualized version of a date. We are looking for someone with whom we have a certain chemistry, even if the coupling that results ends in tears and the pursuer and the pursued turn out to have nothing in common. We want the unlimited promise of a love affair. The structured interview, by contrast, seems to offer only the dry logic and practicality of an arranged marriage.”6

Another point is that people just don’t like admitting they might be wrong. Suggesting to experts that their years of experience could be beaten by a simple calculation is unlikely to go down well, at least at first. However, Atul Gawande suggests in The Checklist Manifesto that mistakes in judgement tend to arise not from errors of ignorance but from errors of ineptitude: it’s not that clinicians lack relevant knowledge; they just don’t know the best ways to apply it. So we needn’t see formulas or checklists as threatening: they don’t show us what we don’t know, but rather help us make better use of what we do know.7

2) How reliable are these studies?

It’s not entirely clear how fair these comparisons between rules and experts are, and there’s been some scepticism about the reliability of the results: for some discussion, see the comments on this article.

Ashenfelter’s study, for example, is often quoted as showing that a simple algorithm can do a much better job than wine-tasting experts at predicting the future prices of Bordeaux wine. But if we look closer at the study itself, there doesn’t seem to be any direct comparison between the formula’s predictions and the experts’: the formula’s predictions are just compared with market prices. It’s also not clear that the outcomes of these predictions have been measured over a long enough period of time.

There are a number of questions we need to ask here. Do the rules and experts use the same information? Is it relatively easy to come up with a prediction rule like this, or do the studies just show that there exists a rule that beats experts after multiple attempts? How is the success of the different predictions being measured?
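
To make the last of those questions concrete, here’s a sketch of what a fair test would involve: the formula and the experts predict the same cases in advance, and both are scored against the same realised outcomes. All the numbers here are hypothetical:

```python
# Sketch of a fair formula-vs-expert comparison: both predict the same
# cases in advance, and both are scored against the same realised
# outcomes. All numbers are hypothetical.

def mean_squared_error(predictions, outcomes):
    return sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(outcomes)

outcomes = [3.1, 2.4, 4.0, 1.8, 2.9]  # realised values (e.g. log prices)
formula = [3.0, 2.6, 3.7, 2.0, 3.1]   # formula's advance predictions
experts = [3.5, 2.0, 4.4, 1.2, 3.6]   # experts' advance predictions

print(mean_squared_error(formula, outcomes))  # ~0.044
print(mean_squared_error(experts, outcomes))  # ~0.266

# The comparison fails if (as in the wine case) the experts' predictions
# were never collected, or if the formula was tuned on the very outcomes
# it is then scored against.
```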

3) How similar are these studies to career choice?

In many of the documented cases, there’s already a lot of detailed expert knowledge: medical diagnosis is one such example. So it might be that the rules do better by codifying this knowledge in a simpler way. Apgar’s test, for example, was a simple rule that dramatically improved the diagnosis of newborns and saved thousands of lives. However, Apgar needed a lot of experience to know which factors the test should score. I’m not sure that we have enough experience or knowledge when it comes to career choice to come up with a rule that could improve our decisions in the same way.
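
For reference, the test itself is about as simple as a decision rule gets: five observable signs, each rated 0 to 2 by the clinician and summed to a score out of 10. A sketch (interpretation thresholds vary between sources):

```python
# The Apgar score: five signs, each rated 0, 1 or 2 by the clinician,
# summed to a total out of 10. Interpretation cutoffs vary by source.

APGAR_SIGNS = ["heart_rate", "respiration", "reflex_response",
               "muscle_tone", "color"]

def apgar(ratings):
    """ratings: dict mapping each sign to 0, 1 or 2."""
    assert set(ratings) == set(APGAR_SIGNS)
    assert all(r in (0, 1, 2) for r in ratings.values())
    return sum(ratings.values())

newborn = {"heart_rate": 2, "respiration": 1, "reflex_response": 2,
           "muscle_tone": 1, "color": 2}
print(apgar(newborn))  # 8; totals of 7 or above are conventionally reassuring
```

The formula is trivial; what took expertise was knowing that these five signs, and not others, were the ones worth scoring.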

4) Do we know enough about the factors that influence our career decisions?

Not any old rule will do. A formula developed by an expert with a lot of experience is very different from one that you or I might just come up with ourselves. The formulas that make the best predictions use information we know is valid for the outcome we’re trying to predict. A formula designed to predict the chances of a marriage succeeding needs to measure things that are actually relevant to marriage success. Frequency of quarrels: yes; difference in height: probably not. This might sound obvious, but sometimes it’s just really hard to know what the relevant factors are.

We may not know enough about career choice yet: there’s no cut-and-dried account of what the most important things to think about are. It also depends on what exactly we’re trying to predict or measure. If we’re just looking for a job we’ll love, a rule scoring careers on these predictors of job satisfaction might be useful: there’s a fair amount of research suggesting these predictors are pretty good. But if we’re looking for a “perfect” job on multiple levels, one we’ll enjoy, be great at, and that really makes a difference, coming up with a formula to predict this might be a bit harder!
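
Just to show what such a rule might look like, and why it would be premature to trust one, here’s a deliberately naive career-scoring sketch. The factors and weights are illustrative assumptions, not a validated model, and picking them well is precisely the unsolved problem:

```python
# A deliberately naive career-scoring rule. The factors and weights are
# illustrative assumptions, not a validated model; choosing them well is
# exactly the hard part discussed above.

FACTORS = {                      # candidate predictors of job satisfaction
    "engaging_work": 3,          # variety, autonomy, clear feedback
    "skill_fit": 2,              # how well the work matches your strengths
    "supportive_colleagues": 2,
    "impact": 1,                 # hardest to score; arguably underweighted here
}

def score_career(ratings):
    """ratings: dict mapping each factor to a 1-5 rating."""
    return sum(weight * ratings[f] for f, weight in FACTORS.items())

research_role = {"engaging_work": 4, "skill_fit": 3,
                 "supportive_colleagues": 4, "impact": 5}
print(score_career(research_role))  # 31
```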

Conclusion

That decision rules could improve career decisions seems worth looking into further: it’s plausible, and it could be very useful if true. But before we can reach any conclusions, we need either a) reliable evidence that formulas can improve decisions in other comparable situations, or b) direct evidence that formulas actually improve career decisions.

It therefore seems like we need to do a more detailed survey of the literature on formal prediction procedures. This could also have important implications for a number of areas besides career choice.

Testing the use of decision rules in career choice directly also seems like a good idea, to get a better sense of how useful they are and what the important factors are. We’re planning to do this by putting together some simple prediction models and testing them in our advice sessions. So hopefully we’ll be able to offer some more practical advice soon. If you’d like to help us with this, and see if you can improve your career decisions at the same time, we’d love to hear from you!


You might also be interested in:

“Going with your gut” – part 1

Do you really know what job will make you happy?

Biases in career choice: Don’t be misled by the category “high impact career”

Don’t “do what you’re passionate about”


References and Notes



  1. The correlation between interview success and job performance ranges from .35 to .62 for structured interviews across studies, compared with .14 to .33 for unstructured interviews.
    See Cynthia Kay Stevens, “Structure Interviews to Recruit and Hire the Best People”, in Locke, E. (ed.) (2009), Handbook of Principles of Organizational Behavior (Wiley), and http://onlinelibrary.wiley.com/doi/10.1111/j.2044-8325.1988.tb00467.x/abstract 
  2. A practical guide for how to conduct structured interviews 
  3. William M. Grove and Paul E. Meehl (1996), “Comparative Efficiency of Informal (Subjective, Impressionistic) and Formal (Mechanical, Algorithmic) Prediction Procedures: The Clinical–Statistical Controversy”, Psychology, Public Policy, and Law 2: 293–323 
  4. Hoffman, P., Slovic, P., and Rorer, L. (1968), “An Analysis-of-Variance Model for the Assessment of Configural Cue Utilization in Clinical Judgment”, Psychological Bulletin 69: 338–349 
  5. See Daniel Kahneman, Thinking, Fast and Slow, and Paul Rozin (2005), “The Meaning of ‘Natural’: Process More Important Than Content”, Psychological Science 16: 652–658 
  6. Malcolm Gladwell, “The New-Boy Network: What Do Job Interviews Really Tell Us?”, The New Yorker (2000) 
  7. Atul Gawande (2009), The Checklist Manifesto: How to Get Things Right (New York: Metropolitan Books)