If you’re already familiar with the ideas in our career guide, this series aims to deepen your understanding of how to increase the impact of your career.
Why some career paths likely have 10 or 100 or even 1,000 times more impact than others, making your career your biggest opportunity to make a difference.
We think ‘making a difference’ is best understood as being about the number of lives you improve and how much you improve them by — regardless of who they are or when they’re living.
All careers involve some degree of negative impact. That said, in general, we recommend against taking a role that does substantial harm, even if the overall benefits of the work seem greater than the harms.
There are huge differences in the importance and neglectedness of different issues, which don’t seem fully offset by differences in tractability. This means that by choosing a different issue, you might be able to increase how much impact you have by over 100 times.
The chance of a catastrophe from nuclear war, runaway climate change, or emerging technology is small each year, but all that adds up over lifetimes. Read about why we think reducing these risks should probably be our biggest priority.
See which issues we think are most important and tractable while still being relatively neglected — meaning they offer the best opportunities to make a big difference.
You can make a bigger contribution to solving a problem by either pursuing more effective solutions or seeking greater ‘leverage’ — essentially, moving more resources towards your preferred solutions — or both. This article explores how.
Many solutions to global problems don’t have much impact, but the best are enormously effective. How taking a ‘hits-based’ approach to finding the best solutions can enable you to make a far bigger contribution.
Early in your career, we suggest focusing on building useful skills. We use the concept of ‘leverage’ to identify the most useful skills to build and explain how to test your fit and get started.
A list of ideas for high-leverage paths in which you can mobilise a lot of resources toward the best solutions to some of the world’s most pressing problems.
If you’re considering working in paths with heavy-tailed, semi-predictable performance — like scientific research — then it could be worth switching paths for even a small increase in your relative fit. Here’s why.
Comparative advantage — which is related to, but different from, personal fit — matters when you’re closely coordinating with a community to fill a limited number of positions.
If you want to do good, there are stronger reasons than usual to be ambitious and take risks. We cover four arguments for setting up your life so you can afford to fail, and then aiming as high as possible.
Through trade, coordination, and economies of scale, individuals can achieve greater impact by working together — but to take full advantage of this, you need to change how you approach your career.
It’s easy to miss a great option by narrowing down too early. And if careers differ so much in impact, it’s probably even more important to explore than people usually think.
If you’re going to try to have an impact, and especially if you’re going to be ambitious about it, it’s very important to carefully consider how you might accidentally make things worse.
What’s next
Speak to our team one-on-one to make your new career plan
If you’ve read our advanced series, our 1-1 team might be keen to talk to you. They can help you check your plan, reflect on your values, and potentially connect you with mentors, jobs, and funding opportunities. (It’s free.)
We have hundreds more articles on the site. You can filter them by cause, career path, and other topics to find those that are most helpful to your situation.
“Find work you’re good at” is a truism, but we think many people still don’t take it seriously enough.
Finding the option where you have the best chance of excelling over the course of your career — where you have your greatest ‘personal fit’ — is one of the key determinants of your career’s impact. In fact, after initially identifying some promising paths, we think it’s often the most important factor.
The first reason is that in many fields, data suggests that success is distributed unevenly.
This is most pronounced in complex jobs like research or entrepreneurship. A key study of ‘expert performance’ concluded:
A small percentage of the workers in any given domain is responsible for the bulk of the work. Generally, the top 10% of the most prolific elite can be credited with around 50% of all contributions, whereas the bottom 50% of the least productive workers can claim only 15% of the total work, and the most productive contributor is usually about 100 times more prolific than the least.
In the most skewed fields like these, your expected impact is roughly just the value of outsized success multiplied by its probability — from an impact point of view, you can roughly ignore the middling scenarios.
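As a toy illustration with made-up numbers (the 5% and 60% chances and the impact values below are purely illustrative, not estimates of any real field):

```python
# Purely illustrative numbers: the probabilities and impact values are made up.
p_hit, value_hit = 0.05, 1000      # small chance of an outsized success
p_mid, value_mid = 0.60, 10        # likely but middling outcome

expected_impact = p_hit * value_hit + p_mid * value_mid
print(expected_impact)             # 56.0
print(p_hit * value_hit)           # 50.0, the 'hit' term alone is nearly the whole total
```

The middling term barely moves the total, which is why, in heavily skewed fields, expected impact is roughly the value of outsized success multiplied by its probability.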
But in most jobs there are still sizable differences in output between, say, the top 20% of performers and the average performers.
It’s unclear how predictable these differences are ahead of time, and people often overstate them. But even if they’re only a little bit predictable, it could matter a great deal — having slightly higher chances of success could result in large increases in impact.
For instance, suppose in option A you expect to be average, and in option B you expect to be in the top 30%. If the top 30% produce two times as much as average, then it could be better to take option B, even if you think option A is up to two times higher-impact on average.
If we also consider you’ll be less replaceable if you’re in the top 30%, the difference in counterfactual impact could be even larger.
The second reason why personal fit is so important is that being successful in almost any field gives you more connections, credibility, and money to direct towards pressing problems — increasing your career capital and leverage.
If you succeed at something, that gives you a reputation and credentials you can use to find future opportunities. You’ll also tend to meet other successful people, improving your connections. And you might gain a platform or money you can use to promote neglected issues. This idea is discussed more in our podcast with Holden Karnofsky.
Being good at your job is also one of the main ingredients of a satisfying job, which helps you stay motivated in addition to being important in itself. It could easily be more satisfying to be in the top 20% of a profession, even if it’s perhaps lower paid or less glamorous than an alternative where you’d be average.
How important is personal fit compared to other factors?
You can think of your degree of personal fit with a career option as a multiplier on how promising that option is in general, such that:1
total impact = (average impact of option) x (personal fit)
and
total career capital = (average career capital) x (personal fit)
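To make the multiplier concrete, here’s a minimal sketch in Python using the illustrative numbers from the earlier example (these aren’t estimates of any real path):

```python
def total_impact(average_impact_of_option, personal_fit):
    # total impact = (average impact of option) x (personal fit)
    return average_impact_of_option * personal_fit

# Option A: a path that's 2x higher-impact on average, where you'd be an average performer.
# Option B: the lower-impact path, where being in the top 30% means producing ~2x the average output.
print(total_impact(average_impact_of_option=2.0, personal_fit=1.0))  # 2.0
print(total_impact(average_impact_of_option=1.0, personal_fit=2.0))  # 2.0
# Already a tie, and option B pulls ahead once you account for being less replaceable where you excel.
```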
This means that we often advise people to first identify some high-impact paths, and then choose between them based on their degree of fit with them — especially focusing on those where they might excel.
If you have very low personal fit for a job, it doesn’t matter how impactful the option might be in general — your total impact could end up very low. So it can be worth taking a job that you think is, say, in your second tier for impact, but is a better fit for you.
Because personal fit is so important, we would almost never encourage you to pursue a career you dislike. Succeeding in almost any career takes many years and sometimes decades of work. If you don’t like your job, you’re unlikely to stick with it that long, and so you’ll forgo a lot of your impact. (And there are other reasons we wouldn’t encourage you to pursue a career you dislike.)
Although it’s not what we most commonly recommend, it can sometimes even be worth taking jobs that don’t have any direct connection to a particularly impactful path in the short term because of the career capital you might get from excelling in them.
Isabelle Boemeke started out as a fashion model. But after speaking to experts who said nuclear energy was needed to tackle climate change (yet were afraid to promote it due to its unpopularity), she pivoted to using her social media following to advocate for it. Becoming a fashion model isn’t normally one of our recommendations, but it could still be the right choice if your fit is high enough.
More generally, since you can have a significant impact in any job by donating, engaging in political advocacy, or being a multiplier on others, simply working hard and being more successful in any path can let you have more impact.
What am I good at?
Academic studies and common sense both suggest that while it’s possible to predict people’s performance in a path to some degree, it’s a difficult endeavour.2 What’s more, there’s not much reason to trust intuitive assessments, or career tests either.3
So what does work?
Making predictions
Here are some questions you can use to make some initial assessments of your fit from several different angles:
What do you think are your chances of success?4 To answer this, look at your track record in similar work and try to project it forward. For instance, if you were among the top 25% of your class in graduate school, and roughly the top half of the class continues into academia, you could roughly forecast being in the top 50% of academics.5 To get a better sense of your long-term potential, look at your rate of improvement rather than your most recent performance. (More technically, you can try to make a base rate forecast — see the short sketch after this list.)
What drives performance in the field, and how do you stack up? The first step gives you a starting point, but you can try to improve your estimates by asking yourself what most drives success in the field, and whether you have those traits, as well as looking for other predictors of performance.
What do experts say? If you can, ask people experienced in the field for their assessment of your prospects. Just be careful not to put too much weight on any single person’s view, and aim to ask people who have experience selecting people for the job in question and who are likely to be honest with you.
Does it match your strengths? One way to gauge this is to look for activities that don’t feel like work to you, but do for most people. We have an article about how to assess your strengths.
Do you feel excited to pursue it? Gut-level motivation isn’t a reliable predictor of success, but if you don’t feel motivated, it’ll be challenging to exert yourself at the level required for high performance in most jobs. So a lack of excitement should give you pause.
Will you enjoy it? To stick with it for the long term, the path would ideally be reasonably enjoyable and fit with the rest of your life (e.g. if you want a family, you may want a job without extreme working hours).
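Here’s the kind of base rate arithmetic the track record question points at, as a minimal sketch (it assumes your relative position carries over exactly, which, as footnote 5 notes, it won’t, because of luck):

```python
def percentile_within_selected_group(overall_percentile, cutoff_percentile):
    """Your rough percentile within the group that clears a cutoff,
    assuming your relative position carries over exactly."""
    assert overall_percentile >= cutoff_percentile
    return 100 * (overall_percentile - cutoff_percentile) / (100 - cutoff_percentile)

# Top 25% of your graduate class (75th percentile), and roughly the top half continue into academia:
print(percentile_within_selected_group(75, 50))  # 50.0, i.e. roughly the top 50% of academics
```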
Learning to make good predictions is an art, and one that’s very useful if your aim is to do good, so we have an article about how to get better at it.
Investigating your options
Many people try to figure out their career from the armchair, but it’s often more useful to go and test things in the real world.
If you have time, the next stage is to identify key uncertainties about your fit, and then investigate those uncertainties.
It’s often possible to find low-cost ways to test out different paths. Start with the lowest-cost ways to gain information first, creating a ‘ladder’ of tests. For example, one such ladder might look like this:
First read our relevant career reviews and do some Google searches to learn the basics (1–2h).
Then speak to someone in the area (2h).
Then speak to three more people who work in the area and read one or two books (20h). You could also consider speaking to a career advisor who specialises in this area.
Then look for a project that might take 1–4 weeks of work, like applying to jobs, volunteering in a related role, or starting a blog on the policy area you want to focus on. If you’ve done the previous step, you’ll know what’s best.
Only then consider taking on a 2–24 month commitment, like a work placement, internship, or graduate study. At this point, being offered a trial position with an organisation for a couple of months can also be an advantage, because it means both parties will make an effort to quickly assess your fit.
In our planning process, we lead you through identifying key uncertainties for each stage of your career, and then making a plan to investigate them.
If at any point you learn that a path is definitely not for you, then you can end the investigation.
Otherwise, when your best guess about which path is best stops changing, then it’s time to stop doing tests and take a job for a few years. But that is also an experiment, just on a longer time scale — as we discuss in our article on exploration.
Peak: Secrets from the New Science of Expertise, by Anders Ericsson, argues that success is mainly driven by years of focused practice. We think his conclusions are too extreme, but it’s a thought-provoking book, and the central idea — that attaining high levels of performance requires a lot of practice, and that it’s possible to improve most of our skills — seems correct. Also see this nice summary of Ericsson’s career by Cal Newport.
Notes and references
More specifically, we define a person’s ‘personal fit’ for a job as the ratio between: 1) the productivity that person would have in the job in the long term, and 2) the average productivity of other people who are likely to take the job.↩
The best study we’ve found showed that even the best predictors correlate only about 0.5–0.65 with job performance. This means that much of the variance is unexplained, so even a selection process using the best available predictors will regularly appear to make mistakes.
Schmidt, Frank L., et al. “The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 100 years…” Fox School of Business Research Paper, 2016, 1-74. PDF
This matches personal experience; it’s pretty common for hiring processes to make the wrong call and for new hires to not work out.↩
Most career tests are based on ‘interest-matching,’ often using a system similar to Holland types. However, meta-analyses have found that these methods don’t correlate, or correlate only very weakly, with job performance. We cover some studies about this here.↩
If the outcome of a choice of career path is dominated by ‘tail’ scenarios (unusually good or bad outcomes), which we think it often is, then you can approximate the expected impact of a path by looking at the probability of the tail scenarios happening and how good/bad they are.↩
If we suppose that the 50% with the best fit continue to academia, then you’d be in the top half. In reality, your prospects would be a little worse than this, since some of your past performance might be due to luck or other factors that don’t project forward. Likewise, past failures might also have been due to luck or other factors that don’t project forward, so your prospects are a bit better than they’d naively suggest. In other words, past performance doesn’t perfectly predict future performance.↩
How to investigate your career uncertainties and make a judgement call
The goal of this part of the career planning series is to help you make an overall judgement call about your plan after taking stock of, prioritising, and (perhaps) investigating your key uncertainties, so that you are ready to put it into action.
You will also set points in the future at which you’ll review and update your plan.
★Step back, reflect, and make your final list of key uncertainties
Career planning involves so much uncertainty that it’s easy to feel paralysed. To tackle your uncertainty, approach your investigation like a scientist. Make hypotheses, investigate them empirically, and update.
You’ve listed key uncertainties throughout the articles in this series. Now we’re going to gather them all up, check whether you’ve missed any, prioritise which to investigate, and work out what investigation to do.
Gather up uncertainties
You have accumulated key uncertainties through this process, including about:
Which global problems are most pressing (and your values and worldview)
Your best potential longer-term paths
What your best next career steps should be
What your backup options should be
Copy and paste them into Section 7.1.1 of your template.
Now here are some ways to reflect on your whole plan, check you haven’t missed any key uncertainties, and start resolving some of them.
Get some overall feedback
Often, what the people we advise find most helpful at this point is showing their plan to others — other people can help spot assumptions you’re making that may seem obvious to you but really aren’t.
One exercise is to make a copy of your template and send it to a couple of friends or advisors for comments. If it’s long and messy with notes, you can just send your plan A, next step ideas, and uncertainties.
This not only helps to identify further key uncertainties, but can also help you start resolving some of them.
Try to get feedback from people who understand your aim to have an impact and who can be supportive while they challenge your thinking. A ‘career-planning partner’ can be great if you can find someone to trade career plans with — you can critique each other’s plans and help each other generate more options, plus provide moral support. Our advisors may be able to help too.
How do you reach out to people? It depends on your relationship — but if the person is someone you don’t know as well or with whom you have a more formal relationship, these tips may be helpful. In these cases, it’s better to send one or two specific questions rather than your whole plan.
It can often be easier to reach out to people if you’re both part of a community focused on making a positive difference, because then they know helping you will help them further your shared goals.
If you get some negative feedback, don’t respond hastily. If your plan is unconventional — which is likely, if you’re targeting something neglected — probably not everyone is going to agree with it. Try to understand the reasons behind their negative reaction, and decide whether to adjust. If the reasons are unclear, perhaps wait to see if others have a similar reaction or if it’s an isolated example.
You can adjust your list of uncertainties in Section 7.1.1 of your template in light of what you learn.
Optional: Change frame
Perhaps the biggest challenge in decision making is that we tend to think too narrowly. For that reason, much decision-making advice is essentially about how to view your decision from a different perspective.
Here are a couple of perspectives you can take about your plan to help you see it in a new way.
Why are you most likely to be wrong about your plan? To make this more vivid, imagine that your plan has failed — what went wrong? (Do a ‘premortem’ on it.) These ‘negative’ frames are some of the most useful ways to reduce any overconfidence bias and spot problems with your plans.
What would a kind, wise friend advise you to do? It’s often easier to see the mistakes that other people are making because you have more distance from the situation. This prompt tries to help you gain this distance about yourself.
Change the time frame with 10/10/10. Imagine you’ve already committed to your plan A. How do you feel about it 10 minutes later? How do you feel about it in 10 months’ time? Imagine it’s 10 years in the future — how do you feel about it looking back?
Do any of these prompts help you to resolve a key uncertainty or spot a new one? If so, again adjust the list you made in Section 7.1.1 of the template accordingly.
Ask yourself how you feel about your plan
Now that you’ve done a lot of explicit thinking, it’s a good time to listen to your gut.
Bring the different aspects of your plan to mind. Is there something you feel uneasy about, or that feels aversive? Does something feel ‘off’? It’s okay if this is vague.
How do you feel about your plan in general? Anxious / excited / sad / frustrated / calm / guilty etc.?
Your intuition is good at evaluating things like what you’re excited by and which people are good to work with, which are important inputs into the decision.
An uneasy gut reaction may also be a sign you’ve uncovered a problem with the plan that hasn’t yet made it to your conscious mind. Emotions contain signals about what to pay attention to, although they don’t always accurately represent reality.
If you have a negative emotion or gut reaction, try to understand what it’s about.
Then, consider whether you should change your plan or investigate further. You might find your intuition has picked up a mistake that you can fix. The ideal is for your analysis, emotions, and intuitions to all line up.
Alternatively, you might realise it’s best to push ahead anyway — it’s normal to be, say, worried about the future, even if you’ve done all the steps worth taking to mitigate the risks. And sometimes your gut can get stuck on something that it shouldn’t — as we know from the study of neuroses and biases.
If you’ve investigated what your gut feeling is about and you don’t endorse the concern, acknowledge the feeling and try to let it go.
If you think the concern is a good one, see if you can mitigate it by adjusting your plan.
If you can’t identify what’s behind the feeling, it might be best to push ahead in the meantime and keep checking in on it until it becomes clear what it’s about.
Again, adjust your list of uncertainties in Section 7.1.1 of your template in light of what you learned.
★Prioritise your key uncertainties
Now you can make an overall ranking of your key uncertainties, including both those you identified throughout this process and any you just uncovered. Base your ranking on:
How easy the uncertainties seem to be to resolve
How much difference resolving them would make to your career plan
Write them down in rank order in Section 7.2 of your template. Feel free to leave off any that seem like they definitely won’t be worth investigating.
★Make a plan to investigate
Now that you know what your top uncertainties are, you have a choice:
Investigate to reduce some of your top uncertainties, and then revise your plan in light of what you learn.
Attempt to put your plan into action now (bearing in mind you can update it later).
In reality, these two options overlap. Often one of the most useful things you can do is to just apply to lots of jobs and talk to lots of people, which both helps to put your plan into action and gives you valuable information about opportunities and your fit for them. We often see people agonise over a choice between different paths, when if they’d made lots of applications, the best path would have become obvious.
However, it’s still useful to roughly divide into ‘investigation mode’ and ‘action mode’. Focus more on action if your key uncertainties are relatively minor or will be hard to make progress on. Likewise, sometimes your key uncertainties are best resolved just by trying to put your plan into action and seeing how it turns out — you can always revise your plan in 6–24 months depending on what happens.
Otherwise, since your career involves so much time, it’s most likely worth some further investigation.
How to investigate your key uncertainties
The most useful step is often to talk to people. The right person can give you more up-to-date and personalised information than what you’ll be able to find in a book or online. People we advise are often surprised at how willing people are to help. See some email scripts for informational interviews and asking for advice. (Bear in mind that when you’re talking to these people, they are probably also informally interviewing you — see our advice on preparing for interviews in a separate article.)
Do ‘desk’ research, such as searching Google. As part of this, you can browse everything we’ve written by topic to check whether we’ve covered the question before.
Look for ways to test your uncertainties. For instance, simply applying to lots of jobs is often one of the best ways to learn about your fit (and can double as pursuing your next steps).
Make sure to start with the lowest-cost ways to gain information.
We like to think in terms of a ‘ladder’ of tests, from least to most costly. For example, one such ladder might look like this:
First read our relevant career reviews and do some Google searches to learn the basics (1–2h).
Then speak to someone in the area (2h).
Then speak to three more people who work in the area and read one or two books (20h). You could also consider speaking to a career advisor who specialises in this area.
Then look for a project that might take 1–4 weeks of work, like applying to jobs, volunteering in a related role, or starting a blog on the policy area you want to focus on. If you’ve done the previous step, you’ll know what’s best.
Only then consider taking on a 2–24 month commitment, like a work placement, internship, or graduate study. At this point, being offered a trial position with an organisation for a couple of months can also be an advantage, because it means both parties will make an effort to quickly assess your fit.
Make a list of ways you’ll investigate your highest priority uncertainties in Section 7.3 of your template.
How long to spend investigating
There’s no hard and fast rule for how long you should spend investigating your key uncertainties. We think the stakes are high and most people don’t research their careers enough, but if you’re reading this article, you might well be biased towards over-analysing.
One good indicator that you’ve done enough research is that your best guesses have stopped changing. If you’ve done the cheapest and most informative investigations first, and you’ve stopped changing your plan, then you’ve probably taken the low-hanging fruit and it’s time to act.
One poor indicator is a feeling of confidence. Some uncertainties will not be possible to resolve in the time you have, and you will have to act despite not feeling confident it’s the best move.
It’s also worth considering the stakes of the decision. If a choice concerns what you’ll do over many years, involves large differences between options, or is difficult to reverse, then it’s worth more investigation.1
For instance, medical school is roughly a seven-year commitment, so it could easily be worth spending months researching that decision.
If needed, adjust your plan
After you’ve done some investigation, you might want to update your plan.
In reality, this is not a single step — you might go through several loops of investigation and updating.
You might also start to put your plan into action, and then later realise you need to revise it.
We cover taking action in the next article for simplicity, but in reality you might jump back and forth between acting and adjusting your plan several times.
We mention investigating your uncertainties again in the next article as a type of ‘next action’, so you can move on to that part before finishing the investigations you want to do, and you’ll be prompted again to dive into them.
★Make a judgement call
Eventually, you will need to make a judgement call about the key elements of your plan. This can be difficult. As we noted, it may not be possible to feel confident in your answers. So the aim isn’t to feel confident. Rather, it’s to make a reasonable decision given the information and time available.
If you’ve already completed the steps we cover throughout this career planning series, you’ve already put most of the best decision-making advice into practice. You have:
Broadened your options
Clarified your strategic priorities
Tried to narrow down your options systematically, such as by scoring them against your criteria
Sought feedback from others
Asked why you might be wrong
Investigated your key uncertainties
Considered backup options
Eventually, you just have to make a decision. If you’ve done just half of what we’ve covered, you’ve done a lot, and are likely making a much better decision than you would have otherwise.
So make a judgement call for now. In Section 7.4 of your template, write out your best guess for your career plan, including:
A list of problems you might help solve
Your strategic priorities
A Plan A, consisting of a best-guess next step and some top longer-term options it might help you work toward
Your options for a Plan B, i.e. promising alternatives to your Plan A
A Plan Z
If you’re still hesitating, we have one final step that should help give you a bit of peace of mind: setting review points. Remember that your career is a series of experiments, and you don’t have to figure it all out right away.
★Set review points
It’s easy to constantly second-guess your plan, but that’s probably not the best way to live. You should spend most of your time focused on succeeding within, learning from, and enjoying whatever you’re currently doing.
At the same time, it’s easy to just continue with what you happen to be currently doing for too long.
A way to avoid both issues is to set an explicit review point — a trigger to reassess your plan:
One option is to pick a timeframe, typically 6–24 months (shorter when you’re more uncertain and learning a lot; longer when you’re more settled). Around the new year is often a nice time, and we have an annual career review tool for this purpose.
Another option is to consider when you’ll next gain significant information about your career, and reassess then. For instance, if you’re transitioning careers, decide to review if a year passes and you haven’t yet found a new role. Or if you’re on the tenure clock as an academic, you could decide to review halfway through.
Whatever point you choose, it can be useful to set a concrete intention to review at that point in Section 7.5 of your template — and to set yourself some kind of reminder, for example in your calendar.
What to do when you hit your review point
If you’ve had a significant positive update in favour of the path you’re currently on, then probably stick with it; if you’ve had a significant negative update, seriously consider switching.
If you’ve learned about any new options that might be significantly better than your current focus, consider switching — otherwise continue.
If you do decide to change your plan, you can restart this planning process. Don’t worry, it’ll be a lot quicker the next time around, since you’ll be able to focus only on what’s changed since last time.
Recap: Your plan is done
In this part of the career planning process, you’ve checked your plan, gathered and investigated key uncertainties, and set points in the future at which you’ll review.
You should now have written down:
A list of global problems you want to help solve with your career
A Plan A — a best-guess next career step and some promising longer-term paths to aim toward with it
A Plan B — your next-best backup option — and perhaps a Plan Z
A strategic focus and priorities to guide your decision making
Points to review your plan in the future
Check that you’ve also received some feedback (or are waiting on feedback from someone) and have investigated your key uncertainties (unless you decided to skip doing that for now).
Once you have all that, your plan is done. Congratulations!
Thinking through these questions is hard, and most people never get around to it, despite how much it matters.
If you’ve invested the time, you’re giving yourself your best shot at fulfilling your potential to have a positive impact on the world, and at having a career that’s exciting and personally satisfying too.
If good career planning can increase the positive impact you have with your career — or the satisfaction you get from it — by just 1%, then because a career is typically 80,000 hours long, it’d be worth spending 799 of those hours just planning.
Fortunately, the process will be much faster than that, and we think the rewards of good planning are much larger.
The articles and career planning template on this page are designed to help you make the best possible decisions in planning out your career.
They’re in-depth and based on the best academic research and existing advice we could find. And we’ve tested and refined the advice in them over the years by advising over 1,000 people one-on-one.
Follow the links below to make great career decisions, review your progress, and create a career plan you feel confident in.
This template takes the most important exercises from our career guide, and organises them into a complete career plan. It starts with your longer-term goals, and then shows how to translate them into concrete next steps.
Do you need to decide between a couple of concrete options right now — such as which job offer to accept, which major to select, or which companies to apply to? Use this short process:
Regardless of your career stage, skills, or what area you want to work in, this series will help you figure out what makes for a fulfilling and high-impact career, how to map out a long-term vision, figure out your next steps, and put your career plan into action.
Sign up to complete the series as a weekly course. We’ll email you an article a week alongside some questions to answer to help you write part of your career plan.
We’ll also send you updates on our research and updates on high-impact job opportunities. You can unsubscribe from either in one click.
We recommend you review your career about once a year in order to reflect on where you want to go and whether you need to change direction to get there. To help you, we created an annual career review tool that asks a couple of key questions.
Career planning often involves difficult decisions and judgement calls that you will need to think through for yourself, such as which jobs you’re most likely to enjoy and be good at in the long term.
If you want to have a positive impact, you’ll face even bigger questions about which global problems are most pressing and how to help tackle them. Since the best options are usually new or unconventional, you’ll need to think independently and learn when to bet against the crowd.
This makes it really useful to improve your thinking and decision making, which is a great life skill too. Here are some of our best resources on how to do that.
If you’re interested in working on one of the global problems we highlight, apply to speak with our team one-on-one. We can discuss which problem to focus on, look over your plan, introduce you to mentors, and suggest roles that suit your skills.
Join our newsletter to get a summary of how we can help you have a greater impact with your career. We’ll also send you updates on our research and the highlights from our job board.
You’ll receive about two emails per month, and you can unsubscribe with one click.
Recap: why do some organisations say their recent hires are worth so much?
By Benjamin Todd · Last updated May 6th, 2019 · First published May 5th, 2019
Our 2018 survey found that, for a second year, a significant fraction of organisations reported that they’d need to be compensated hundreds of thousands, or sometimes millions, of dollars to make up for losing a recent hire for three years.
There was some debate last October about whether those figures could be accurate, why they were so high, and what they mean.1 In this post, I outline some rough notes summarising the different explanations for why survey respondents gave such high estimates of the value of recent hires, though I don’t draw firm conclusions about which considerations play the biggest role.
In short, we consider four explanations:
The estimates might be wrong.
There might be large differences in the value-add of different hires.
The organisations might be able to fundraise easily.
Retaining a recent hire allows the organisation to avoid running a hiring process.
Overall, we take the figures as evidence that leaders of the effective altruism community, when surveyed, think the value-add of recent hires at these organisations is very high — plausibly more valuable than donating six figures (or possibly even more) per year to the same organisations. However, we do not think the precise numbers are a reliable answer to decision-relevant questions for job seekers, funders, or potential employers. We think it’s likely that mistakes are driving up these estimates. And even setting aside the high probability of mistakes, the implications of the data depend heavily on exactly what is driving the results. We are very uncertain about the magnitude of the various considerations, so we recommend against leaning on these numbers when making career decisions.
Independently of this data, we believe that these jobs are sometimes very high-impact for some people. This suggests that finding out whether or not you’re a good fit can be valuable, even if most people won’t turn out to be. At the end, we sketch out some (weak) implications for job seekers. We hope to write about our overall views on the current job market in the effective altruism community in the future.
These are just rough notes and not a polished article, but I hope they’ll help to sum up the discussion and let the debate move forward.
A quick recap of the results
Here was the exact question used in the 2018 survey:
For a typical recent Senior/Junior hire, how much financial compensation would you need to receive today, to make you indifferent about that person having to stop working for you or anyone for the next 3 years?
Here are the raw results:
In theory, the ‘value-add’ of an additional employee to an organisation over three years (relative to what would have happened if the organisation did not attempt to hire) is given by:
The extra impact the organisation is able to have over those three years (compared to the counterfactual of not making a hire at all).
Minus the cost of their salary.
Minus the opportunity cost of management time (including the time it takes to run a hiring process).
Minus any other costs of hiring (e.g. changing culture).
If someone else would have been hired otherwise, then the marginal value of a particular hire can be approximated as the value of the person hired (as calculated above) minus the value that would have been created by the person who would have been hired otherwise.
Prelude: Are these figures even plausible?
The claimed values of recent hires seem surprisingly high at first glance, so you might consider dismissing them immediately. Before outlining some potential explanations for the figures, we’d like to suggest that the results are not totally implausible, prima facie.
In particular, the results may seem more reasonable when you consider the impact of some of the organisations in the survey. For instance, GiveWell has about 25 employees and moves around $100m per year towards its top recommended charities, which is $4m per employee per year. A typical salary is under $100k, so even if the opportunity cost of management time is $1m per year per person, the value created over three years by a recent hire who raised the average impact of their recommendations by a few percent could still be millions of dollars. (These are not my actual estimates of the impact of GiveWell employees, they’re just figures to illustrate that it’s possible for a reasonable calculation to come up with very high figures.)
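Putting the decomposition above together with these illustrative figures, a rough back-of-the-envelope version might look like this (the 3% improvement is an arbitrary stand-in for ‘a few percent’, and none of these numbers are real estimates):

```python
# Illustrative figures only, echoing the text above; not actual estimates of GiveWell's impact.
money_moved_per_year = 100_000_000            # ~$100m per year moved to top recommended charities
employees = 25
print(money_moved_per_year / employees)       # $4m moved per employee per year

salary = 100_000                              # typical salary (per year)
management_cost = 1_000_000                   # assumed opportunity cost of management time (per year)
improvement = 0.03                            # suppose a hire raises the average impact of recommendations by ~3%
years = 3

extra_impact = improvement * money_moved_per_year * years
value_add = extra_impact - years * (salary + management_cost)
print(f"${value_add:,.0f}")                   # ~$5,700,000 over three years
```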
Next are four considerations that could (partially or sometimes fully) explain why the figures are so high.
1. The estimates might be wrong
The organisations could easily be mistaken when they give their estimates. We think this is actually pretty likely.2
One way the respondents could be mistaken is by introspecting incorrectly. The question involves predicting how they would trade money against recent hires in a hypothetical. It could be that if a respondent were actually faced with the decision to lose a recent hire or gain such a large amount of money, they’d choose the money.
Another possibility is that the respondents are correctly predicting how they would behave given the tradeoff, but would be wrong to make such a tradeoff. In other words, the recent hires don’t add as much value as the respondents think they do.
It wouldn’t be surprising if the respondents were simply wrong. Our impression is that most of the answers were given with just a couple of minutes of reflection, and so mainly reflect a gut intuition. There’s not much reason to expect these intuitions to be accurate on average in this kind of domain.
If the respondents made one of these types of mistakes, what are some specific possible causes?
These estimates involve weighing up many hard-to-estimate factors, so it’s easy for bias to creep in. For instance, one source of bias is self-promotion or overconfidence. The organisations might be systematically over-optimistic about the value of their hires, or want to give high figures because doing so inflates the perceived value of their hires or of the respondents’ own work.
One particularly difficult aspect of the estimates is that you need to envision what would happen if you lost the hire. The loss is concrete, but it’s hard to envision all the ways you might adjust to it. This could lead people to overestimate how bad the loss would be. This might be analogous to how people typically overestimate how unhappy negative events will make them. In addition, it’s also hard to quickly envision all the useful things you might find to do with additional money.
Another difficulty that could lead to mistakes is accounting for the opportunity cost of management time. If you lost the hire, it would free up senior staff to do something else, which could also produce significant value. It’s not clear how accurately people were assessing the value of lost senior management time, and it would be very easy to forget to do so at all. This could lead people to overestimate the value of recent hires.
This said, there could also be biases and mistakes in the opposite direction.
For instance, it can feel uncomfortable to give high figures, and doing so could discourage donations (since it implies donations don’t achieve much compared to hires).
Another factor that could be easily overlooked is that additional hires not only have some short term impact, but they also speed up the organisation’s growth, and may develop into senior managers themselves, thereby allowing the project to make further hires more quickly years down the line. This growth component might be an important part of their impact, but it’s hard to estimate, so might get undervalued or forgotten. They may also leave and use their experience to do useful work at other organisations in the future. Again, this might be where much of the value of junior hires comes from, but may not be taken into account in the estimates.
Overall, the more suspect the estimates, the less you should update on the results and the more weight you should put on your prior. In our original discussion of these results, we said “unfortunately, we do not have very much confidence in the answers to these questions and would not recommend updating very much based on them.” Overall, we stand by that view.
2. There might be large differences in the productivity of hires in a specific role
If the organisations believe that the people they have hired are significantly more productive in the role than those they could have hired otherwise, and the organisation has a lot of impact per staff member, then they will give high figures for their recent hires.
We expect large differences between recent hires and the next best candidate may often exist, for a couple of reasons.
Firstly, while it’s very hard to get a precise estimate, we believe that the (ex-post) output of different workers in a field usually varies a great deal, especially in skilled jobs. Some people who have studied productivity have argued, for instance, that researchers vary 100-fold in their output.
The differences also get more extreme when you draw from the tail of a distribution.3 An organisation that only hires a couple of highly skilled people each year will be able to choose all its hires from the tail of its hiring pool, so differences among their best candidates will typically be larger.
If some fraction of these differences in output are predictable by an organisation, it could reasonably think that good recent hires are several times more productive than the next best person in their hiring pool.
The situation in effective altruism organisations might be even more severe than with other skilled jobs, if they require an especially rare skill set. Most roles require knowledge of many different aspects of effective altruism, and the pool of people who have that is very small. If the pool is already depleted, then there could be large differences between the best candidate and the next best candidate.
What’s more, the differences in value-add get larger still when you consider costs as well as output.
The costs of hiring are to some extent fixed. For instance, one of the main costs of hiring is that you use up the time of a manager. Let’s use a simple model in which each hire takes up 10% of a manager’s time, and we value that slice of management time at $100,000 per year. (These are just illustrative figures and should not be taken as estimates.)
Then consider three potential hires:
A. Produces $150k per year.
B. Produces $110k per year.
C. Produces $100k per year.
When we subtract the $100k of management time, the value-add is:
A. $50k
B. $10k
C. $0
What’s surprising about this is that there’s only a 36% difference in output between A and B, but their value-add differs by a factor of five. And while A produces only 50% more than C, C’s output merely covers the fixed cost, so it adds no value at all.
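For what it’s worth, here’s the same toy model in a few lines of Python (same illustrative figures as above):

```python
# Same illustrative figures as the model above; not estimates.
management_cost = 100_000                              # fixed cost per hire, per year

output = {"A": 150_000, "B": 110_000, "C": 100_000}    # what each hire produces per year
value_add = {hire: produced - management_cost for hire, produced in output.items()}
print(value_add)                                       # {'A': 50000, 'B': 10000, 'C': 0}

print(round(output["A"] / output["B"] - 1, 2))         # 0.36 -> A produces ~36% more than B...
print(value_add["A"] / value_add["B"])                 # 5.0  -> ...but adds five times as much value
```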
In fact, the situation is more extreme still, because different hires also vary dramatically in their costs. A great hire who’s trusted and independent might take up almost no management time, while someone who ends up being a bad fit could take up a huge amount of it.
Likewise, a great hire can improve the culture and make other staff more motivated, while a bad hire can easily de-motivate everyone else.
While we’re very uncertain of the magnitude of the factors in this section, we believe they could plausibly contribute to very large differences between the value-add of recent hires and the next best candidate, especially among senior roles.
3. The organisations may be able to raise funds easily
If the organisations surveyed are able to raise funds easily, they’ll be willing to give up large potential donations to retain recent hires, and so will give very high answers to the survey question. Note that we only asked the organisations to quantify the value of additional donations to their own organisation. If some organisations are already able to consistently raise funds whenever they want to expand their budgets, then they may be nearly indifferent to additional funding — they simply can’t do much else with more money, and if they do come up with another effective use for it, they can be confident they’ll find the funds from another source.
In theory, the respondents should have considered the possibility of donating unneeded funds somewhere else (at least to the extent that they are trying to maximise the amount of good done overall and are not biased toward their own organisations).4 But it’s not clear that, even if their organisation would regrant the money in this situation, the respondents would have considered this possibility when answering a question framed around the tradeoff between retaining a recent hire and donations directed to them in particular.
This would mean that the figures would mostly reflect organisations placing a low value on marginal donations, as opposed to a high value on staff. The figures would then be very poor evidence for the questions we actually care about – like how much value potential donors and staff should place on direct work.
It does suggest that earning to give and donating to that particular organisation is less useful, but in that case people can direct their donations elsewhere instead.
In the next version of the survey, we could try to ask what a donor should be willing to pay, rather than what the organisation would need in compensation to retain a hire. We could also try to ask about the value of hires in a different unit.
4. Retaining a recent hire also allows you to avoid the cost of running a hiring process
The survey question asks about the value of retaining a recent hire, which is different from the expected value of deciding to hire a similar candidate. Finding a new person requires running a hiring process, which can be expensive. It takes a long time and a lot of effort to find the right people, train them, measure their performance, and build trust. We list some of the costs of hiring here; see also this post by GiveWell.
In a recent post reflecting on Open Phil’s 2018 generalist research analyst recruiting, Luke Muehlhauser wrote that “despite our time-intensive application process and trial period, in most cases we didn’t feel we had a good read on which candidates would be a good fit for our generalist RA roles until roughly 2 months into the in-person trial.”
Let’s say a hiring process took up 1 month of senior staff time. If an organisation values this at $100k, then anyone who is already through the hiring process is worth $100k more to the organisation, when compared to people who haven’t been through yet.
The 1 month of senior staff time and the $100k valuation are not my estimates of the actual cost of hiring. They’re just illustrative figures to show that this could drive up the value-add figures for recent hires. In practice, the cost of hiring varies a great deal between organisations, candidates, and situations.
A potential staff member who has already been identified, is trusted and can hit the ground running has fewer costs, so is worth more to the organisation.
Hiring requires several sequential steps, which means that if you get unlucky, there can be years between identifying a need on staff and successfully onboarding someone who is a good fit. Search, selection, and training can take over a year. If you then discover the person wasn’t a good fit after all, you have to start over. This serial dependency can create bottlenecks to hiring even for an organisation that is committed to expanding. If the organisation is growing rapidly, it can end up hiring far fewer people than it would like to. This increases the value of people who have already been hired.
If the cost of hiring accounts for a lot of the value of recent hires, then this increases the value of retention at impactful organisations (and is a consideration in favor of staying at your current job if you’re already at an impactful organisation where you’re a good fit).
This could also partially explain why some organisations placing a high value on recent hires don’t continue hiring. When considering the value of adding another staff member, they have to subtract out this cost.
If the cost of hiring is a major reason for high figures, then we could say that vetting potential hires is a key bottleneck for EA.
Do the survey results have any decision-relevant implications?
Which of the above considerations play the biggest role in driving the results? I’m unsure. I think each one has an effect but don’t know their relative magnitudes. This is one reason that decision-makers — including job seekers, staff at impactful organisations, potential employers, and funders — should not rely on the precise magnitude of these figures to inform their decision-making. Their implications depend heavily on the (unknown) size of the effects we’ve discussed.
The group for whom these figures do have some relevance is the most recent hires of these organisations. Inasmuch as the willingness to sacrifice donations to retain them is high, that suggests they shouldn’t quit if their alternative would be earning to give to the place they work. However, the large figures might be caused by the organisations surveyed having few funding constraints, so they can’t tell us much about the relative benefits of working for one of these organisations versus earning to give and donating elsewhere.
The group for whom these figures are second most relevant is those choosing between working for a particular organisation and earning to give to that same organisation. Inasmuch as these figures are very high, that suggests they should expand the options they are choosing between to include earning to give and donating to other, more cash-strapped projects.
Beyond that, my personal view is that there’s a good chance the figures are wrong, so they shouldn’t be a major part of anyone’s process for making career decisions. Instead, you could lean on more robust considerations, such as whether the organisation is high-impact, your degree of personal fit with the role, how much career capital you’ll gain, and so on.
I also think that the high cost of hiring processes and ease of fundraising for some organisations increase the figures, making them less useful for most decision-relevant questions.
Nonetheless, for other reasons, I do think (i) some of the organisations in the survey have a lot of impact and (ii) there are large differences in productivity between hires, such that some staff have a very high value-add. This means that at least some positions at these organisations are likely to have a very large impact.5
This would suggest that, if these jobs are among your shortlist, it’s valuable to find out if you might be able to end up as one of these high value-add staff.
However, it’s important to do this while bearing in mind that the highest-impact positions are not all at these organisations. Working at an effective altruism organisation is just one among ten priority paths that we recommend especially highly. There are many high-impact positions in other types of organisation, so it might be even better to focus on testing your fit elsewhere. The rest of this section is written for readers who think working at an effective altruism organisation is on their shortlist of long-term paths.
It’s also important to keep in mind that the base rate for any application being accepted is under 10% – sometimes well under. In addition, there are only a handful of these organisations, so only something like 20 to 40 positions open up in a given year. This means that even someone who has promising fit can’t be confident of landing a position. In the same way that no-one looking for their first consulting job would plan their career around working at one specific firm, no-one should plan their career around getting a job at these 20 or so organisations, which between them have fewer job openings than a large consulting firm.
However, that doesn’t mean it’s not worth applying for these jobs. Since there could be a high upside if it works out, it can be worth making applications or learning more about the area, so long as you can do so in a way that doesn’t set back your career in other areas.
For instance, we sometimes come across people who almost didn’t apply to these positions since they thought they had no hope of landing a job, but then turned out to be successful. It would be good if more people like this applied; however, we’d never encourage someone to only apply to jobs at effective altruism organisations. See making applications as an attempt to learn about personal fit, and apply to other types of jobs as well.
Similarly, if you’re not able to get a job right away, it could be worth taking a position that builds career capital for these positions, but we’d usually only recommend doing this if the position also builds your career capital for other paths. For instance, doing think tank research both prepares you for research jobs at effective altruism organisations and opens up top paths in policy.
One final point to remember is that the high dollar figures don’t necessarily imply that it’s easy to get a job at these organisations. Rather, the high figures might reflect the fact that very few people are a good fit for these jobs, making it harder to get jobs at these organisations.
In brief, exactly how to respond to this situation will vary a lot from person to person and it’s hard to give generic advice. If you’re unsure about the next steps for your career, you can see a summary of our general advice for comparing your options in our decision process.
Appendix I – A possible alternative survey question
At some point we will have the opportunity to survey organisations about the financial value of their hires again.
Ideally the answers would be informative for: i) someone deciding between earning to give and doing direct work at the organisations surveyed; ii) someone choosing between working at different surveyed organisations; iii) someone deciding which of the surveyed organisations to give to; and iv) someone deciding between direct work at those organisations and other direct work options elsewhere. Unfortunately, like most social science, this is not straightforward, and so we will probably have to focus on one of these groups.
Below is a very rough draft of one version of the question we are considering asking. We hope it would be more decision-relevant than the question used above, but we haven’t yet had time to pilot it or vet it for any issues:
Imagine that sometime in the next year you are about to hire your next junior (senior) hire. A genie appears and offers you the following choice. You can have one of the following:
1. The genie will create a person and applicant for the job from thin air. This applicant will be more productive (in % terms) than the next best applicant in the pool by the same margin that your last junior (senior) hire appeared to have over the next best applicant at the point when you were evaluating whether to hire them. This person will live out the rest of their life like any other staff member, and may well go on to do other useful work outside of your organisation later on. You should consider the benefits of that for the world as well.
2. The genie will distribute $X among whichever organisations or people you nominate – which can include you and your organisation – to be used to improve the world as much as possible. Consider all the benefits for the world this would generate.
At what value of X would you be indifferent between these two options?
We would value feedback on the ways this could be a good or bad question.
Some issues to keep in mind include:
A hire is less useful before you've tested them (ex ante) than after you've tested them and decided to keep them (ex post).
Respondents are better able to respond to concrete questions about the past and specific people than hypothetical future ones.
The question should consistently allow for people and funding flowing out to their next best opportunities in the rest of the world if an organisation finds it hard to scale up.
We should compare adding staff and adding money, or removing money and removing staff, to avoid getting confounded by people valuing losses and gains differently. This is another way the question we used previously was not ideal.
The more convoluted the question becomes, to deal with the problems described in this post, the less intuitive it is to answer, which can itself make the results less reliable.
A more recent conversation about the competitiveness of positions at EA organisations has also raised questions about the replaceability and overall value of these jobs, although this post was drafted prior to that discussion. See also 1, 2, 3, 4.↩
In fact, we initially reported these results with a disclaimer about our low level of confidence in them.↩
This is because most distributions are much less dense in their tails than at the center of the distribution. For example, let’s say you have a hundred people and the distribution among them of some trait, call it height, is a standard normal distribution. Then the tallest person and the second tallest person will have heights that are about as different as the difference between the heights of the 36th tallest person and the 50th tallest person. If the distribution of output is fatter tailed than the normal distribution then the difference in output between people at the tails will be even more extreme.↩
Note that some organisations in the effective altruism community avoid this problem with policies preventing them from accumulating more than a certain amount in reserves.↩
I don’t believe this primarily due to the survey results, though the survey results are weak evidence for it.↩
Is it fair to say that most social programmes don’t work?
By Benjamin Todd · Last updated August 2017 · First published July 2017 ·
Image courtesy of A&ETV Beyond Scared Straight. Learn more about the effectiveness of Scared Straight.
Lots of government and charity programmes aim to improve education, health, unemployment and so on. How many of these efforts work?
The vast majority of social programs and services have not yet been rigorously evaluated, and…of those that have been rigorously evaluated, most (perhaps 75% or more), including those backed by expert opinion and less-rigorous studies, turn out to produce small or no effects, and, in some cases negative effects.
This estimate was made by David Anderson in 2008 on GiveWell’s blog. At that time, he was Assistant Director of the Coalition for Evidence-Based Policy.
This has become a widely-quoted estimate, especially in the effective altruism community, and often gets simplified to “most social programmes don’t work”. But the estimate is almost ten years old, so we decided to investigate further. We spoke to Anderson again, as well as Eva Vivalt, the founder of AidGrade, and Danielle Mason, the Head of Research at the Education Endowment Foundation.
We concluded that the original estimate is reasonable, but that there are many important complications. It seems misleading to say that “most social programmes don’t work” without further clarification, but it’s true that by focusing on evidence-based methods you can have a significantly greater impact.
We’ll go through the estimates made by Anderson, Vivalt and Mason in turn, discuss the complications, and try to reach an overall conclusion at the end.
David Anderson’s updated estimates
David Anderson is now Director of Evidence-Based Policy at the Laura and John Arnold Foundation, a multibillion dollar charitable foundation. We reached out to him, and he had some bad news:
If anything, the percentage of programs found to have weak or no effects when rigorously evaluated may even be a bit higher than 75%.
He went on to explain:
I originally gave that quote to GiveWell as a rough estimate based on our organization’s review of hundreds (now probably thousands) of randomised controlled trials conducted across various areas of social policy. Since making that estimate, we’ve looked at this question a little bit more systematically.
Education: Of the 90 interventions evaluated in RCTs commissioned by the Institute of Education Sciences (IES) since 2002, approximately 90% were found to have weak or no positive effects.
Employment/training: In Department of Labor-commissioned RCTs that have reported results since 1992, about 75% of tested interventions were found to have weak or no positive effects.
Medicine: Reviews have found that 50-80% of positive results in initial (“phase II”) clinical studies are overturned in subsequent, more definitive RCTs (“phase III”).
Business: Of 13,000 RCTs of new products/strategies conducted by Google and Microsoft, 80-90% have reportedly found no significant effects.
The current pace of RCT testing is far too slow to build a meaningful number of proven interventions to address our major social problems. Of the vast diversity of ongoing and newly initiated program activities in federal, state, and local social spending, only a small fraction are ever evaluated in a credible way to see if they work. The federal government, for example, evaluates only 1-2 dozen such efforts each year in RCTs.
What counts as a “weak effect”?
One difficulty with these estimates is that they’re sensitive to the definition of a “significant effect”. Some variables include:
The bar for statistical significance.
How large the effect size needs to be relative to the cost.
How the outcomes are chosen.
Our understanding is that Anderson used the standard 5% significance test for (1), and he told us in correspondence that:
We were focused on fundamental (policy relevant) outcomes drawn from individual RCTs, as opposed to meta-analyses. In terms of the effects themselves, I was basing my estimate to Give Well on the general rule we used at the Coalition to determine if something “worked” – i.e., whether it was found in a well-conducted RCT to produce sizable, sustained effects on important outcomes.
Costs weren’t explicitly considered.
We can also look directly at the IES study mentioned above to see their conditions for inclusion, which are in line with this:
In cases where the study measured intermediate outcomes (e.g., teacher content knowledge) and more ultimate, policy-relevant outcomes (e.g., student achievement), we counted the effect on the ultimate outcomes.
In cases where the study measured both interim and longer-term outcomes…we counted the effect on the longer-term outcomes.
Another issue is how the studies are chosen. If you include lots of studies with too few participants, then the percentage that appear to work will be low even if most of them genuinely do – such studies are called 'underpowered'. In the Arnold Foundation's review, however, they say that studies were only included if:
Sample was large enough to detect a meaningful effect of the intervention.
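To see why this matters, here's a minimal simulation sketch in Python (the 0.2 standard deviation effect size, the sample sizes, and the number of simulated trials are illustrative assumptions, not figures from any of the reviews discussed): a programme with a genuine but modest effect will fail to reach significance in most small trials simply because they are underpowered.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def run_trial(effect=0.2, n_per_arm=50):
    """Simulate one two-arm RCT with a true standardised effect size and
    return whether it finds a statistically significant result (p < 0.05)."""
    treatment = rng.normal(effect, 1.0, n_per_arm)
    control = rng.normal(0.0, 1.0, n_per_arm)
    _, p_value = stats.ttest_ind(treatment, control)
    return p_value < 0.05

for n in [50, 200, 1000]:
    power = np.mean([run_trial(n_per_arm=n) for _ in range(2000)])
    print(f"n = {n} per arm: {power:.0%} of trials detect a real 0.2 SD effect")
```

With small samples, most trials of a genuinely effective programme come back "insignificant", so headline figures based on such studies understate the fraction that work.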
Estimates within international development, and meta-analyses vs. RCTs
So far we’ve only talked about estimates for US-based programmes, and we’ve only talked about individual randomised controlled trials rather than meta-analyses — a meta-analysis takes all the existing studies on a programme and combines them, with the aim of providing clearer answers about what works. Eva Vivalt is the Founder of AidGrade, which does meta-analyses of international development interventions, so she was well placed to help.
Vivalt did a couple of quick analyses of their data set of RCTs to show how the statistic depends on the definitions. Note that these are just off-the-cuff estimates and could be revised on further analysis.
To start with:
60-70% of individual RCT results are insignificant.
This is similar to Anderson’s estimate, though a slightly higher percentage work.
However, Vivalt pointed out that it’s an underestimate of the fraction that work, because (i) most studies have too small a sample to pick up the effects (are “underpowered”) and (ii) it includes all outcome measures, including those that aren’t very important.
If we combine the studies by type of intervention (e.g. bednets), and perform a meta-analysis, then:
70-80% of interventions (aggregated up into things like “bed nets”, “deworming”, etc., not individual projects) have at least one positive significant outcome if aggregated using random-effects meta-analysis.
This is now surprisingly high, but this is still not quite the figure we want, because (i) the outcome might not be important and (ii) the effect size might be small relative to cost. Furthermore, if many outcomes are measured but only one is significant, the chances of a false positive are a lot higher, for the reason explained here.
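To make the aggregation step concrete, here is a rough sketch of a standard random-effects pooling calculation (the DerSimonian-Laird method); the study effects and standard errors below are invented, and AidGrade's actual procedure may differ in its details. The point is that several individually insignificant estimates can combine into a significant pooled estimate.

```python
import numpy as np

def random_effects_meta(effects, std_errors):
    """Pool study-level effect estimates with the DerSimonian-Laird
    random-effects method; returns the pooled effect and its standard error."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(std_errors, dtype=float) ** 2
    w = 1.0 / variances                          # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)       # heterogeneity statistic
    df = len(effects) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (variances + tau2)            # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

# Hypothetical studies of one intervention: only one is individually
# significant, but the pooled estimate clearly excludes zero.
effects = [0.05, 0.12, 0.30, -0.02, 0.15]
std_errors = [0.10, 0.09, 0.12, 0.11, 0.08]
pooled, se = random_effects_meta(effects, std_errors)
print(f"pooled effect = {pooled:.2f}, 95% CI ± {1.96 * se:.2f}")
```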
How can we pinpoint which outcomes are important? One option is to look at the intervention-outcome combinations that were addressed by multiple studies – since few studies of a given type of intervention measure the same outcomes, an outcome that several research teams chose to include was probably considered important. Restricting attention to those intervention-outcomes shared in common by at least three papers, we find:
60-70% of intervention-outcomes that were studied have insignificant meta-analysis results.
The mean effect size is about 0.1 standard deviations.
Overall, the picture seems similar to Anderson's estimates, but with a slightly higher fraction working. (They're also in line with other data we've seen, such as JPAL's Policy Lessons.) However, we should expect a higher proportion of meta-analyses to find significant effects compared to individual RCTs. This is for several reasons:
First, there is probably more positive selection with meta-analyses, since people won’t study an intervention unless they think it works, and meta-analysis relies on pulling together results from multiple studies. Vivalt agrees and thinks that it’s an optimistic estimate of the underlying distribution.
Second, many individual studies are underpowered, and so will show no statistically significant effects. However, if the intervention does actually work, then when you combine all the studies into a meta-analysis you will achieve statistical power and find a positive result.
Third, you could imagine that an intervention is ineffective in most circumstances, but occasionally has strong positive effects. Consider three trials:
No significant effect.
No significant effect
Three units of impact.
Then the proportion of individual RCTs that “work” is only 33%, but if we took an average across the studies, the average impact would be 1 unit. This is a gross simplification of what a meta-analysis would do, but illustrates the basic idea. Zooming out, if you think international development works as a whole, then the more studies we combine, the higher the chance of a positive effect.
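As a tiny numerical sketch of that example:

```python
# The three made-up trial results from above (impact in arbitrary units).
results = [0.0, 0.0, 3.0]
significant = [False, False, True]

share_that_work = sum(significant) / len(significant)
average_impact = sum(results) / len(results)

print(f"{share_that_work:.0%} of individual trials 'work'")           # 33%
print(f"but the average impact across trials is {average_impact:.1f}")  # 1.0
```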
A fourth factor might be that a higher proportion of interventions work in international development compared to US social services. If people are poorer, it might be easier to find simple ways to improve their lives that actually work and are large enough to be picked up by studies. In general, we should expect the fraction that “work” to vary by domain.
Finally, Vivalt estimated the fraction of programmes that are tested at all, and made a rough estimate similar to Anderson:
Perhaps only around 1-2% of programmes get evaluated with RCTs.
Meta-analyses of UK education with Danielle Mason
The UK’s Education Endowment Foundation provides a fantastic “toolkit” that summarises the evidence on different UK education interventions, so we can ask the same questions in another domain.
Danielle Mason, Head of Research at the organisation, told us that the toolkit attempts to include all relevant, high-quality quantitative studies:
For each topic in the toolkit we capture all existing English language reviews and studies that meet a certain quality threshold.
Each type of intervention is assessed based on (i) strength of evidence, (ii) effect size and (iii) cost. See how these scores are assessed here.
As of 19 June 2017, there are 34 types of interventions in the “teaching and learning toolkit”, of which 31 had at least one meta-analysis performed on them (i.e. they have a score of at least ⅖ for strength of evidence).
Within that, what fraction of these 31 could be said to “work”? As we’ve said, this depends on the definition you use, since whether something works depends on the ratio of costs and benefits, and there’s no clear dividing line. EEF encourages users to consider the tradeoffs, rather than dividing the intervention-types into those that “work” and “don’t work”. With that said, here are some rough figures for the remaining 31 strands:
2 (6%) had negative effects.
19 (61%) had an impact score of at least “3 points” (measured in months of progress), which is defined as “moderate effect” in the rubric.
Of those 19, one was expensive relative to its effect size, so it might be reasonable to count it as "not working".
2 had an impact score of only “2 points”, but were among the cheapest, so it might be reasonable to count them as “working”.
The percentage that work seems surprisingly high, and is perhaps higher than Vivalt's figures. This is similar to John Hattie's findings – out of 1,200 meta-analyses within education, he found that the average effect size was 0.4 standard deviations, suggesting that a majority of interventions "work". However, many of these are not causal interventions. For instance, the top item in Hattie's list is "teacher estimates of achievement", which just shows that teachers can predict which students will do well, but doesn't tell us how to improve student performance. We would expect the average effect size of the causal interventions to be lower.
That aside, we are not sure why the percentage that work seems higher. It might be that the positive selection effects are stronger in this sample, or that there is more publication bias within education research (as we will come on to).
Other sources to investigate
The Campbell Collaboration does meta-analyses of social programmes and the Cochrane Collaboration does meta-analyses of health interventions. It would be useful to review the proportion of these that are significant, but our rough impression from browsing the database is that about half find insignificant results.
What about the replication crisis?
Even if an RCT finds a positive effect, when another group tries to run the same study (“replicate” the findings), they often find no effect. The fraction that fail to replicate varies by field, but is often in the range of 20-50%.
The replication crisis is most severe in subjects like psychology and education research, which could explain the apparently more positive findings in education set out above. In psychology, even findings backed by multiple meta-analyses and expert consensus have later failed to replicate. You can read a popular account of the failure of "ego depletion" studies here. A recent attempt to replicate multiple studies on "romantic priming" found that the whole effect was likely due to publication bias: the original studies found an average effect size of around 0.5, compared to roughly 0 for the replication studies.
The replication crisis is thought to be happening because existing statistical techniques provide lots of opportunities to increase the apparent significance of the effects, and positive effects are far more likely to be published than negative effects. So, even if you have an RCT showing a positive effect, there's still perhaps a 20-50% chance that the true effect is near zero.
In part for this reason, John Ioannidis famously argued that "most published research findings are false". In a later review of empirical economics research, he and his co-authors estimated that:
Nearly 80% of the reported effects in these empirical economics literatures are exaggerated; typically, by a factor of two and with one-third inflated by a factor of four or more.
We haven’t made a further adjustment for these concerns in any of our estimates above, so they are probably mostly overestimates.
However, these problems are much less serious if we focus on high-quality studies, and even more so if we use meta-analyses, as we have in many of the estimates. If we suppose that 30% of findings will fail to replicate, then if the proportion that seem to work starts at 35%, it’ll drop to 25%.
What’s more, an upcoming paper by Vivalt and others found that the picture is better in development economics, since the field contains a relatively large number of big studies.1
What can we conclude from all the above?
It’s hard to say what fraction of social interventions “work” because:
Only a couple of percent are ever rigorously measured, and many studies are underpowered.
This makes selection effects potentially serious. If researchers tend to study more promising interventions, then the results will paint an overly optimistic picture.
The proportion that “works” is sensitive to (i) the studies that are included, (ii) the outcomes that are included, (iii) where you draw the line for statistical significance, (iv) where you draw the line for effect size relative to cost, (v) whether you focus on individual studies or meta-analyses, and how broadly you aggregate, and (vi) which area you focus on (e.g. health vs education).
A significant fraction of this research might not be trustworthy due to the replication crisis (“p hacking”, publication bias, etc).
However, what might some tentative conclusions be about the proportion that are effective?
If we focus on key fundamental outcome measures:
Of individual projects, when tested with well-powered randomised controlled trials, perhaps over 80% don't "work", i.e. don't deliver a reasonable effect size relative to cost.
Perhaps 1-10% have negative effects.
Of intervention types that have been evaluated with meta-analyses, the proportion that don’t “work” is probably lower, perhaps over 60% instead of 80%, but this is partly just because there is more attention paid to the most promising interventions.
The interventions and projects that haven’t been tested are probably worse, since more research is done on the most promising approaches.
If you consider whole “areas” (e.g. education as a whole), then the average effect is probably positive. This is what you’d expect if the area is making progress as a whole, and there is some pressure, even if weak, to close down poor programmes. This is consistent with many individual projects failing to work, and a small number of projects having strong positive effects, which is what we might expect theoretically.2
The average effect size and proportion of interventions that work probably varies significantly by area.
So is it fair to say "most social programmes don't work"?
I think this is a little ambiguous and potentially misleading. Individual projects mostly don’t work, but whole areas often do have a positive impact. So, if you pick an intervention at random, then on average your impact will be positive, because there’s a small but important chance of you picking one of the good ones.
However, if you can focus on the best interventions in an area according to the evidence, then you can have significantly more impact than the average. For instance, if two-thirds of interventions don't work, then avoiding them gives you roughly three times as much impact as picking whatever you first stumble into at random.
Given that we also can't expect our gut instincts to pick accurately, it's still important to try our best to focus on evidence-based approaches.
How important is it to be “evidence-based”?
That said, the boost of “being evidence-based” isn’t as big as is often made out in the effective altruism community. Suppose 10% of interventions are highly effective, and have 10 units of impact, while 90% don’t work. If you can pick the top 10%, then you’ll have 10 units of impact, while if you pick randomly, then you’ll have:
10% * 10 + 90% * 0 = 1 unit of impact.
So the boost in impact you get from being evidence-based is 10-fold. But, this is an upper bound, because in reality, the other 90% will have some positive impact. Further, your measurements will be imperfect, so you won’t be able to precisely identify the top 10%, further reducing the differences.
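Here's a small sketch of that arithmetic, extended to show how the boost shrinks once the other 90% have some impact and your selection is imperfect (all numbers are illustrative assumptions):

```python
def expected_impact(p_top, top_impact, other_impact, p_correctly_identify):
    """Expected impact per intervention chosen, under a simple two-type model.

    p_top: fraction of interventions that are highly effective
    top_impact / other_impact: impact of the two types, in arbitrary units
    p_correctly_identify: chance your evidence actually points you at a top one
    """
    random_pick = p_top * top_impact + (1 - p_top) * other_impact
    evidence_based = (p_correctly_identify * top_impact
                      + (1 - p_correctly_identify) * other_impact)
    return random_pick, evidence_based

# Idealised case from the text: perfect identification, others have zero impact.
rand, ev = expected_impact(0.10, 10, 0, 1.0)
print(f"random: {rand}, evidence-based: {ev}, boost: {ev / rand:.0f}x")   # 10x

# More realistic: the other 90% average 0.5 units, and your evidence only
# points you at a truly top intervention 70% of the time.
rand, ev = expected_impact(0.10, 10, 0.5, 0.7)
print(f"random: {rand:.2f}, evidence-based: {ev:.2f}, boost: {ev / rand:.1f}x")
```

With those looser assumptions the boost falls from 10-fold to roughly 5-fold, which is why the figures below treat 10-fold as an upper bound.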
In general, the size of the boost from being evidence-based depends on the degree of spread in effectiveness within the area, and how good your measurements are. Global health is probably the best area on these grounds, since we have the most data and there are large differences in cost-effectiveness. But the best interventions in global health are only about ten times more effective than the mean, and the difference will be less after adjusting for measurement error (regression to the mean).
What’s more, it’s possible that the best interventions in an area are not based on current evidence – rather they might involve creating and testing new interventions, or taking a high-risk, high-reward approach like research or policy advocacy. If you only stick to evidence-based methods, you might miss those that are highest-impact. (It depends on whether people are taking too much or too little risk in the area.)
Taken together, a tenfold gain in effectiveness probably represents the maximum you can currently get from picking evidence-based interventions, and in most areas, it's probably more like a two- or threefold gain.3
To illustrate the importance of being evidence-based, people in the community commonly cite the difference between the best interventions and the worst, or the median. This is an interesting figure, because it gives an idea of the spread of effectiveness, but the alternative to being evidence-based is probably more like picking randomly (at worst), rather than systematically picking the worst interventions (or the median). If you pick randomly, then you have a small chance of picking something highly effective by luck, which means your expected effectiveness is equal to the mean. And you might be able to do even better than picking randomly by using theory or your own experience.
A two to tenfold gain in impact from being evidence-based is, by most standards, a big deal, but it’s much smaller than the boost you can get from picking the right problem area in the first place. We’ve argued using our framework that efforts in some common areas can be expected to be over 100 times more effective than others.
We find that the large majority of studies in our sample are generally credible.
Coville, A., & Vivalt, E. (2017, August 14). How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics. Link to preprint.↩
Complex systems often produce “fat tailed” outcomes, where a small number of outcomes are far bigger than the median. For instance, we’ve found that “career success” seems to be like this. (If an outcome is caused by a product of normally distributed factors, then it’ll be log-normally distributed.)↩
Though in the longer-term, it’s valuable to build a culture that attends to evidence. For instance if you don’t attend to rigorous evidence, then you may (1) be influenced by extravagant claims on areas where data hasn’t been collected, and other factors that cause you to pick worse than randomly, and (2) create incentives to add an unbounded quantity of bad interventions (since any intervention will get its lottery share).↩
Should I quit my job? Which of my offers should I take? Which long-term options should I explore?
These decisions will affect how you spend years of your time, so the stakes are high. But they’re also an area where you shouldn’t expect your intuition to be a reliable guide. This means it’s worth taking a more systematic approach.
What might a good career decision process look like? A common approach is to make a pro and con list, but it’s possible to do a lot better. Pro and con lists make it easy to put too much weight on an unimportant factor. More importantly, they don’t encourage you to make use of the most powerful decision-making methods, which can greatly improve the quality of your decisions.
In this article, we present a step-by-step process for making your next career decision. This process draws on the most useful discoveries in decision-making research1 and our experience advising thousands of people one-on-one.
Career decisions usually involve a huge amount of uncertainty. If you sometimes feel stressed or anxious, this is normal. We can’t make your next decision easy, but if you work through this process, we think you’ll be more likely to avoid common mistakes and take the best next step you can.
You can work through the article below, or use a simplified version in our tool.
1. Clarify your decision
First, make sure you have a clear idea of exactly what decision you want to make. Are you choosing where to apply, between two specific offers, which medium-term options to focus on, or something else? When do you need to decide by?
Also note that this process is geared towards choosing between a list of specific options.
2. Write out your most important priorities
Once you’re clear about the next decision you need to make, write out your 4-7 most important priorities in making the decision. When making decisions, people usually focus on too narrow a set of goals. Writing out your list of factors will help you stay focused on what most matters.
We typically recommend that people focus on the factors in our framework, which we think capture most of the key elements in high-impact careers. They include the following:
Career capital — does this option significantly accelerate you towards your longer-term career goals, or otherwise open up lots of good options?
Impact potential — how pressing is the problem addressed and how large a contribution might the typical person in this career make to the problem (in expectation)?
Personal fit — compared to the typical person in this career in the long-term, how productive do you expect to be?
Personal satisfaction — how would this path satisfy other important personal priorities that aren’t already covered?
Exploration value — might this path be an outstanding long-term option that you’re uncertain about and can test out?2
Which factors to focus on depends on the decision you’re making and your career stage.
If you’re earlier in your career and comparing longer-term options, you’ll probably mainly focus on impact potential, personal satisfaction, and fit.
If you’re comparing next steps (rather than longer-term options) and early in your career, then focus on career capital, exploration value, and fit.
If you’re later in your career and comparing next steps, you’d focus more on immediate impact and fit.
If working with a community, you might also consider:
Relative fit — how do your strengths compare to other community members focusing on these issues (which determines your comparative advantage)?
Community capital — does this increase the influence of the community and its ability to coordinate?
You can also try to make the factors more specific based on your situation. What type of career capital is most valuable? What signals best predict impact in the areas you’re focused on? What exactly are your priorities in personal satisfaction? On the latter, it’s important to try to be honest, even about your least noble motivations, or otherwise the path won’t be sustainable.
See a list of all the factors in our framework and a worksheet here.
There are also some other filters to consider:
Do a significant number of people think this option is likely to have a negative impact in a top area? If so, can you modify the option to avoid the risk? If not, eliminate it. Read more about accidental harm.
Does this option pose a significant risk of a long-term negative impact on your happiness or career capital? If so, modify it or eliminate it.
3. Generate more options
One of the most important mistakes when making career decisions is to consider too few options. Some research suggests that even just making sure you consider one extra option improves satisfaction with outcomes.
You probably already have some options in mind, but here the challenge is to generate even more.
One way to generate options is to use the priorities from the step before. What options might best help you achieve those priorities? For example, what might be the best option for career capital, job satisfaction, and so on?
If you’re trying to generate options for your next step (rather than long-term career), then it’s useful to both “work forwards” and “work backwards” to generate options.
Working backwards means starting from your longer-term vision and thinking about the most effective route to getting there.
Working forwards involves looking for great opportunities to learn and have an impact, even if you’re unsure where they’ll lead.
Here are some prompts for working forwards:
Speak to your friends, those working on interesting problems, and people you admire, and ask about what might be a good fit for you.
Check out the jobs listed on our job board — do any of them seem interesting?
What options might you be unusually good at?
What options might help you learn the most?
What ‘open doors’ are available right now? These are interesting opportunities that you happen to have come across and might not be around in the future.
Here are even more prompts to help you come up with more options. Pick and choose whichever seem most useful to think about:
Career capital — What’s the most valuable career capital you have right now? If I wanted to become excellent at a high-impact skill, how could I do that?
Ideal world — What would you do if money were no object? What is your dream job?
Combinations — Are there any ways your top options could be combined to get the best of all worlds?
Elimination — If you couldn’t do any of your top options, what would you do instead?
You can get more advice on working forwards and backwards in our career guide article on career planning.
4. Rank your options
Now you’ve got your options on the table, put them in a rough order according to how well they satisfy the factors you wrote down at step two. Don’t worry too much about accuracy – we just want to get a rough idea at this stage to make it easier to do the next couple of steps.
5. List your key uncertainties
Try to identify the information that is most likely to change your ranking.
The questions people most commonly ask us are often not actually decision relevant. Frequently, people focus on big picture questions that are too hard to settle, so thinking about them is unlikely to change their ranking. It’s also easy to get lost ruminating about the huge variety of issues that can be relevant. Try to focus on the questions that are most relevant.
Some useful questions to consider include:
How could you most easily rule out your top option?
If you had to decide your career tomorrow for the rest of your life, what would you do today?
What were you most uncertain about in making your ranking? Do any of those uncertainties seem easy to resolve?
Some of the most common questions are things like:
Would I enjoy this job?
Could I get this job?
What skills are required to get this job?
How pressing is this problem compared to other issues I could work on?
How much influence would I really have in this position?
Try to make the questions as specific as possible.
6. Go and investigate
Not every decision in life deserves serious research, but career decisions do.
We often find people get stuck analysing their options when it would be better to go and gather information or test them out. For instance, we encountered an academic who wanted to take a year-long sabbatical but wasn't sure where to go. They'd thought about the decision for a while, but hadn't considered visiting their top choice for a week, which would likely have made the decision a lot easier.
When investigating your options, we find it useful to think of a ladder of tests that go in ascending order of cost, and aim to settle the key uncertainties you’ve identified.
We often encounter people considering drastic action – like quitting their job – before trying lower-cost ways to learn more about what's best.
Here’s an example of a ladder of tests:
Read our relevant career reviews and do some Google searches to learn the basics (1-2h).
Then the next most useful thing you can usually do is to speak to someone in the area. The right person can give you more up-to-date and personalised information than what you’ll be able to find written down (2h).
Speak to three more people who work in the area and read one or two books (20h). You could also consider speaking to a careers adviser who specialises in this area. During this, also find out the most effective way for you to enter the area, given your background. Bear in mind that when you’re talking to these people, they are also informally interviewing you – see our advice on preparing for interviews in a later article.
Now look for a project that might take 1-4 weeks of work, like applying to jobs, volunteering in a related role, or starting a blog on the policy area you want to focus on. If you’ve done the previous step, you’ll know what’s best.
Only now consider taking on a 2-24 month commitment, like a work placement, internship or graduate study. At this point, being offered a trial position with an organisation for a couple of months can actually be an advantage, because it means both parties will make an effort to quickly assess your fit.
One of the most useful, but often neglected, steps is to simply apply to lots of jobs. We often find people wondering whether one path is better than another, when if they’d applied, it would have been obvious which one to go for.
If you’re lucky, at some point in these investigations, your next step will become clear.
If it doesn’t, then you can keep going up the ladder of tests until you run out of time, or perceive that your best guess about which option is best is no longer changing (technically, when the value of information is less than the cost of the test). One other rule of thumb is that the higher the stakes of the decision, the more time it’s worth investigating.
The aim is not confidence. You will likely always be uncertain about many aspects of your career. Instead, the aim is to find the best possible ranking using low-cost tests and basic research. Once you’ve done that, the most efficient way to learn more is probably to pick an option and try it out.
7. Make your final assessment
When you’ve finished investigating, it’s time to make a decision. Here are some more decision-making tips to help make your ranking more accurate.
Consider scoring your options
It can be useful to score your shortlist of options from one to ten on each of the factors you listed in step two. There's some evidence that structuring a decision like this can improve accuracy. It can also help to add the scores together and see which option ranks highest. But don't blindly let the total determine your decision – it's mainly a means of probing your thinking.
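For example, here is a minimal sketch of what such a scoring table might look like (the options, factors, and scores are invented for illustration):

```python
# Score each shortlisted option from 1-10 on each factor from step two.
# The totals are prompts for thinking, not a verdict.
factors = ["career capital", "impact potential", "personal fit",
           "personal satisfaction", "exploration value"]

scores = {
    "Think tank research": [8, 6, 7, 6, 7],
    "Tech startup job":    [7, 5, 8, 8, 6],
    "Master's programme":  [9, 4, 6, 7, 8],
}

for option, option_scores in scores.items():
    breakdown = ", ".join(f"{f}: {s}" for f, s in zip(factors, option_scores))
    print(f"{option} (total {sum(option_scores)}): {breakdown}")
```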
When it comes to assessing each factor, there are more tips on what to look for in our career framework article.
Upside downside analysis
If you want to go into more detail in making your assessment, then also consider imagining an upside and downside scenario for your top options to get a sense of the full range of possibilities (instead of thinking narrowly, which is the norm, and is especially misleading in the world of doing good, where often most of your impact comes from the small chance of an outsized success).
A simple way to do that is to consider a ‘success’ and ‘failure’ scenario for each. A more complex option is to consider:
The upside scenario — what happens in a plausible best-case scenario? (To be more precise, that could be the top 5% of outcomes.)
The downside scenario — what happens in a plausible worst-case scenario? (E.g. the worst 5% of outcomes.)
The median — what’s most likely to happen?
In each scenario, consider how good or bad the option will be based on the factors you defined earlier — impact, career capital, learning and so on. One saving grace is that you often learn the most from failures, so the downside scenario is perhaps not as bad as it seems.
If you weight each scenario by its probability, you can make a rough estimate of the expected value of the option – this will often be dominated by the value of the upside scenario.
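For instance, a quick sketch of that calculation with invented numbers:

```python
# Weight each scenario by its probability; values are on whatever scale you
# used for your factors. All numbers here are illustrative.
scenarios = [
    # (label, probability, value of the option in that scenario)
    ("upside (top ~5%)",      0.05, 100),
    ("median",                0.90,   3),
    ("downside (bottom ~5%)", 0.05,  -5),
]

expected_value = sum(p * v for _, p, v in scenarios)
print(f"expected value ≈ {expected_value:.2f}")
# Here the upside scenario contributes most of the total, as is often the case.
```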
You may want to eliminate any options that have unusually large downsides. For instance, if you think pursuing an option might burn you out, bankrupt you, ruin your reputation, or holds another risk that could prevent you from making an impact in the future, it’s probably best to eliminate it so that you can ‘stay in the game’ and continue to have opportunities to contribute in the future. We talk more about Plan Z options later.
If you’re trying to decide which job to focus on for a couple of years, then a big part of your decision should be learning about what might be the best fit for you in the long term (value of information). This can mean it’s best to focus on the path with the best upside scenario rather than the best expected value (provided the downsides are similar). This is because if the upside scenario is realised, you can stick with it, and if it isn’t, you can switch to something else. This asymmetry means it’s rational to be somewhat optimistic.
Check your gut intuition
After you’ve finished your assessments, take a break, and re-rank your options.
Once you’ve made a ranking, notice if your gut intuition feels uneasy about something. You can’t simply go with your gut to make good career decisions, but you shouldn’t ignore your gut either. Your intuition is good at aspects of the decision where you’ve had lots of opportunity to practice with relatively quick feedback, such as whether the other people involved are trustworthy. But your intuition is not good at assessing novel situations, as many career decisions are.
If your gut feels uneasy, try to pinpoint why you’re having that reaction, and whether it makes sense to go with your gut or not in this instance. The ideal of good decision-making is to combine intuitive and systematic methods, and use the best aspects of each.
It’s also a good idea to sleep on it. This may help you process the information. It also reduces the chance that you’ll be unduly influenced by your mood at that moment.
More ways to reduce bias
If you want to go further, here are some other techniques to help reduce bias in your thinking:
Pre-mortem and pre-party: Imagine that you take an option, but two years later you’ve failed and regret the decision — what went wrong? Then imagine that instead the option was way better than you expected — what happened? This helps to expand your views about what’s possible, which tend to be too narrow.
Change the frame. Imagine you've already made the decision: how do you feel? How do you expect to feel one year later? What about 10 years later? What would you advise a friend to do?
Ask other people. Having to justify your reasoning to someone else can quickly uncover holes. You can also ask people where they think you’re most likely to be wrong.
More advanced decision-making techniques
There is much more to say about how to make good decisions. For instance, often decisions come down to predictions, especially about your likely chances of success in an area, and the expected impact of different interventions.
For instance, to make better predictions, you can make base-rate forecasts from many angles and combine them based on their predictive power. You should try to update on your evidence in a Bayesian way. You can break the prediction down into multiple components as a 'Fermi estimate'. And you can try to improve your calibration through training.
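As a minimal sketch of one of these techniques, here is a Bayesian update of your chance of success in a path, starting from a base rate and incorporating one piece of evidence (the base rate and likelihoods are purely illustrative):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Update P(success) after observing a piece of evidence, via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Base rate: suppose ~10% of people entering this path reach the level of
# success you're aiming for. Evidence: strong positive feedback on an early
# project, which (say) 60% of eventual successes get vs 20% of non-successes.
posterior = bayes_update(0.10, 0.60, 0.20)
print(f"updated chance of success ≈ {posterior:.0%}")   # ~25%
```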
Here is some further reading we recommend on decision-making:
Two popular books that go into more depth on the techniques we've covered are Decisive by Chip and Dan Heath and The Signal and the Noise by Nate Silver.
Clearer Thinking has many interactive tools to help you think through key decisions.
8. Make your best guess, and then prepare to adapt
At some point, you’ll need to make a decision. If you’re lucky, one of your options will be clearly better than the others. Otherwise, the decision will be tough.
Don’t be too hard on yourself: the aim is to make the best choice you can given the evidence available. If you’ve been through the process above then you have put yourself in a position to make a well-considered decision.
Here are some further steps you can take to reduce downsides.
Plan B
First, create a backup plan if your top choice doesn’t work out.
What's the most likely reason your top option won't work out?
What will you do in that situation? List any promising nearby alternatives to your Plan A, and call them your 'Plan B'. For instance, if you're already in a job and applying to a master's programme, one possibility is that you don't get into the programmes you want. In that case, your Plan B might be to stay in your job another year.
When doing the above exercise, you might realise it's much easier to switch from option X to option Y than from Y to X – i.e. option X is more reversible than Y.
For instance, everyone in academia agrees that if you leave after completing a PhD, it's hard to re-enter. This is because getting a permanent academic position is very competitive, and any sign that you're not committed will rule you out (especially in certain subjects). This means that if you're unsure about continuing with academia after your PhD, it's often best to continue.
If you haven’t started a PhD, and want to try something else, then it’s best to do that before you start.
It can sometimes be better to take the more reversible option, even if you're less confident it's best: if it turns out not to be, you can switch to your preferred option later anyway.
Ask yourself whether thinking about reversibility and ordering should cause you to re-rank your options.
Plan Z
You may face unforeseen setbacks, so it’s also useful to figure out a ‘Plan Z’. Here are some questions to help you do that.
If you take your top option, what might the worst case scenario be? Many risks are not as bad as they first seem, but pay attention to anything that could permanently reduce your happiness or career capital.
How can you reduce the chances of the worst case happening? It’s difficult to give general advice, but there are often ways to mitigate the risks.
If the worst case scenario does happen, what will you do to cope? Call this your ‘Plan Z’. Some common options include: taking a temporary job to pay the bills, moving back in with your parents, or living off savings. What makes most sense will again depend a lot on your situation.
Is your Plan Z tolerable? If not, then you should probably modify your plan A to build more career capital so that you’re in a better position to take risks (e.g. take a job that lets you save more money). If it is, great – hopefully this exercise will make it easier to commit to your Plan A.
Set a review point
A final point to bear in mind is that your next career step is probably only a commitment for 1-4 years — building a career is a step-by-step process, not a one-off decision — and if you plan ahead to that next revision point, you’ll be better able to focus on your top option in the meantime, as well as be more prepared when it arrives. Here are some extra steps to consider:
Schedule in a time to review your career in six months or a year. We made a career review tool to make it easier.
Set check-in points. Make a list of signs that would tell you you’re on the wrong path, and commit to reassessing if those occur. For example, publishing lots of papers in top journals is key to success in academic careers, so you could commit to reassessing the academic path if you don’t publish at least one paper in a top journal before the end of your PhD.
9. Take action
Once your plan is set, it’s time to focus on execution. How to execute is not the main focus of this article, but here are some further resources.
First, translate your plan into very concrete next steps. Write out what you’re going to do and when you’ll do it. Setting ‘implementation intentions’ makes it significantly more likely you’ll follow through.
To get more ideas on how to increase your chances of success in a path:
Check out our relevant skill pages and career reviews, which sometimes have a section on how to succeed in a path.
One of the most useful steps you can take is to team up with others who want to have an impact. There are many great communities out there, often focused around specific problems. Your first step should probably be to try to meet people in the communities most relevant to you.
We also helped to found the effective altruism community, which is a group of people who use evidence and reason to work out the best ways to have a positive impact. This community is not for everyone, but through it we’ve met some of the most impressive people we know. Find out more about how to get involved.
Notes and references
Some of the sources we drew upon include the following, as well as those listed above:
Ariely, Dan. Predictably Irrational. New York: HarperCollins, 2008.
Arkes, Hal R., and Catherine Blumer. “The psychology of sunk cost.” Organizational behavior and human decision processes 35.1 (1985): 124-140.
Heath, Chip, and Dan Heath. Decisive: How to make better choices in life and work. Random House, 2013.
Hubbard, Douglas W. How to Measure Anything: Finding the Value of "Intangibles" in Business. Wiley, 2007.
Keeney, Ralph L. Value-focused thinking: A path to creative decisionmaking. Harvard University Press, 2009.
Kahneman, Daniel. Thinking, fast and slow. Macmillan, 2011.
Larrick, Richard P. “Broaden the decision frame to make effective decisions.” Handbook of principles of organizational behavior (2009): 461-480.
Tversky, Amos, and Daniel Kahneman. “Judgment under uncertainty: Heuristics and biases.” Science 185.4157 (1974): 1124-1131.↩
If you assess your options in terms of what would happen in a plausible best case scenario, rather than just in terms of expected value, then value of information will already be somewhat captured. This is the ‘upper confidence interval’ algorithm discussed in our podcast with Brian Christian.↩
Checklist for assessing options
1. Score options on priorities
Take your top two to five options, and score them from 1-5 on each of your priorities. We’re not suggesting that you should go with whichever option has the highest total score (see step 5), but explicitly assessing your options on each of your priorities should help you avoid getting misled by irrelevant factors. It also ensures that you’ve thought through each aspect of the decision. This is best done in a spreadsheet. You can use this template.
2. Question your gut
Pay attention to where gut judgements feed into your ranking. Your gut is good at making decisions where you've made lots of similar decisions before, so it's probably good at judging something like "Will I get on with the people I'm working with in this job?" (if you've actually met them). But it's bad at judging questions like "How much will I earn in this career?" or "Is this cause effective?" See more on when to go with your gut.
3. Consider why you might be wrong
Ask yourself why you might be wrong about your ranking. This can help to reduce bias.
4. Focus on future pros and cons
Check that you’ve focused on the future pros and cons of your options, rather than what you’ve done in the past. This can help you avoid the sunk cost fallacy.
5. Check you’re not relying on one or two strong considerations
Have you considered the problem from many angles? Rather than basing your decision on one or two strong considerations, it’s often better to consider the issue from many independent perspectives weighted by their robustness and importance. This style of thinking has been supported by various groups and has several names, including ‘cluster thinking’, ‘model combination and adjustment’, ‘many weak arguments’, and ‘fox style’ thinking.
6. Look for dominant options
If you’re lucky, you’ll find one option seems better or equal from all perspectives than another. You can then rank this option above the one it dominates.
To find work you love, don’t (always) follow your passion
By Benjamin Todd · Last updated May 2016 · First published September 2014 ·
Watch our Executive Director’s TEDx talk, which has been viewed over 4.7 million times, or read a more up-to-date article below (updated May 2016).
“Follow your passion” has become a defining piece of career advice.
The idea is that the key to finding a great career is to identify your greatest interest – “your passion” – and pursue a career involving that interest. It’s an attractive message: just commit to following your passion, and you’ll have a great career. And when we look at successful people, they’re often passionate about what they do.
Now, we’re fans of being passionate about your work. Research shows that intrinsically motivating work makes people a lot happier than a big pay cheque. We also think it’s really important to find something you can excel at.
However, there’s four ways “follow your passion” can be misleading advice.
One problem is that it suggests that passion is all you need. But even if you’re interested in the work, if you lack the other key ingredients of job satisfaction that research has shown are important, you’ll still be unsatisfied. If a basketball fan gets a job involving basketball, but works with people he hates, has unfair pay, or finds the work meaningless, he’s still going to dislike his job.
In fact, “following your passion” can make it harder to satisfy the other ingredients, because the areas you’re passionate about are likely to be the most competitive, which makes it harder to find a good job.
A second problem is that many people don't feel like they have a career-relevant passion. Telling them to "follow their passion" makes them feel inadequate. If you don't have a "passion", don't worry. You can still find work you're passionate about.
Others feel like they have lots of passions, and aren’t sure which one to focus on. We need more precise criteria for comparing our options.
The third problem is that it makes it sound like you can work out the right career for you in a flash of insight. Just think deeply about what truly most motivates you, and you’ll realise your “true calling”. However, research shows we’re bad at predicting what will make us happiest ahead of time, and where we’ll perform best. When it comes to career decisions, our gut is often unreliable. Rather than reflecting on your passions, if you want to find a great career, you need to go and try lots of things.
The fourth problem is that it can make people needlessly limit their options. If you’re interested in literature, it’s easy to think you must become a writer to have a satisfying career, and ignore other options.
But in fact, you can start a career in a new area. If your work helps others, you practice to get good at it, you work on engaging tasks, and you work with people you like, then you'll become passionate about it. The ingredients of a dream job that we've found to be best supported by the evidence are all about the context of the work, not its content. Ten years ago, we would have never imagined being passionate about giving career advice, but here we are, writing this article.
Many successful people are passionate, but often their passion developed alongside their success, rather than coming first. Steve Jobs started out passionate about Zen Buddhism. He got into technology as a way to make some quick cash. But as he became successful, his passion grew, until he became the most famous advocate of "doing what you love".
In reality, rather than having a single passion, our interests change often, and more than we expect. Think back to what you were most interested in five years ago, and you’ll probably find that it’s pretty different from what you’re interested in today. And as we saw above, we’re bad at knowing what really makes us happy.
This all means you have more options for a fulfilling career than you think.
By Benjamin Todd · Last updated May 2023 · First published August 2014 ·
Everyone says it’s important to find a job you’re good at, but no one tells you how.
The standard advice is to think about it for weeks and weeks until you “discover your talent.” To help, career advisers give you quizzes about your interests and preferences. Others recommend you go on a gap yah, reflect deeply, imagine different options, and try to figure out what truly motivates you.
But as we saw in an earlier article, becoming really good at most things takes decades of practice. So to a large degree, your abilities are built rather than “discovered.” Darwin, Lincoln, and Oprah all failed early in their careers, then went on to completely dominate their fields. Albert Einstein’s 1895 schoolmaster’s report reads, “He will never amount to anything.”
Asking “What am I good at?” needlessly narrows your options. It’s better to ask: “What could I become good at?”
That aside, the bigger problem is that these methods aren’t reliable. Plenty of research shows that while it’s possible to predict what you’ll be good at ahead of time, it’s difficult. Just “going with your gut” is particularly unreliable, and it turns out career tests don’t work very well either.
Instead, you should be prepared to think like a scientist — learn about and try out your options, looking outwards rather than inwards. Here we’ll explain why and how.
Reading time: 25 minutes
The bottom line
Your personal fit for a job is the chance that — if you worked at it — you’d end up excelling.
Personal fit is even more important than most people think, because it increases your impact, job satisfaction, and career capital.
Research shows that it’s hard to work out what you’re going to be good at ahead of time. Career tests, trying to introspect, or just “going with your gut” seem like poor ways of figuring this out.
Instead, think like a scientist: make some best guesses (hypotheses) about which careers could be a good fit, identify your key uncertainties about those guesses, then go and investigate those uncertainties.
Look for the cheapest ways of testing your options first, creating a ‘ladder’ of tests. Usually this means starting by speaking to people already working in the job. Later it could involve applying to jobs or finding ways to do short projects that are similar to actually doing the work.
It can take years to find your fit, and you’ll never be certain about it. So even once you take a job, see it too as an experiment. Try it for a couple of years, then update your best guesses.
Early in your career, if you have the security, it can be worth planning to try out several career paths, aiming high, and being ready to quit if something is going so-so rather than great. You can make this easier by carefully considering which order to explore your options, and making good backup plans.
Being good at your job is more important than you think
Everyone agrees that it’s important to find a job you’re good at. But we think it’s even more important than most people think, especially if you care about social impact.
First, the most successful people in a field account for a disproportionately large fraction of the impact. A landmark study of expert performers found that:1
A small percentage of the workers in any given domain is responsible for the bulk of the work. Generally, the top 10% of the most prolific elite can be credited with around 50% of all contributions, whereas the bottom 50% of the least productive workers can claim only 15% of the total work, and the most productive contributor is usually about 100 times more prolific than the least.
So, if you were to plot degree of success on a graph, you'd see the same spiked shape as the graphs we've seen several times before in this guide: most people cluster around relatively modest output, with a long tail of star performers far out to the right.
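To get a feel for just how skewed a distribution like this is, here's a small simulation. The log-normal shape and its parameter are purely illustrative choices on our part, not estimates from the study above; the point is only that a heavy-tailed distribution naturally produces the pattern described.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative heavy-tailed "output per person" distribution (log-normal).
# The sigma value is chosen for illustration only, not fitted to the study.
output = rng.lognormal(mean=0.0, sigma=1.25, size=100_000)

output_sorted = np.sort(output)[::-1]                       # most productive first
top_10_share = output_sorted[:10_000].sum() / output.sum()
bottom_50_share = output_sorted[50_000:].sum() / output.sum()

print(f"top 10% share of output:    {top_10_share:.0%}")    # roughly half
print(f"bottom 50% share of output: {bottom_50_share:.0%}") # a small slice
```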
In the article on high-impact jobs, we saw this in action with areas like research and advocacy. In research, for instance, the top 0.1% of papers receive 1,000 times more citations than the median.
These are areas where the outcomes are particularly skewed, but our review of the evidence suggests that the best people in almost any field have significantly more output than the typical person. The more complex the domain, the more significant the effect, so it’s especially noticeable in jobs like research, software engineering, and entrepreneurship.
Now, some of these differences are just due to luck: even if everyone were an equally good fit, there could still be big differences in outcomes just because some people happen to get lucky while others don’t. However, some component is almost certainly due to skill, and this means that you’ll have much more impact if you choose an area where you enjoy the work and have good personal fit.
Second, as we argued, being successful in your field gives you more career capital. This sounds obvious but can be a big deal. Generally being known as a person who gets shit done and is great at what they do can open all sorts of (often surprising) opportunities.
For example, many organisations will hire someone without experience of their area, if that person has done something impressive elsewhere (e.g. many AI companies have hired people without a background in AI). Charity and company board members are often successful people recruited from other fields. Or you might meet someone in another field who admires your work and wants to work together. (You can hear more on the case for generally kicking ass in our podcast with Holden Karnofsky.)
Moreover, being successful in any field — even if it seems a bit random — gives you influence, money, and connections, which, as we’ve also covered, can be used to promote all sorts of good causes — even those unrelated to your field.
Third, being good at your job and gaining a sense of mastery is a vital component of being satisfied in your work. We covered this in the first article.
All this is why personal fit is one of the key factors to look for in a job. We think of “personal fit” as your chances of excelling at a job, if you work at it.
If we put together everything we’ve covered so far in the guide, this would be our formula for a perfect job:
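The original article presents the formula as an image. As a very rough sketch in our own notation (the three accompanying factors are labelled by us, based on the criteria used later in this article, so treat the labels as an assumption), the structure is:

\[
\text{a great job} \;\approx\; \text{personal fit} \times f(\text{impact},\ \text{supportive conditions},\ \text{career capital})
\]

where \(f\) stands for however the other three factors are combined.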
Personal fit is like a multiplier of everything else, which means it’s probably more important than the other three factors. So, we’d never recommend taking a “high-impact” job that you’d be bad at. But how can you figure out where you’ll have the best personal fit?
Hopefully you have some rough ideas for long-term options from earlier in the guide. Now we’ll explain how to narrow them down, and find the right career for you.
(Advanced aside: if you’re working as part of a community, then your comparative advantage compared to other people in the community is also important. Read more.)
Why introspection, going with your gut, and career tests don’t work
Performance is hard to predict ahead of time
When thinking about which career to take, our first instinct is often to turn inwards rather than outwards: “go with your gut” or “follow your heart.”
People we advise often spend days agonising over which options seem best, trying to figure it out from the armchair, or through introspection.
These approaches assume you can easily work out what you’re going to be good at ahead of time. But in fact, you can’t.
Here’s the best study we’ve been able to find so far on how to predict performance in different jobs over the next couple of years. It’s a meta-analysis of selection tests used by employers, drawing on hundreds of studies performed over 100 years.2 Here are some of the results:
Type of selection test        Correlation with job performance (r)
IQ tests                      0.65
Interviews (structured)       0.58
Interviews (unstructured)     0.58
Peer ratings                  0.49
Job knowledge tests           0.48
Integrity tests               0.46
Job tryout procedure          0.44
GPA                           0.34
Work sample tests             0.33
Holland-type match            0.31
Job experience (years)        0.16
Years of education            0.10
Graphology                    0.02
Age                           0.00
Almost all of these tests are only weakly predictive. Even the best correlation in the table, 0.65, explains less than half of the variation in later performance, and the accuracy for longer-term predictions is probably even worse.3 So even if you try to predict using the best available techniques, you're going to be wrong much of the time: candidates who look bad will often turn out to be good, and vice versa.
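To see what a correlation of this size does and doesn't buy you, here's a small simulation (all numbers are purely illustrative): we draw applicants' test scores and later job performance as correlated normal variables, then check how often the top scorer in a pool of 20 applicants turns out to be the top performer.

```python
import numpy as np

rng = np.random.default_rng(0)
r = 0.65           # correlation between test score and later performance
n_candidates = 20  # hypothetical applicant pool
n_trials = 10_000

hits = 0
for _ in range(n_trials):
    score = rng.standard_normal(n_candidates)
    noise = rng.standard_normal(n_candidates)
    performance = r * score + np.sqrt(1 - r**2) * noise  # correlated with score
    if np.argmax(score) == np.argmax(performance):
        hits += 1

# Typically well below 0.5: the top scorer is often not the top performer.
print(hits / n_trials)
```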
Anyone who’s hired people before will tell you that’s exactly what happens (and there is some systematic evidence for this).4 And because hiring is so expensive, employers really want to pick the best candidates. They also know exactly what the job requires — if even they find it really hard to figure out in advance who’s going to perform best, you probably don’t have much chance.
Don’t go with your gut
If you were to try to predict performance in advance, “going with your gut” isn’t the best way to do it. Research in the science of decision-making collected over several decades shows that intuitive decision-making only works in certain circumstances.
For instance, your gut instinct can tell you very rapidly if someone is angry with you. This is because our brains are wired to warn us quickly when we're in danger and to help us fit in socially.
Your gut can also be amazingly accurate when trained. Chess masters have an astonishingly good intuition for the best moves, because they've trained that intuition by playing lots of similar games and building up a sense of what works and what doesn't.
However, gut decision-making is poor when it comes to working out things like how fast a business will grow, who will win a football match, and what grades a student will receive. Earlier, we also saw that our intuition is poor at working out what will make us happy. This is all because our untrained gut instinct makes lots of mistakes, and in these situations it’s hard to train it to do better.
Career decision-making is more like these examples than being a chess grandmaster.
It’s hard to train our gut instinct when:
The results of our decisions take a long time to arrive.
We have few opportunities to practice.
The situation keeps changing.
This is exactly the situation with career choices: we only make a couple of major career decisions in our life, it takes years to see the results, and the job market keeps changing.
Your gut can still give you clues about the best career. It can tell you things like “I don’t trust this person” or “I’m not excited by this project.” But you can’t simply “go with your gut.”
Many career tests are built on "Holland types" or something similar. These tests classify you as one of six interest types, like "artistic" or "enterprising," and then recommend careers that match that type. However, we can see from the table above that Holland-type match is only weakly correlated with performance. It's also only weakly correlated with job satisfaction (studies find correlations of around 0.1 to 0.3). That's why we don't pay much attention to traditional career tests.
What does work in predicting where you’ll excel, according to the research?
In the table above, interviews rank near the top, which suggests the following method: talk to people who have experience recruiting in the field, and ask them how you’d stack up compared to other candidates. This makes a lot of sense — experts are probably pretty good at making this sort of judgement call.
Job tryout procedures, job knowledge tests, and work sample tests also do well, which suggests another intuitive method: get as close to actually doing the work as possible, and see how it goes. We talk about some ways to do that below.
Surprisingly, IQ tests correlate the most, but they’re not so useful for helping you figure out which kind of job is the best fit for you relative to other jobs (and that’s setting aside the question of what IQ tests actually measure!).
All this said, it’s important to keep in mind that none of these methods work that well. It’s just hard to say where you might be able to excel or not in the future, and this means you should keep an open mind and give yourself the benefit of the doubt — you probably have more options than it first seems!
And ultimately, the only way to find out is to take the plunge and actually try things.
How can you find a job that fits? Think like a scientist.
If it’s hard to predict where you’ll perform best ahead of time, and going with your gut intuition doesn’t work, then we need to take an empirical approach:
Make some best guesses (hypotheses) about which options seem best.
Identify your key uncertainties about those hypotheses.
Go and investigate those uncertainties.
And even when your investigation is complete and you start a job, that too is another experiment. After you’ve tried the job for a couple of years, update your best guesses, and repeat.
Finding the right career for you isn’t something you’ll figure out right away — it’s a step-by-step process of coming to better and better answers over time.
Here are some more tips on each stage.
Make a big list of options
The cost of accidentally ruling out a great option too early is much greater than the cost of investigating it further, so it’s important to start broad.
And since it’s so hard to predict where you’ll excel, that also means it’s hard to rule out lots of paths!
This can also help you avoid one of the biggest decision-making biases: considering too few options. We’ve met lots of people who stumbled into paths like PhDs, medicine, or law because those options felt like the default at the time — but if they’d considered more options, they could easily have found something that fit them better.
We also meet a lot of people who think they need to stick narrowly to their recent experience. For example, they might think that because they studied biology, they should mainly look for jobs that involve biology. But what major you studied rarely matters that much.
So start by making a long list of options — longer than your first inclination. We’ll look into how to do this more in our article on planning.
Figure out your key uncertainties
You don’t have time to try or investigate every job, so you need to narrow down the field.
To start, just make some rough guesses: roughly rank your options in terms of personal fit, impact, and supportive conditions for job satisfaction (plus career capital if you're comparing next steps rather than longer-term paths).
Then ask yourself: “What are my most important uncertainties about this ranking?”
In other words, if you could get the answers to just a few questions, which questions would tell you the most about which option should be top?
People often find the most important questions are pretty simple things, like:
If I applied to this job, would I get in?
Would I enjoy this aspect of the job?
Would the pay be high enough given my student loans?
What’s the day-to-day routine actually like?
Does this job fit with my personal priorities (such as family commitments and having children)?
Now that you have a list of uncertainties, try to resolve them!
Start with the easiest and quickest ways to gain information.
We often find people who want to, say, try out economics, so they apply for a master’s programme. But that’s a huge investment. Instead, think about how you can learn more with the least possible effort: “cheap tests.”
In particular, consider how you might be able to eliminate your top option. Or consider what you might need to find out to move a different option to the top slot.
When investigating a specific option, you can think of creating a ‘ladder’ of tests.
After each step, reevaluate whether the option still seems promising, or if you can skip the remaining steps and move on to investigate another option.
Speak to a few people who work in the area and read one or two books. (20 hours)
Consider using some of the additional approaches to predicting success below.
Given your findings in the previous steps, look for a relevant project that might take 1–4 weeks of work — like applying to jobs, volunteering in a related role, or doing a side project in the area — to see what it’s like and how you perform.
Only then consider taking on a 2- to 24-month commitment — like a work placement, internship, or graduate study. Being offered a trial position with an organisation for a couple of months can be ideal, because both you and the organisation want to quickly assess your fit.
If you’re choosing which restaurant to eat at, the stakes aren’t high enough to warrant much research. But a career decision will influence decades of your life, so it could easily be worth weeks or months of work making sure you get it right.
Try something (and iterate)
You'll never be certain about which option is best, and worse, you may never even feel confident in your best guess.
So when should you stop your research and try something?
If you keep investigating, but your answers aren’t changing, then the chances are you’ve hit diminishing returns, and you should just try something.
Of course, some decisions are harder to reverse or higher stakes than others (e.g. going to medical school). So all else equal, the bigger the decision, the more time you should spend investigating, and the more stable you want your answers to be.
Once you take the plunge and start a job, it helps to remember that even this is just an experiment. In most cases, if you try something for a couple of years and it doesn’t work out, you can try something else.
With each step you take, you’ll learn more about what fits you best.
See our individual career reviews for more advice on how to assess your fit with a specific job, including once you’re already in the path.
Advanced: what are the best ways to predict career fit, according to the research?
Our key advice on predicting fit is to define your key uncertainties and go investigate them in whatever way seems most helpful.
But it’s also true that, based on the research and our experience, some approaches to predicting fit seem better than others.
You can use these prompts to better target your efforts to gain information, and to make better guesses before you start doing lots of investigation.
1. What is the job actually like? We often meet people who speculate on their fit for, say, working in government, but have little idea what civil servants actually do. Before you go any further, try to get the basics down: Can you describe what a typical day might look like? What tasks create value in the job? What does it take to do them well?
2. What do experts say? If you can, ask people experienced in the field how well you'd perform — especially people with experience recruiting for the job in question. But be careful — don't put too much weight on a single person's view! And try to find people who are likely to be honest with you.
3. What's been working for you so far?5 One simple method of predicting your success is to project your track record forward. If you've been succeeding in a path, that's normally a good reason to continue. You can also use your track record to make more precise estimates of your chances. For instance, if roughly the top half of your grad school class goes on into academia and you're in the top 25% of the class, you could roughly guess you'll end up in the top 50% of academia.6 (There's a short worked sketch of this just after this list.) To get a better sense of your potential over the long term, look at your rate of improvement rather than just your recent performance.7
4. What drives success in the area, and how do you stack up? Your answers to steps 1–3 give you a starting point, which you can then adjust up or down based on specific factors that could increase or decrease your chances of success. The aim is to develop a model of what's needed for success. You can do this by asking people in the field what's most needed, and trying to understand what causes people to succeed or not. Then assess how you stack up on these predictors. This is how (good) job interviews work: they try to identify the traits most important for the job, and then ask you for evidence that you've displayed those traits in the past.
5. Do you feel excited about it? Gut-level motivation isn't a reliable predictor of success. But if you don't feel motivated, you probably won't be able to put in the effort you'd need to perform well. So a lack of excitement should definitely give you pause; it might be worth exploring what precisely you find uninspiring.
6. Will you enjoy it? This matters even if you mainly care about social impact: to stick with any career for long enough to make a difference, it'll need to be reasonably enjoyable and fit with the rest of your life. For example, if you want to have children, you'll probably want a job without extreme working hours.
7. Combine all these perspectives. Predicting career success is hard, and no single approach is reliable. So it's useful to consider all of the perspectives above, and focus on options that seem good from several of them.
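Here's the worked sketch mentioned under "What's been working for you so far?": a purely illustrative calculation that projects a within-class percentile into a rough within-field percentile, under the simplifying assumption that only the top fraction of the class continues into the field.

```python
def field_percentile(class_percentile, fraction_continuing):
    """If only the top `fraction_continuing` of a class stays in the field,
    convert a within-class percentile into a rough within-field percentile."""
    cutoff = 1 - fraction_continuing
    if class_percentile <= cutoff:
        return None  # below the cutoff, you wouldn't be in the continuing group
    return (class_percentile - cutoff) / fraction_continuing

# Top 25% of the class (75th percentile), with the top half continuing into academia:
print(field_percentile(0.75, 0.5))  # 0.5 -> roughly the top 50% of academia
```

As the footnote below notes, treat this as a first approximation only, since some of your past performance will be down to luck.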
Suppose you’ve decided to try a job for a few years. You now face a tradeoff: should you stick with it, or quit with the hope of finding something better?
Many successful people explored a lot early in their career. Tony Blair worked as a rock music promoter before going into politics. Maya Angelou worked as a cable car conductor, a cook, and a calypso dancer before she switched into writing and activism, while Steve Jobs even spent a year in India on acid, and considered moving to Japan to become a Zen monk. That's some serious exploration.
Examples of people who specialised early, like Tiger Woods, often stand out to us — but it doesn’t seem necessary to specialise that early, and it’s probably not even the norm. In the book Range: Why Generalists Triumph in a Specialized World, David Epstein argues that most people try several paths, and that athletes who try several sports before settling on one tend to be more successful — holding up Roger Federer as a foil to Tiger.
A 2018 study in Nature found that “hot streaks” among creatives and scientists tended to follow periods of exploring several areas.
And today, it’s widely accepted that many people will work in several sectors and roles across their lifetime. The typical 25- to 34-year-old changes jobs every three years,8 and changes are not uncommon later too.
And if personal fit is as important as we’ve argued, it could be worth spending many years finding the job that’s best for you.
But of course, exploring is also costly. Changing career paths can take years, and if you do it too often it can look flaky. Also, some paths can be hard to reenter once you’ve left them.
Steve Jobs liked to say you should “never settle.” But that’s not realistic advice. The real question is how to balance the costs of exploration with the benefits.
Fortunately, there's been plenty of research on this question in decision science, computer science, and psychology. For instance, we interviewed Brian Christian, author of Algorithms to Live By: The Computer Science of Human Decisions, about what this research implies, and we have an advanced series article all about it.
These are some of the key findings.
Explore more when you’re young
Everyone agrees that the earlier you are in your career, the more exploratory you should be.
This is because the earlier you discover a better option, the longer you have to take advantage of it.
If you discover a great new career at age 66 and retire at age 67, you’ve only benefited for one year. But if you discover something new at age 25, you may have decades to enjoy it.
In addition, early on you know relatively little about your strengths and options, so you learn a lot more from trying things.
Society is also structured to make it easier for younger people to explore — for instance, many internships are only available to people who are still at college — so the costs of trying other paths are also lower when younger.
Consider trying several paths (with careful ordering)
One exploration strategy is to try several paths, and then commit to whichever seems best at the end. (This is similar to the solution to the (anachronistically named) “secretary problem” in computer science, which is about how long to spend searching for the best candidate to hire from a pool of applicants.)
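Here's a minimal simulation of that "look, then leap" rule, as a sketch (the pool size and trial counts are arbitrary). With a pool of 20 options, success at picking the single best option peaks when the initial look phase covers roughly a third of the pool, which is the classic ~37% result.

```python
import numpy as np

rng = np.random.default_rng(0)

def secretary_trial(n, look_fraction):
    """One run: observe the first `look_fraction` of options without committing,
    then take the first later option that beats everything seen so far."""
    values = rng.permutation(n)              # hidden qualities, seen in random order
    cutoff = int(n * look_fraction)
    best_seen = values[:cutoff].max() if cutoff > 0 else -1
    for v in values[cutoff:]:
        if v > best_seen:
            return v == n - 1                # did we land the single best option?
    return values[-1] == n - 1               # otherwise we're stuck with the last one

n, trials = 20, 10_000
for frac in (0.1, 0.37, 0.6):
    wins = sum(secretary_trial(n, frac) for _ in range(trials))
    print(frac, wins / trials)               # success peaks near the ~37% look phase
```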
This strategy is most suitable while an undergraduate or in your first couple of jobs, when exploration is easiest and most valuable, and when your uncertainties are greatest.
The main downside of this strategy is that it’s costly to try out several paths. However, it’s often possible to reduce the costs significantly by carefully ordering your options. For example, you can try out a surprising number of paths between undergraduate and graduate school, during summer breaks, or by putting more reversible options first.
Here’s some more detail on how to order your next steps:
1. Explore before graduate school rather than after (and put other reversible options first)
In the couple of years right after you graduate, you’re not expected to have your career figured out right away — generally, you have licence to try out something more unusual, like starting a business, living abroad, or working at a nonprofit.
If it doesn’t go well, you can use the “graduate school reset”: do a master’s, MBA, law degree, or PhD, which lets you return to a standard path.
We see lots of people rushing into graduate school or other conventional options right after they graduate, which makes them miss one of their best opportunities to explore.
It's especially worth exploring before a PhD rather than after. At the end of a PhD it's hard to leave academia. This is because the path from a PhD to a postdoc, and then into a permanent academic position, is highly competitive — you're unlikely to succeed if you don't focus 100% on research. So, if you're unsure about academia, try out alternatives before your PhD if possible.
Similarly, it’s easier to go from a position in business to a nonprofit job than vice versa, so if you’re unsure between the two, take the business position first.
2. Choose options that let you experiment
An alternative approach is to take a job that lets you try out several areas by:
Letting you work in a variety of industries. Freelance and consulting positions are especially good for this.
Letting you practice many different skills. Jobs in small companies are often especially good on this front.
Giving you the free time and energy to explore other things outside of work.
3. Try something on the side
If you’re already in a job, think of ways to try out a new option on the side. Could you do a short but relevant project in your spare time, or in your existing job?
If you’re a student, try to do as many internships and summer projects as possible. Your university holidays are one of the best opportunities in your life to explore.
4. Consider including a wildcard
One drawback of the strategies above is that your best path might well be something you haven’t even thought of yet.
This is why in computer science, many exploration algorithms have a random element — making a random move can help avoid settling into a 'local optimum.' While we wouldn't recommend literally picking randomly, the fact that even computer algorithms find randomness helpful illustrates the value of trying something very different (there's a small illustrative sketch at the end of this section).
That could mean trying something totally outside your normal experience, like living in a very different culture, participating in different communities, or trying different sectors from the ones you already know (e.g. nonprofits, government, corporate).
For instance, I (Benjamin) went to learn Chinese in China before I went to university. I didn’t have any specific ideas about how it would be useful, but I felt I learned a lot from the experience, and it turned out to be useful when I later worked to create our resources for people working on China-Western coordination around emerging technologies.
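Returning to the computer-science point from the "wildcard" step above, here's a minimal epsilon-greedy sketch. The path names and payoff numbers are entirely invented for illustration; the point is just that occasionally picking an option at random stops you from locking in a mediocre early favourite.

```python
import random

# Hypothetical "arms": options with average payoffs unknown to the decision-maker.
true_payoff = {"research": 0.8, "policy": 0.5, "nonprofit": 0.6}

estimates = {path: 0.0 for path in true_payoff}
counts = {path: 0 for path in true_payoff}
epsilon = 0.1  # 10% of the time, try something at random instead of the current favourite

for step in range(1_000):
    if random.random() < epsilon:
        path = random.choice(list(true_payoff))       # explore: random move
    else:
        path = max(estimates, key=estimates.get)      # exploit: current best guess
    reward = random.gauss(true_payoff[path], 0.3)     # noisy feedback from trying it
    counts[path] += 1
    estimates[path] += (reward - estimates[path]) / counts[path]  # running average

print(estimates)  # the random moves keep the estimates from freezing on an early pick
```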
Jess – a case study in exploring
Here’s a real-life example: when Jess graduated from maths and philosophy, she was interested in academia and leaned towards studying philosophy of mind, but was concerned that it would have little impact.
So the year after she graduated, she spent several months working in finance. She didn’t think she’d enjoy it, and she turned out to be right, so she felt confident eliminating that option. She also spent several months working in nonprofits, and reading about different research areas.
Most importantly, she spoke to loads of people, especially in the areas of academia she was most interested in. This eventually led to her being offered a place to do a PhD in psychology, focusing on how to improve decision-making by policymakers.
During her PhD, she did an internship at a think tank that specialised in evidence-based policy, and started writing about psychology for an online newspaper. This allowed her to explore the ‘public intellectual’ side of being an academic, and the option of going into policy.
At the end of her PhD, she could have either continued in academia, or switched into policy or writing. She also could have gone back to finance or the nonprofit sector. Most importantly, she had a far better idea of which options were best.
A rational reason to shoot for the stars
Young people are often advised to “dream big,” “be more ambitious,” or “shoot for the stars” — is that good advice?
Not always. When asked, more than 75% of Division I basketball players thought they would play professionally, but only 2% actually made it. Whether or not the players in the survey were making a good bet, they overestimated their chances of success… by over 37 times.
Telling people to aim high doesn’t make sense when people are so overconfident in their chances of success.
But when you’re more calibrated, it often is good advice.
Suppose you’re comparing two options:
Earning to give as a software engineer
Research in AI safety
Imagine you think your chances of success in research aren’t very high, so most likely you’ll have more impact earning to give. But, if you do succeed in research, it would be much higher impact.
If you only get one shot to choose, you should earn to give.
But the real world isn’t normally like that. If you try the research path, and it doesn’t work out, you can most likely go back to earning to give. But if it does work out, then you’ll be in a much higher-impact path for the rest of your career.
That is, there’s an asymmetry. This means that if you can tolerate the risk, it’s better to try research first.
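To make the asymmetry concrete, here's a toy expected value calculation. All of the numbers are invented for illustration; the structure is what matters.

```python
# Purely illustrative numbers, not estimates from the article:
p_success = 0.2          # chance the research path works out
impact_research = 10     # relative impact if it does
impact_earning = 1       # relative impact of earning to give
fallback_discount = 0.9  # a failed research attempt slightly delays earning to give

ev_earning_only = impact_earning
ev_try_research_first = (p_success * impact_research
                         + (1 - p_success) * fallback_discount * impact_earning)

print(ev_earning_only, ev_try_research_first)  # 1 vs 2.72
```

With these made-up numbers, trying research first more than doubles the expected impact, even though it fails most of the time.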
More generally, you stand to learn the most from trying paths that:
Might be really, really good…
But that you’re very uncertain about.
In other words, long shots.
In this sense, the advice to shoot for the stars makes sense, especially for young people.
An aggressive version of this strategy is to rank your options in terms of upside — that is, how good they would be if they go unusually well (say in the top 10% of scenarios) — then start with the top-ranked one. If you find you’re not on track to hit the upside scenario within a given time frame, try the next one, and so on.
This is usually only suitable if you have good backup options and the fortunate position of being able to try lots of things.
A more moderate version of this strategy is to use it as a tiebreaker: when uncertain between two options, pick the one with the bigger potential upside.
Be more willing to quit
The sunk cost bias is the tendency to keep doing something that doesn't make sense anymore, just because of how much you've already paid for it. It leads us to expect people to:
Continue with their current path for too long.
Want to avoid the short-term costs of switching.
Be averse to leaping into an unknown new option.
This all suggests that if you’re on the fence about quitting your job, you should quit.
This is exactly what an influential randomised study found. Steven Levitt recruited tens of thousands of participants who were deeply unsure about whether to make a big change in their life. After receiving some advice on how to make hard choices, those who remained truly undecided were given the chance to flip a coin to settle the issue — 22,500 did so.
Levitt followed up with these participants two and six months later to ask whether they had actually made the change, and how happy they were on a scale of 1 to 10. It turned out that people who made a change on an important question gained 2.2 points of happiness out of 10!
Of course, this is just one study, and we wouldn’t be surprised if the effect were smaller on replication. But it lines up with what we’d expect.
Apply this to your own career
In the earlier articles, you should have made a list of some ideas for longer-term career paths to aim towards.
Now you could start to narrow them down.
Make a rough guess at which longer-term paths are most promising on the balance of: impact, personal fit, and job satisfaction.
What are some of your key uncertainties about this ranking? List out at least five.
How might you be able to resolve those key uncertainties as easily as possible? Go and investigate them. Consider doing one or two cheap tests.
Which option do you think has the most upside potential?
If you were going to try out several longer-term paths, what would be the ideal way to order those tests?
How confident do you feel in your longer-term options? Do you think you should (i) do more research into comparing your longer-term options? (ii) try to enter one (but with a backup plan)? (iii) plan to try out several longer-term paths? or (iv) just gain transferable career capital and figure out your longer-term paths later?
If you want to think more about your longer-term options, try our full process for comparing a list of career options.
We like to imagine we can work out what we’re good at through reflection, in a flash of insight. But that’s not how it works.
Rather, it’s more like a scientist testing a hypothesis. You have ideas about what you can become good at (hypotheses), which you can test out (research and experiments). Think you could be good at writing? Then start blogging. Think you’d hate consulting? At least speak to a consultant.
If you don’t already know your “calling” or your “passion,” that’s normal. It’s too hard to predict which career is right for you when you’re starting out, and even sometimes when you’re many years in.
Instead, go and try things. You’ll learn as you go, heading step by step towards a fulfilling career.
Once you’ve chosen an area, how can you ensure you succeed? That’s what we cover in the next article. After that, we’ll show how to fit everything together into a career plan.
Notes and references
Simonton, Dean K. “Age and outstanding achievement: What do we know after a century of research?” Psychological Bulletin 104.2 (1988): 251. PDF↩
Schmidt, Frank L., and John E. Hunter. “The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 100 years of research findings.” Working paper (2016). PDF↩
We’re pretty confident this is true, because uncertainty compounds over time.
For example, Ericsson has argued that the best predictor of expert performance over longer time frames is how much ‘deliberate practice’ someone has done.
But a 2014 meta-analysis found that this only explained about 20% of the variance, and that was in fields like sport, chess, and music, where deliberate practice is comparatively more important. For professions, it explained only about 1%.
This suggests that even the best predictor we have doesn’t tell us that much.↩
Career choices, for instance, are often abandoned or regretted. An American Bar Association survey found that 44% of lawyers would recommend that a young person not pursue a career in law. A study of 20,000 executive searches found that 40% of senior-level hires “are pushed out, fail or quit within 18 months.” More than half of teachers quit their jobs within four years.
If the outcome of a choice of career path is dominated by ‘tail’ scenarios (unusually good or bad outcomes), which we think it often is, then you can approximate the expected impact of a path by looking at the probability of the tail scenarios happening and how good/bad they are.↩
If we suppose that the 50% with the best fit continue to academia, then you’d be in the top half. In reality, your prospects would be a little worse than this, since some of your past performance might be due to luck or other factors that don’t project forward. Likewise, past failures might also have been due to luck or other factors that don’t project forward, so your prospects are a bit better than they’d naively suggest. In other words, past performance doesn’t perfectly predict future performance.↩
Median employee tenure was generally higher among older workers than younger ones. For example, the median tenure of workers ages 55 to 64 (9.8 years) was more than three times that of workers ages 25 to 34 years (2.8 years). Also, a larger proportion of older workers than younger workers had 10 years or more of tenure. For example, among workers ages 60 to 64, 53 percent had been employed for at least 10 years with their current employer in January 2022, compared with 9 percent of those ages 30 to 34.