We aim to list issues where each additional person can have the most positive impact. So we focus on problems that others neglect, which are solvable, and which are unusually big in scale, often because they could affect many future generations — such as existential risks. This makes our list different from those you might find elsewhere.

It’s also a constant work in progress, doubtless incomplete and mistaken in some ways, and may not align with your worldview — so we also provide a guide to making your own list. To learn why we listed a specific issue and how you can help tackle it, click through to the profiles below, or see the FAQ further down this page.

Get notified when we add new problems to these lists.

Join over 150,000 people who get our newsletter — you’ll receive updates twice a month on our latest research into the world’s most pressing problems and how to solve them.

Our list of the most pressing world problems

These areas are ranked roughly by our guess at the expected impact of an additional person working on them, assuming your ability to contribute to solving each is similar. But there’s a lot of variation within each issue, so it could easily be better to pursue a path that’s a great fit or a great opportunity in one ranked lower down.

  • The development of AI is likely to greatly influence the course we take as a society. If it goes badly, however, we think it could pose an existential threat.

  • Biotechnological developments threaten to make much deadlier pandemics possible, due to accidental leaks or malicious use of engineered pathogens.

  • We are part of effective altruism, so we might be biased — but we think growing and improving this network of people working on solving the world’s most pressing problems is one way to do a lot of good.

  • Rigorously investigating how to prioritise global problems and best address them will make the efforts of people aiming to do good more effective.

  • Nuclear weapons were the first genuine man-made existential threat. Despite some progress, we have not reduced the threat of nuclear war enough.

  • Can the decision-making processes of the most powerful institutions be improved to make important decisions better in a range of areas?

  • Beyond the suffering it’s already causing, worse climate change could increase existential risks from other causes and affect standards of living far into the future.

  • We haven’t yet fully reviewed this issue, but it seems like one of the biggest risk factors for existential catastrophe. We don’t yet know what individuals can do to help, but plan to investigate.

We think these issues present many opportunities to have a big positive impact. If you want to help tackle them, check out our page on high-impact careers.

Similarly pressing but less developed areas

We’d be equally excited to see some of our readers (say, 10–20%) pursue some of the issues below — both because you could do a lot of good, and because many of them are especially neglected or under-explored, so you might discover they are even more pressing than the issues in our top list.

There are fewer high-impact opportunities for working on these issues — so to make progress you’ll need especially good personal fit and a more entrepreneurial approach.

  • If we make it more likely that the world’s population could eventually recover from a catastrophic collapse, we could save the possibility of a flourishing future even if a catastrophe does occur.

  • Worse than extinction would be a long future of great suffering. The study of these suffering risks (‘s-risks’) aims to specifically minimise the chance of a terrible outcome.

  • We may soon create machines capable of experiencing happiness and suffering, whose wellbeing will matter just like our own. But our understanding of consciousness is so incomplete that we might not even realise when this becomes possible.

  • If we could effectively spread positive values — like (we think!) caring about the wellbeing of all sentient beings impartially — that could be one of the broadest ways to help with a range of problems.

  • If a totalitarian regime ever becomes technologically advanced enough and gains enough global control, might it persist more or less indefinitely?

  • Even as investment in space increases, we have very little plan for how nations, companies, and individuals will interact fairly and peacefully there.

  • There are many ‘public goods’ problems, where no one is incentivised to do what would be best for everyone. Can we design mechanisms and institutions to mitigate this issue?

  • The ability to manipulate the creation of molecules would plausibly have large impacts and could be crucial in many of the worst — and best — case scenarios for advanced AI.

  • Some of the worst possible futures might be less likely if we better understood why some people intentionally cause great harm (and how that harm could be limited).

  • The world’s most pressing problems pose immense intellectual challenges. Better reasoning by researchers and decision-makers could give us a better shot at solving them.

More world problems we think are important and underinvested in

We’d also love to see more people working on the following issues, even though, given our worldview and our understanding of the individual issues, we’d guess many of our readers could do even more good by focusing on the problems listed above.

Problems many of our readers prioritise

Factory farming and global health are common focuses in the effective altruism community. These are important issues on which we could make a lot more progress.

  • Every year, billions of animals suffer on factory farms, where standards of humane treatment generally range from low to nonexistent.

  • Preventable diseases like malaria kill hundreds of thousands of people each year. We can improve global healthcare and reduce extreme poverty with more funding and more effective organisations.

Other underrated issues

There are many more issues we think society at large doesn’t prioritise enough, where more initiatives could have a substantial positive impact. But they seem either less neglected and tractable than factory farming or global health, or smaller in expected scale of impact.

  • Digitally running specific human brains — ‘mind uploading’ — might be a safer way to get some of the benefits of artificial intelligence, but might also pose its own risks.

  • There is an unfathomable number of wild animals. If many of them suffer in their daily lives and if we can find a (safe) way to help them, that would do a lot of good.

  • Liberal democracies seem more conducive to innovation, freedom, and possibly peace. There’s a lot of effort already going into this area, but there may be some ways to add more value.

  • Keeping people from moving to where they would have better lives and careers can have big negative humanitarian, intellectual, cultural, and economic effects.

  • The algorithms that social media companies employ to curate content may be contributing to harmful instability and erosion of trust in many societies.

  • Incentives shaped by universities and journals affect scientific progress. Can we improve them, e.g. to speed up development of beneficial technologies (and limit the proliferation of risky ones)?

  • Faster economic growth could improve global standards of living and cooperation, and might help future generations flourish.

  • Depression, anxiety, and other conditions directly affect people’s wellbeing. Finding effective and scalable ways to improve mental health worldwide could deliver large benefits.

  • First-past-the-post voting is common in high-stakes elections, such as the US presidential election. Voting theorists widely agree that it is one of the worst systems in use.

Frequently asked questions

Our aim is to find the problems where an additional person can have the greatest social impact — given how effort is already allocated in society. The primary way we do that is by trying to compare global issues based on their scale, neglectedness, and tractability. To learn about this framework, see our introductory article on prioritising world problems.

To assess the problems based on this framework, we mainly draw upon research and advice from subject-matter experts and advisors in the effective altruism research community, including the Global Priorities Institute, Rethink Priorities, and Open Philanthropy, though we also make some of our own judgement calls in borderline cases.

To see the reasons why we listed each individual problem, click through to the full profiles.

Assessments of the scale and tractability of different global issues depend on your values and worldview. You can see some of the most important aspects of our worldview in the ‘foundations’ section of our key ideas series, especially our article on how we define social impact.

All this has led to a few themes in the issues we tend to prioritise most highly:

  • Emerging technologies and global catastrophic risks. New transformative technologies may promise a radically better future, but also pose catastrophic risks. We think that mitigating these risks, while increasing the chance these technologies allow future generations to flourish, may be the crucial challenge of this century. Though there is a growing movement working to address these issues, work on mitigating many risks remains remarkably neglected — in some cases receiving attention from only a handful of researchers.
  • Building capacity to explore problems. Comparing global issues involves lots of uncertainty and difficult judgement calls, and there have been surprisingly few serious attempts to make such big-picture comparisons, so we’re strongly in favour of work that might help resolve some of this uncertainty — whether in the form of research or in the form of trying to see what works in more speculative areas.
  • Building communities to solve problems. We think it can be extremely valuable to invest in organisations and communities of people who are trying to do good as effectively as possible. We’re especially keen to build the effective altruism community, because it explicitly aims to work on whichever global challenges will be most pressing in the future. We count ourselves as part of this community because we share this aim.

We think some problems are much bigger and more neglected than others, such that by choosing carefully, an additional person can have a far greater impact.

Holding all else equal, we think that additional work on the most pressing global problems can be between 100 and 100,000(!) times more valuable in expectation than additional work on many more familiar social causes, like developed world education. In those areas, your impact is typically limited by the smaller scale of the problem (e.g. because it only affects people in one or a few countries), or because the best opportunities for improving the situation are already being taken by others. Moreover, it seems like some of the issues in the world that are biggest in scale — especially those that could affect the entire future of humanity, like mitigating risks from AI or biorisks — are also among the most neglected. This combination means you can have an outsized impact by helping tackle them.

For this reason, we think our most important advice for people who want to make a big positive difference with their careers is to choose a very pressing problem to work on. This page is meant to help readers make that choice. Read more about the importance of choosing the right problem.

Graph showing higher expected impact if you focus on pressing global issues

A key consideration for where to work is how society is currently allocating resources. If an important problem is already widely recognised, then it is likely that a lot of people are already trying to solve it, in which case it will usually be harder for a few extra people who decide to work on the issue to have a very large impact. All else equal, you are likely to be able to do far more good in an area that is not getting the attention it deserves.

One way to think about this is in terms of a ‘world portfolio’: What would the ideal allocation of resources be for all social causes? And in which causes are we farthest from that ideal allocation?

This is why our list looks a bit surprising: we purposefully want to highlight global issues that we think are furthest from getting the attention they need — such as the risk of a catastrophic engineered pandemic, which currently receives $1–2 billion of funding per year, only around 1/500th of what goes to a more widely recognised problem like climate change (which also needs more work).

To learn more about why we prioritise more neglected issues, see our article on comparing global problems in terms of scale, neglectedness, and solvability, and our key ideas series.

Another reason our list might look different from others’ lists is we think it makes sense morally to value the interests of all sentient beings equally — regardless of where they live, when they live, or even what species they are — which is uncommon. One upshot is that if it seems like something could impact a huge number of future lives, we think that’s a very big deal.

Some find it objectionable to say one problem is more pressing than another — perhaps because they think it’s impossible to make such determinations, or because they think we should try to tackle everything at once.

We agree that it’s difficult to determine which issues will affect lives the most, as well as how tractable and neglected different problems are. The field of global priorities research exists because these questions are so complicated, and we are far from certain about our views (see below). But we think with careful thought and research people can make educated guesses — and do better than random.

We also agree that we can make progress on different issues at the same time, and advocating for more people to work to help others can increase the total amount of work done to solve all problems.

However, resources are still very much limited, so we can’t do everything at once. And we think that given the seriousness of the many challenges humanity faces, we have to prioritise among issues and use our resources effectively to solve them as much as we can, given our limitations.

Refusing to compare problems to one another doesn’t get you out of prioritising — it just means you’ll still be prioritising something with your time, only without thinking much about what.

No — though we’ve put a lot of work into thinking about how to prioritise global issues, ultimately we are drawing on a modest amount of research to address an unbelievably large and complex question, so we are very likely to be wrong in some ways (see below). You might be able to catch some of our mistakes.

Moreover, it’s very useful for people trying to make the world a better place with their careers to develop their own views about what to prioritise — you’ll be more motivated and more able to help solve an issue if you understand the case and have chosen the problem for yourself.

To help you form your own views, below we suggest a rough process for creating your own list of problems.

The most important and unusual driver of our lists is probably that we especially focus on the impact different issues can have on all future generations, an idea called longtermism. This increases the importance we place on reducing existential risks and on shaping other events that could affect the long-run future.

If we were to reject longtermism, issues that contribute to existential risk would stand out much less (including most of our top-recommended issues), while issues like ending factory farming, improving global health, speeding up economic growth, improving science, and migration reform would all be boosted.

That said, even if we rejected longtermism, we still think positively shaping AI and reducing the chance of a catastrophic pandemic would be top problems for more people to work on due to their large near-term and medium-term effects, as well as their neglectedness.

You can read about some counterarguments to longtermism on our page about it and in the second half of this article.

Of course there are other parts of our broad worldview that could be badly wrong — you can read about some of them in the articles in the foundations section of our key ideas series.

Another major worry we have about the lists is that there’s an important issue we haven’t even thought of that should be among our top-ranked issues. We sometimes call this the possibility of finding a ‘Cause X,’ and it’s one reason why we rate further research and capacity-building so highly.

Finally, we could easily be wrong about any of the particular issues we list — maybe some are much bigger or smaller than we think, or turn out to be more or less tractable. For example, perhaps the development of AI will be largely safe by default. You can see some of our key uncertainties about each individual issue by clicking through to the individual profiles, and we invite you to investigate these questions for yourself.

No — we don’t think everyone in our audience — let alone everyone in the world — should work on our top list of problems (even if everyone totally agreed with our views).

First, the pressingness of a problem is only one aspect — though a very important one — of our framework for comparing careers.

Different people will find different opportunities within each problem, and will have different degrees of personal fit for those opportunities. These other factors also really matter — you may well be able to have 100 times the impact in an opportunity that’s a better fit, and this can easily make it higher impact to work on an issue you think is less pressing in general.

Moreover, as our audience expands, we need to think more in terms of a ‘portfolio’ of effort by our readers, which creates additional reasons for members to spread out (we cover this in more detail in our article on coordination). Two of the most important such reasons are:

  • As more people work on an issue, it gets less neglected, and there are diminishing returns to additional work. This means that a group of people that’s large compared to the capacity of an issue to absorb people will start to run out of fruitful opportunities to make progress on that issue, making it better for new people to spread out into other areas.
  • If you work with others, there is value of information in exploring new world problems — if you explore an area and find out that it’s promising, other people can enter it as well.
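The diminishing returns point above can be pictured with a toy model. The logarithmic progress curve below is purely an illustrative assumption of ours (the article makes no claim about the exact shape of the curve); it just captures the idea that each additional worker on a problem contributes less than the last.

```python
import math

# Toy model: total progress on a problem grows logarithmically with the
# number of people working on it, so each additional worker adds less
# than the previous one. The log form is an illustrative assumption.

def total_progress(workers: int) -> float:
    return math.log(1 + workers)

def marginal_value(workers: int) -> float:
    """Extra progress from one more person joining `workers` others."""
    return total_progress(workers + 1) - total_progress(workers)

# The 11th person on a neglected problem adds far more than the
# 1,001st person on a crowded one.
print(marginal_value(10) > marginal_value(1000))  # True
```

Under any model with this concave shape, a group that is large relative to a problem's capacity to absorb people is better off spreading some members into other areas.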

Among people who follow our advice, we aim to help a majority shoot for one of the top world problems we list above, but we’d also like 10–20% to work on the second longer list, and perhaps another 10–20% to work on the others.

If we consider the world as a whole, not just our readers, it’s even more obvious they shouldn’t all work on our top-ranked issues. The world wouldn’t function if everyone tried to work on AI safety and preventing pandemics. Clearly, we need people working on a wide range of issues, as well as keeping society running and taking care of themselves and their families.

However, in practice it’s safe to assume that what most of the world will do will remain unaffected by what we say. (If that changes, we’ll change our advice accordingly!) So we focus on finding the biggest gaps in what the world is currently doing, to enable our readers to have as much impact as they can.

We are so glad you are interested! It can seem daunting, but we’ve seen lots of people make real contributions to these problems, including people who didn’t think they could when they first came across them.

We have a career planning process that will take you through this step by step (and a template to help create your career plan). There’s lots of detail in the full process, but here’s a quick summary:

The first step is to learn more about the issues you are considering, as well as what is important to you in your career.

Then you’ll want to brainstorm career paths that will let you contribute the most — we have some ideas for these on our career reviews page. Note that each issue requires a lot of different kinds of work, from advocacy to research to helping build organisations, so you’ll have many paths to consider.

Don’t rule something out too early because it doesn’t sound at first like it’d be a fit for your skills — this is a mistake we see a lot. For example, you can help with AI safety using a variety of non-technical skills (see some suggestions for work in governance as well as supporting roles here).

Next, you’ll gather more information about your career options (by talking to people, reading, or trying things out) and start narrowing down.

The next step is often building career capital — knowledge, experiences, skills, and aptitudes that you can use to have impact in a variety of jobs later (this also helps you learn about future career options you might take).

It’s much more important to maximise the impact you can have over the course of your career than it is to have a big impact next year — which often means starting by investing in yourself.

You can also apply for free one-on-one career advice from our advisors, who can help you compare options and connect you with mentors and other opportunities.

Check out our full step-by-step career planning process for more.

The only thing you can control is contributing as much as you can — and that’s a matter not just of what the world needs, but also of your own motivation and abilities.

We think people often can enjoy more kinds of things than they intuitively think — motivation can come from working with great colleagues on something you think is really important, even if the area isn’t immediately interesting or captivating to you. People’s interests also develop over time.

But if you try to get motivated and it doesn’t work, you can try working on something else. In the answer to the next question, you’ll find a few other lists of issues you can investigate besides ours.

If you really want to help with these issues but don’t feel motivated to help with them directly, you could try helping by donating to organisations that work on them. If you do this as a primary aim of your career, we call it ‘earning to give.’ You can also just donate 10% of your income or however much you’re comfortable with.

Read more about how to have a positive impact in any job.

You can find some guidance on this question in this article and this template. Here are the basics:

Your values and worldview
What do you think is important? What do you think the world is like, and how do you think we should come to beliefs about it? Your answers to these questions are part of your worldview. (For example, we discuss our worldview in our key ideas series.) Spend some time investigating and writing out answers to these questions — keeping in mind that you’ll never have a complete and fully confident answer.

Frameworks for comparing issues
Learn about different frameworks you can use to compare issues. For example, we often use the importance, neglectedness, and tractability framework, where how you assess the importance and tractability of problems is partially determined by your worldview. (See a more popular introduction to the framework.)
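To make the framework concrete, here is a minimal sketch of how you might score and rank problems on importance, tractability, and neglectedness. The additive log-scale scoring rule and every input number below are our own illustrative assumptions for this sketch, not estimates from this article.

```python
# A minimal sketch of the importance/tractability/neglectedness
# framework. The scoring rule and all numbers are illustrative
# assumptions, not real estimates.

def itn_score(importance: float, tractability: float,
              neglectedness: float) -> float:
    """Rough log-scale score: higher suggests a more pressing problem.

    importance:    log10 of the good done if the problem were fully solved
    tractability:  log10 of the fraction solved by doubling resources
    neglectedness: log10 of (1 / resources currently devoted), so more
                   neglected problems get a higher value
    """
    return importance + tractability + neglectedness

# Two hypothetical problems with made-up inputs
scores = {
    "hypothetical_problem_a": itn_score(12, -2, -9),   # big, neglected
    "hypothetical_problem_b": itn_score(10, -1, -11),  # smaller, crowded
}

ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # ['hypothetical_problem_a', 'hypothetical_problem_b']
```

Adding the three log-scale factors mirrors the common heuristic of multiplying them on a linear scale to get a crude estimate of marginal impact per extra unit of resources; your worldview enters through how you assess importance and tractability in the first place.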

Start generating ideas
Once you have frameworks and your worldview clarified to some extent, you can start generating ideas for pressing problems, perhaps using other people’s lists to get started (like ours or others’ listed just below).

Compare
Now that you have your list of issues, compare them according to your worldview and using the frameworks you learned about above.

Identify key uncertainties about your list, work out what research you might do to resolve those uncertainties, then go ahead and do it, and then reassess and repeat. If some of the issues on your list overlap with ours, you can use our problem profiles as a jumping-off point.

Again, it would take a lifetime to get totally confident and make your list complete, so aim for action-relevant information instead. You can (and should) continue to think about which issues you think are most pressing throughout your career.

Other lists of pressing global issues, for inspiration:

Different problems need different skills and expertise, so people’s ability to contribute to solving them can vary dramatically. That said, there are also many ways to contribute to solving a single problem, so you also shouldn’t assume you can’t help with something just because you don’t have some salient qualification.

To learn more about what’s most needed to address different world problems, click through to read the profiles above.

To explore your own skills and other aspects of your personal fit (especially early in your career) and find your comparative advantage, we encourage you to make a list of career ideas, rank them, identify key uncertainties about your ranking, and then try to do low-cost tests to resolve those uncertainties. After that, we often recommend planning to explore several paths, if you’re able to.

You can find more thorough guidance in our career planning process.

You might also be interested in this post on different ‘aptitudes’ you can develop and apply to a variety of issues, and how to assess your fit with each one.

If you already have experience in a particular area, see our article about how you might best be able to apply it.

Read next:  In which career can you make the biggest contribution?

There are many ways to contribute to tackling global problems. Even within careers that help, there are huge differences in how much. This article explores one way to increase the scale of your contribution — getting more leverage.
