In a nutshell: We think it’s probably very valuable for talented people focused on safety and social impact to work at leading AI companies — even if they aren’t in technical or policy roles. Non-technical roles offer opportunities to do things like:

  • Shift the culture around AI toward safety and positive social impact.
  • Recruit safety-minded researchers.
  • Help on some safety-relevant projects.

If you can find a non-technical role that’s an especially good fit for you, we think this might be your highest-impact option.

Sometimes recommended — personal fit dependent

This career will be some people's highest-impact option if their personal fit is especially good.

Review status

Based on a shallow investigation 

Why might non-technical roles in leading AI companies be high impact?

Although we think technical AI safety research and AI policy are particularly impactful, having very talented people focused on safety and social impact at top AI companies may also be very valuable, even when they aren’t in technical or policy roles.

For example, you might be able to:

  • Shift the culture around AI toward safety and positive social impact by talking publicly about what your organisation is doing to build safe and beneficial AI (like DeepMind has done).
  • Recruit safety-minded researchers.
  • Design internal processes to consider social impact issues more systematically in research.
  • Help different teams coordinate around safety-relevant projects.

We’re not sure which roles are best, but in general, roles in strategy, ethics, or communications seem promising. Alternatively, you could pursue a role that makes an AI lab’s safety team more effective, such as operations or project management.

If you can find a position at an organisation specifically focused on AI safety (like Redwood Research), then any role that helps it do its work better makes a contribution.

That said, it seems possible that some of these roles could carry a veneer of contributing to AI safety without doing much to head off bad outcomes. For this reason, it seems particularly important to keep thinking critically and creatively about which kinds of work in this area are actually useful. You can read more in our article about whether it’s good to work at a leading AI lab (whether in technical or non-technical roles).

Some roles in this space may also provide strong career capital for working in AI policy by putting you in a position to learn about the work these companies are doing, as well as the strategic landscape in AI.

Want one-on-one advice on pursuing this path?

If you think this path might be a great option for you, but you need help deciding or thinking about what to do next, our team might be able to help.

We can help you compare options, make connections, and possibly even find jobs or funding opportunities.

APPLY TO SPEAK WITH OUR TEAM

Learn more

Read next: Learn about other high-impact careers

Want to consider more paths? See our list of the highest-impact career paths according to our research.
