NOTE: This piece is now out of date. More current information on our plans and impact can be found on our Evaluations page.


Introduction and summary

In this document, part of our annual review, we survey the performance of our programs, and their costs, from our founding to the end of 2013. Our programs consist of online research, coaching, community building and events. We examine how successful they have been in gaining an audience, engaging and informing that audience, and ultimately in changing people’s career plans for the better.

The key question we want to answer is: do we have proof of concept that our programs can repeatedly change the careers of our target audience for the better?

Our key findings are:

  • We’ve been successful in reaching people and engaging them with little investment in outreach.
  • Out of several thousand engaged users, 107 have made significant career plan changes, and a large proportion of these have followed through on those changes.
  • Overall, we think we have proof of concept that our programs can repeatedly change careers for the better.
  • In total, 80,000 Hours has received about £147,000 of donations and has taken 13 years of labour from the team.

Our main uncertainties are:

  • How valuable is a plan change, taking into account what the person would have done otherwise? We have analysed this question further in our analysis of plan changes, and plan to investigate it by performing a research evaluation and continuing to track our users over time. Some important sub-questions include:
    • How high value are ‘conventional careers’ compared to ‘effective altruist’ style careers?
    • What’s the chance of one or two people in the group having extreme impact, such as founding a highly influential organisation, donating tens of millions of pounds to charity, or being elected to office?
  • Are these plan changes due to 80,000 Hours or another group in effective altruism? We also investigate this issue in our analysis of plan changes. In the future, we may be able to carry out a randomised controlled trial of some of our programs to learn more.
  • How useful are the benefits of our programs compared to the information and support our users could have found elsewhere?
  • Will our users stay engaged for a couple of years, or many decades into the future?
  • How confident can we be that future work will lead to more plan changes?
  • How much personal value do our users gain from our programs? Would they pay for coaching?
  • How is impact allocated between our different programs? Would the plan changes attributed to coaching have been caused anyway due to our research and community?

What are our programs?

Currently, we’re investing mainly in online content and coaching, so we focus on these areas in this evaluation.

How have our programs changed over time?

From 2011 to the first half of 2012, we focused on events and community building, and all career advice was given informally. In early 2012, we started formal career coaching and, at the same time, began investing in the blog. The plans for the research pages have gone through several iterations (more). In mid-2013, we started doing case studies in addition to short coaching sessions. See more detail in our upcoming strategic review.

How do our programs fit into our overall strategy for impact?

Our ultimate aim is to have the largest possible social impact.

We do this by addressing the following problem:

Over a third of students highly prioritise making a difference with their careers, but the information available on how to have social impact with a career is poor and incomplete. As a result, every year over half a million students end up on track to have far less impact with the 80,000 hours they have in their careers than they could.

Our aim is to become the best source of advice in the world for talented, young graduates who want to make a difference with their careers, enabling them to have far more positive social impact.

Through this, we want to create a global, evidence-based, systematic conversation about how best to use your career to solve the world’s most pressing problems. In doing so, we’ll enable as many people as possible to take high impact careers.

For more, see our business model document.

What do our programs need to achieve?

To help our target market have more impact with their careers, we need to:

  • Reach people in our target market.
  • Engage them in our programs.
  • Give them new information.
  • Cause them to change their career plans towards more high impact paths.
  • Cause them to follow through with these plan changes and change behaviour.
  • Keep them engaged, so they continue to adjust their plans as new information arises and so we can assess our impact.

In the next section, we review the overall success of our programs at each of these stages.

Overview of the success of our programs

Have we reached people?

Our main outreach methods have been:

  • Word of mouth.
  • Referrals from other groups within effective altruism (especially Less Wrong and Giving What We Can).
  • Running local student groups in Oxford and Cambridge, which hold career events.
  • Spreading our content through social media and online presence.
  • Traditional media.

This hasn’t required much investment over and above writing blog posts, which we do as part of our research anyway. Over the past year, we spent less than 10% of our time on outreach, and this time was mainly spent by interns rather than staff. This is because largely free sources of promotion continue to bring people to our site in sufficient numbers to test our programs and have enough impact to justify our costs.

Unique visits to the site

Our key metric to measure reach is the number of unique visitors to the website. In total, we’ve reached over 100,000 people, and traffic doubled over the last year.

Unique website visitors (monthly)


| Period | Unique visitors | Annual growth rate |
| --- | --- | --- |
| 1H 2011 | – | – |
| 2H 2011 | 4,266 | – |
| 1H 2012 | 12,157 | – |
| 2H 2012 | 35,256 | 1112% |
| 1H 2013 | 52,056 | 532% |
| 2H 2013 | 41,229 | 197% |
| Total (all time) | 141,889 | – |

Note: ‘1H’ denotes ‘the first half’
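The table doesn’t state how the annual growth rate is computed. The printed figures are closely consistent with dividing each trailing twelve months of traffic by the preceding twelve months, so here is a minimal sketch under that assumption (treating the blank 1H 2011 entry as zero):

```python
# Hypothetical reconstruction of the 'annual growth rate' column: the
# trailing twelve months of traffic divided by the preceding twelve months.
# Treating the blank 1H 2011 entry as zero is our assumption.

visitors = {
    "1H 2011": 0, "2H 2011": 4266, "1H 2012": 12157,
    "2H 2012": 35256, "1H 2013": 52056, "2H 2013": 41229,
}

periods = list(visitors)
for i in range(3, len(periods)):
    trailing = visitors[periods[i]] + visitors[periods[i - 1]]
    preceding = visitors[periods[i - 2]] + visitors[periods[i - 3]]
    print(periods[i], f"{100 * trailing / preceding:.0f}%")

# Prints 1111%, 532% and 197%, matching the published column to within rounding.
```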

Note that traffic was boosted by media coverage in 1H 2013, so we did not expect growth between 1H and 2H 2013. In December 2013, we experienced a Google search error, which caused us to lose several thousand visits. Since we are not expecting to invest in outreach over 2014, we may not see strong growth in traffic over the next year.

Media coverage

In addition, we’ve reached at least several million people through our media coverage.

Have the people we’ve reached become engaged users of our programs?

Facebook lifetime total likes


| Period | Total ‘likes’ at end of period | Annual growth rate |
| --- | --- | --- |
| 1H 2011 | – | – |
| 2H 2011 | 190 | – |
| 1H 2012 | 352 | – |
| 2H 2012 | 649 | 242% |
| 1H 2013 | 1,444 | 310% |
| 2H 2013 | 2,004 | 209% |

Note that in 2013 we set up pages for the Oxford and Cambridge student groups, which now have over 1,000 likes. These are not included in the totals.

Mailing list subscribers

| Period | Total mailing list subscribers at end of period | Annual growth rate |
| --- | --- | --- |
| 1H 2011 | 0 | – |
| 2H 2011 | 628 | – |
| 1H 2012 | 672 | – |
| 2H 2012 | 1,911 | 204% |
| 1H 2013 | 2,405 | 258% |
| 2H 2013 | 3,836 | 101% |

We’ve also seen similar growth in the number of Twitter followers, and currently have about 900.

In total, we have several thousand subscribers to our content, constituting about 2% of the website’s unique visitors.

Do people read our online research?

We track this by monitoring page views of blog articles, since most of our research is stored as blog posts.

| Period | Page views on the blog | Annual growth rate | Avg. time spent on blog per visit |
| --- | --- | --- | --- |
| 1H 2011 | – | – | – |
| 2H 2011 | 1,320 | – | – |
| 1H 2012 | 18,573 | – | 0:02:49 |
| 2H 2012 | 50,762 | 5257% | 0:02:51 |
| 1H 2013 | 66,271 | 588% | 0:02:56 |
| 2H 2013 | 54,498 | 174% | 0:02:48 |
| Total (all time) | 196,424 | – | 0:02:51 |

Our blog articles receive substantial traffic, which almost doubled over the last year. During this time, average time on the blog has stayed roughly constant, suggesting we haven’t sacrificed quality of visits for quantity.

Do people sign up to our coaching service?

| Period | Total # of coaching requests | Annual growth rate | % of unique visits to website |
| --- | --- | --- | --- |
| 2H 2011 | Informal only | – | 0.00% |
| 1H 2012 | 12 | – | 0.10% |
| 2H 2012 | 51 | – | 0.14% |
| 1H 2013 | 150 | – | 0.29% |
| 2H 2013 | 268 | 563% | 0.65% |
| Total (all time) | 418 | – | 0.29% |

Notes: In 2011, we informally spoke to people about their careers; we didn’t start formal coaching until 2012. The 2012 figures are estimates based on the number of people we spoke to and recorded in our careers coaching diary, assuming we spoke to about two-thirds of requests, since in 2012 we spoke to most people who asked. The 1H 2013 figure is taken from our last metrics report. The 2H 2013 figure is based on the number of online form submissions.

There was very strong growth in the number of coaching requests over the last year, as our service became more established and well known.

We wrote an in-depth analysis of the coaching applicants, showing that on average they are high-achieving and altruistic, though around half probably wouldn’t benefit much from the service.

How many people did we coach?

| Period | Total # of people coached | % of requests fulfilled |
| --- | --- | --- |
| 2H 2011 | Informal only | – |
| 1H 2012 | 8 | ~66% |
| 2H 2012 | 34 | ~66% |
| 1H 2013 | 42 | 28% |
| 2H 2013 | 41 | 15% |
| Total (all time) | 125 | 26% |

The number of requests in 2012 is an estimate.

The rate of one-on-one coaching has been roughly constant since 2H 2012, when we started full-time. This is because the main aim of the coaching is to test and develop our content, and we think current volumes are roughly sufficient for that. Since the number of requests has grown, we’ve been successful in our aim of making the coaching applications more competitive. In the future, we plan to speak to 20-40% of people who make requests for coaching.

How many people were willing to fill out our impact survey?

Our 2014 impact survey received 205 responses over a month. This shows there are at least that many relatively highly engaged users, who are prepared to help us out by filling out a detailed survey on their career plans. Over 30 respondents left testimonials.

Do users gain important new information from our programs?

After engaging users, we need to help them learn new useful information about their careers. We may also contribute to changing attitudes like making them more motivated or altruistic, but this is not our current focus.

We think our programs contain important, new information for our target market. Our main evidence for this is the plan changes that have resulted from engaging with 80,000 Hours, which we cover in the next section. In the next six months, we also intend to investigate the quality of our research directly, by carrying out an external evaluation based on our blog research quality rubric.

We’ve also investigated the extent to which we were able to help the people we coached over 2H 2013 in our coaching evaluation. 52% of coachees gave the coaching a score of 6 or 7 out of 7 for usefulness. Everyone said they were more likely than not to recommend us to a friend.

We may also be able to learn more about the extent to which users find the content useful in the future by charging for some of it.

Do users change their career plans?

The key metric we track in this area is the number of significant plan changes we have caused.

A significant plan change is defined as:

An individual has made a significant plan change if they say they have changed their credence in pursuing a certain mission, cause or next step by 20% or more; they attribute this change to 80,000 Hours, and there’s a plausible story about how engaging with 80,000 Hours caused this change.

For instance, if someone says they anticipate going to medical school with probability 55% and law school with probability 45%, then they read an article on the 80,000 Hours blog and switch to 75% med school and 25% law school, that would count as a significant plan change.

We count the plan change as plausibly due to us if (i) the person tells us that they changed their plans and they think it was due to us, and (ii) we can point to the new information we gave them that changed their mind.
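To make the rule concrete, here’s a minimal sketch of the qualification test in code. The function and its inputs are hypothetical illustrations of the definition above, not part of our actual tracking process:

```python
# Hypothetical sketch of the 'significant plan change' rule. Credences are in
# percentage points; a change qualifies if (a) credence in some option shifted
# by at least 20 points, (b) the person attributes the shift to 80,000 Hours,
# and (c) there is a plausible story of how we caused it.

def is_significant_plan_change(credence_before_pct: int,
                               credence_after_pct: int,
                               attributed_to_80k: bool,
                               plausible_story: bool) -> bool:
    shifted_enough = abs(credence_after_pct - credence_before_pct) >= 20
    return shifted_enough and attributed_to_80k and plausible_story

# The medical school example from above: 55% -> 75% is a 20-point shift,
# so it qualifies, given attribution and a plausible causal story.
assert is_significant_plan_change(55, 75, True, True)
assert not is_significant_plan_change(55, 65, True, True)  # only 10 points
```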

In total we’ve collected 107 significant plan changes, caused by all of our programs in combination since 2011. That’s 1-10% of engaged users, or 0.1% of unique visitors. We explore these figures in more depth in the upcoming plan change analysis.

In the plan change analysis, we roughly estimated when these changes occurred:

| Period | # of significant plan changes | Annual growth |
| --- | --- | --- |
| Total 2011 | 14 | – |
| Total 2012 | 34 | 142% |
| Total 2013 | 59 | 76% |
| Total (all time) | 107 | – |

We think this is good evidence that we cause a significant number of engaged users to change their career plans. Indeed, there are likely to be significantly more people who have changed their plans but not told us about it. It’s also likely that there are engaged users who will change their plans in the future, but haven’t yet had the opportunity.

It’s difficult to disaggregate the impact of our coaching, research, events and community, because more than one pathway is normally important for each person, and it’s difficult to know what would have happened otherwise. Overall, the methods seem to be about equally important, with the exception of events (though events have had less investment, and are mainly an outreach method). There have also been over 20 cases where research alone or coaching alone triggered plan changes.

We go into more detail in the upcoming plan change analysis and in the metrics matrix over 2H 2013.

Do users follow through and change their career behaviour?

We think 40-90% of the people who made significant plan changes will follow through with their changed career plans, at least over the next couple of years:

  • In the survey, about 44% of people said they had already taken steps to follow through with their plan change. As far as we can tell, most of those who haven’t yet followed through haven’t done so because not enough time has passed.
  • This is borne out by looking at in-depth studies of plan changes, which show that if enough time has passed, almost everyone recorded as having made a significant plan change has followed through with it.

On the other hand, we don’t yet have much indication how long the changes will last after the next couple of years. Will the people who made significant plan changes have a lifelong commitment to maximising their social impact, or will most people drop out into ‘conventional’ jobs after a couple of years? We expect many people to carry on, because (i) many report increasing how much they prioritise making a difference in an evidence-based way, which can be expected to influence future decisions, (ii) many are now involved in our community and the effective altruism community more broadly, which should encourage them to remain focused on social impact. On the other hand, we don’t yet have much data, so it’s difficult to be confident. To learn more about this question, we intend to continue following up with previous career-changers.

Do users stay engaged?

We’re not currently investing in fostering continued engagement, except through our general efforts to promote engagement listed above, and the introductions we make through our coaching process.

Nevertheless, we think an active community is being built, whose members will stay engaged over the long term.

  • 205 people responded to our impact survey.
  • We now have 125 coaching alumni, and a significant proportion of these people are happy to take introductions to other coaching alumni.
  • The wider effective altruism community continues to grow rapidly, and is expected to keep growing as promotion of Will MacAskill’s book on effective altruism starts over the next year. This provides a ready-made community for our members.

Metrics by program

We made a rough attribution of each stage above to our different programs over 2H 2013, and put the results in this spreadsheet.

Note that it’s very difficult to untangle the interactions of our programs. These figures are mostly rough estimates of the immediate cause of the various metrics. We have not tried to estimate what would have happened otherwise if we hadn’t run a certain program (e.g. if we hadn’t done coaching, some of the 17 plan changes may well have happened anyway through other channels).

Costs

Inputs

| Period | Financial costs (£) | Volunteer weeks | Staff weeks | Intern weeks | Total weeks of work |
| --- | --- | --- | --- | --- | --- |
| 2H 2011 | 0 | 78 | 0 | 0 | 78 |
| 1H 2012 | 0 | 22 | 0 | 0 | 22 |
| 2H 2012 | 23,171 | 10 | 39 | 88 | 137 |
| 1H 2013 | 50,426 | 4 | 45 | 117 | 166 |
| 2H 2013 | 73,686 | 14 | 68 | 103 | 185 |
| Total (all time) | 147,283 | 128 | 152 | 308 | 588 |

The financial figures are taken from our upcoming finance report. The staff time figures are taken from our trustee evaluation in May 2013, with figures added for 2H 2013, assuming 40 hours per week minus 10% holiday, and 4.3 weeks per month. Staff time includes holiday, but intern and volunteer time doesn’t. The number of volunteer weeks in 2H 2013 is my own guess.

Opportunity cost of staff time

Note that our larger cost is probably the opportunity cost of the labour invested in 80,000 Hours, rather than the financial costs (especially because everyone involved has worked at far below market rates). If we had not worked on 80,000 Hours, we could have done something else to have a social impact.

We can roughly convert the costs of the labour into financial costs by assuming that if we hadn’t worked on 80,000 Hours, we would have pursued earning to give instead.

We assume staff could have donated £20 per hour (assuming they could have taken high-earning jobs and donated a large fraction), and interns £7.50 per hour (which may be an overestimate, since many interns were between jobs or education courses). We cost volunteers at £2 per hour before 2H 2012 (when they were mostly students), and £10 per hour afterwards (when many were highly qualified advisors). Using these conversion rates gives the following figures.

| Period | Opportunity cost of staff time (£ of donations) | Opportunity cost of intern time (£ of donations) | Opportunity cost of volunteer time (£ of donations) | Total opportunity cost of labour (£ of donations) |
| --- | --- | --- | --- | --- |
| 2H 2011 | 0 | 0 | 6,240 | 6,240 |
| 1H 2012 | 0 | 0 | 1,754 | 1,754 |
| 2H 2012 | 30,960 | 26,445 | 4,120 | 61,525 |
| 1H 2013 | 36,120 | 35,153 | 1,500 | 72,773 |
| 2H 2013 | 54,720 | 30,900 | 5,400 | 91,020 |
| Total (all time) | 121,800 | 92,498 | 19,022 | 233,320 |
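To make the conversion concrete, here’s a minimal sketch of the arithmetic in Python. The hourly rates and the period-dependent volunteer rate are taken from the text, and the 40-hour week from the inputs section; the published figures appear to be computed from unrounded weekly totals, so this reproduces the table only approximately:

```python
# Rough sketch of the labour-to-donations conversion described above.
# Rates are pounds donated per hour, from the text; the 40-hour week is
# an assumption based on the inputs section.

HOURS_PER_WEEK = 40
RATES = {"staff": 20.0, "intern": 7.50}

def volunteer_rate(period: str) -> float:
    """£2/hour before 2H 2012 (mostly students), £10/hour afterwards."""
    return 2.0 if period in ("2H 2011", "1H 2012") else 10.0

def opportunity_cost(weeks: float, rate: float) -> float:
    return weeks * HOURS_PER_WEEK * rate

# 2H 2011: all 78 weeks of work came from volunteers.
print(opportunity_cost(78, volunteer_rate("2H 2011")))  # 6240.0, as in the table
# 2H 2013: 68 staff weeks.
print(opportunity_cost(68, RATES["staff"]))             # 54400.0 vs the published 54,720
```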

Breakdown of staff time

Over 2H 2013

Over this period, we had the equivalent of three staff:

  • Our share of Rob and Tom on operations and fundraising.
  • Roman, who focused on coaching, but also helped with staff recruitment.
  • Myself, who did a mixture of strategy, management, fundraising, writing web content and coaching.

We also had an average of four interns, roughly breaking down as:

  • One focused on fundraising and operations.
  • One focused on website tech.
  • One focused on outreach and the rebrand.
  • One focused on coaching and research.

Note that we had intended to have a whole intern focused on research, but one of our research interns left early.

I made rough estimates of what proportion of time was spent on different activities by the staff and interns over 2H 2013 using the contents of weekly employee reviews. Some of the activities are difficult to disentangle and the estimates may be biased, so these figures should only be taken as a very rough indication of how time was spent.

| Activity | Overall work weeks | Staff work weeks | Intern work weeks |
| --- | --- | --- | --- |
| Total | 171 | 68 | 103 |
| Team and management meetings, email | 34 (20%) | 13 (20%) | 21 (20%) |
| Coaching and case studies | 23 (13%) | 12 (18%) | 11 (11%) |
| Website tech/Rebranding/Static pages | 23 (13%) | 2 (3%) | 21 (20%) |
| Networking/fundraising | 20 (12%) | 9 (13%) | 11 (11%) |
| Other | 15 (9%) | 5 (7%) | 10 (10%) |
| Research blog posts (excluding case studies) | 12 (7%) | 3 (5%) | 9 (9%) |
| Finance admin | 11 (6%) | 6 (9%) | 4 (4%) |
| Outreach | 10 (6%) | 0 (1%) | 9 (9%) |
| Other operations | 8 (4%) | 3 (5%) | 4 (4%) |
| Management | 6 (4%) | 5 (7%) | 1 (1%) |
| Recruitment | 6 (4%) | 5 (8%) | 1 (1%) |
| Strategy | 4 (2%) | 4 (6%) | 0 (0%) |

Much of the time spent in team meetings and on email is devoted to furthering the development of our programs through discussion and feedback, so it shouldn’t be regarded as lost time, although some portion of it is required simply to stay in sync. It’s also when we deliver much of our staff training.

Long-run averages since July 2012

We think there are some important differences between these figures and the averages since July 2012.

  • The time spent on research blog posts is unusually low, because we focused on coaching over this period and we ended up short one intern. Normally it would be more like 10-20% of time.
  • We didn’t do any evaluation over this period, so that category is unusually low. Over a full year, we’d expect it to be 5-10% of time.
  • Coaching is higher than normal, since the case studies were a key focus in this period.
  • Outreach is higher than normal, since we had an outreach intern during this period. We’d expect the long-run average to be under 5%.
  • Time spent on the website is high, since we rebranded in this period.

Broadly, we guess the longer term historical averages are something like:

  • 45% developing and delivering our programs (fairly evenly split between coaching, blog, and website).
    • Arguably, most of the value from this time is in content development, rather than the direct impact on the career choices of users.
  • 10% on strategy and evaluation.
  • 15% managing, training and recruiting staff, and generally staying in sync.
    • This also includes some set-up costs and organisational learning (e.g. developing management practices, developing our financial procedures).
  • 10% finance admin and operations.
  • 10% networking and fundraising.
  • 5% outreach.
  • 5% other.

In the future, we can significantly increase the proportion of time going into program development by hiring additional staff.