Review of progress July 2013 to April 2014

Introduction and summary

The purpose of this document is to review what 80,000 Hours has achieved from July 2013 to April 2014, since our last review of progress. We also review how we performed relative to our targets, and our mistakes over the period. This document is part of our annual review.

In summary:

  • We went through three stages during the period: website redesign, testing our content, and finally conducting our impact evaluation and strategic review. Other significant priorities included writing a book proposal on effective altruism, fundraising and staff recruitment.
  • Our main achievements were establishing proof of concept that our programs (research and online content, supported by coaching) can change career plans, and creating a clearer strategy.
  • While doing this, we continued to facilitate significant plan changes, which we think justify our costs.
  • Other achievements included: Will landed a major book deal to write about effective altruism, we continued to build the team and CEA, we increased our financial security by reaching our target of 12 months’ cash reserves, we implemented more professional branding, we had a meeting at the UK Prime Minister’s office on careers advice policy, we helped to foster the Global Priorities Project, we published over 40 research blog posts, the Cambridge student group had a strong first year, and we increased our organisational transparency.
  • We made progress on all of our key priorities and completed most of what we set out to achieve in our last review, but ended up a couple of months behind schedule for a variety of reasons.
  • Our main mistake over the period was not keeping the team sufficiently focused on fundamental strategic progress. We think we’ve already corrected this mistake.

You can find more detail on our key metrics in our review of program performance.

What have we worked on since July 2013?

We started the period with the following plan and priorities.

The period divided into three main phases.

1. Website redesign and rebrand, July – September 2013

In this period, Ozzie, Jen and I focused on:

  • Redesigning the website structure and key copy to make the website better explain – and be more focused on – what we offer.
  • Doing a complete visual rebranding, using two external designers, and implementing it on all our materials.
  • Rewriting the key website content on the research and coaching pages, so that it fits with our current plans.

There’s much more detail in our review of progress on the website.

In this period, we also moved into our new office, which belongs to the University of Oxford and is shared with the Future of Humanity Institute.

2. Testing our programs, October 2013 – January 2014

While Ozzie continued working on the website, I joined Roman, Thomas and Sameer to focus on running a set of tests of our programs and improving our research. Most importantly, this involved aiming to:

  1. Complete 20 case studies, and see what proportion of these people made a ‘significant plan change’.
  2. Complete 20 short one-on-ones, and see what proportion of these people made a ‘significant plan change’.
  3. Update the research pages, including writing the careers and causes lists.

You can see the team plan and full list of tests here. We reviewed the results here.

3. Evaluation, February – April 2014

In the last three months, we have mainly focused on preparing data for and writing up our impact evaluation and annual review documents.

Note that from January, only Roman, Ozzie and I were working on 80,000 Hours, in addition to our 50% share of the central CEA team.

Other activities

  1. Book proposal: Will prepared a book proposal to write about effective altruism and met with publishers.
  2. Fundraising: Rob spent about 50% of his time on fundraising (and we cover half of Rob’s costs). Much of this was spent on discussions with past donors, and meant that we raised about as much income as we were expecting. I also spent about 10% of my time on fundraising, focusing more on developing new donors. We unsuccessfully applied to Y Combinator and reached out to the Cambridge Angels. In February, we received a £30,000 donation from a new donor, Luke Ding.
  3. Staff recruitment: We ran a full application process from August to September 2013, and are currently halfway through another application process. I have also spoken in person to five promising people about working for us.
  4. On-going financial management and logistics.
  5. On-going outreach: We took some of the more promising outreach opportunities that presented themselves. We gave talks at the Oxford and Cambridge student groups, and gave advice to the volunteers running the groups. Will did a tour of the US East Coast. I visited New York and attended the Nexus Global Youth Summit on philanthropy and social enterprise. Rob, Roman and Jess attended the Effective Altruism Summit. We co-hosted the first public lecture of CSER. I spoke at the Tatasec conference on social enterprise. We had some minor media appearances.
  6. On-going research: We continued with research and blogging throughout the period, aided by volunteers.

Staff time breakdown

You can see a rough analysis of how staff and intern hours were allocated in the second half of 2013 in the review of program performance and costs.

Over the period, we had the equivalent of about three full-time staff and four interns.

Main achievements since July 2013

Over the last 18 months, our key priority has been searching for a business model that can be scaled up. As described in depth in our strategic review, we think we made major progress towards this goal over the last period, and are approaching a model we can commit to. In particular, our main two achievements were:

  1. We established initial proof of concept for our research, online content and coaching by showing that they can cause people to change their career plans.
  2. Based on that, and further reflection, we created a clearer strategy for the next two years.

Some smaller achievements which contributed to establishing proof of concept for our programs were:

  • We performed a detailed analysis of how we’ve changed career plans over our history, which identified 107 significant plan changes since 2011. We expect the majority of these to result in people taking different careers. This helped to resolve a key uncertainty in our business model: we knew people were engaging with our content, but weren’t sure to what extent they were changing their behaviour based on it.
  • We found over 30 people who had changed their career plans after reading our online content, but without receiving any coaching or knowing us personally, suggesting that our online content alone can change careers.
  • We completed 18 case studies and 18 short one-on-ones, and did a full evaluation of them. We found that about 40% of the people we coached made a significant plan change, exceeding our expectations and suggesting that our coaching alone can also change careers.
  • We’ve continued to see strong growth in user engagement, despite little investment in outreach. Most notably, the number of coaching requests almost doubled in the second half of the year compared to the first, but remained high quality.
  • The careers list had a very strong reception and is now our most ‘liked’ post of all time. This was a key piece of prototype content.
  • In our surveys and impact evaluation, we found we have a strong base of highly engaged users, who are very willing to help us in developing 80,000 Hours.

Within creating a clearer strategy:

  • In addition to the data collected during our impact evaluations, we received feedback on our strategy from Y Combinator, Adam O’Boyle, and Soushiant Zanganehpour, as well as many people within the effective altruism community. We collected extensive feedback from users through our impact survey and coaching evaluation.
  • Based on this evidence, we made a series of key strategic decisions, including whether to prioritise research or coaching, what to include within the research pages, how to monitor our impact, and what to do over the next couple of years. The results are in our strategic review.
  • By doing our impact evaluation, we clarified how we will track our impact. You can read a description of our plans in the strategic review. During this process, we addressed many of the criticisms of our old processes raised by our trustee Nick Beckstead in his May 2013 review of our performance. In particular, we now have much better documentation of the types of career changes we have caused over time and whether these are likely to be improvements (see our plan change analysis document). We’ve also increased the integration of our tracking methods by introducing a standard metric (the significant plan change), maintaining a single list of changes, and introducing a standard impact survey.

Some other important achievements in 2013 include:

  • Despite focusing on developing our business model, we think we’ve continued to have a significant immediate impact. In total, we identified 59 significant plan changes in 2013, of which we think a significant proportion happened in the second half of the year. We think it’s likely that the additional impact resulting from these plan changes is enough to justify our costs. Moreover, we don’t think it’s likely that significant plan changes account for the majority of our impact.
  • Will MacAskill signed a major book deal with Gotham Books (part of Penguin) and Guardian Faber to write about effective altruism. We expect this to be a major source of promotion leading up to its release in summer 2015; for instance, Guardian Faber has already pledged £60,000 of banner advertising.
  • We made good progress building the team, including hiring Roman Duda to focus on coaching, promoting Rob to Executive Director of CEA, recruiting Marek Duda to the central team, finding two highly promising freelance web developers, finding several promising candidates to hire as researchers, finding a promising Director of Development, and securing an excellent candidate to work at 80,000 Hours from July 2015, focusing on outreach. We’re on track to have our ideal execution phase team in place within 18 months: myself, our share of the central team, and someone on each of coaching, research, web development and outreach.
  • We increased our financial strength by securing our target of 12 months’ reserves. Fundraising has been progressing well. See more in our finance report.
  • The Cambridge student group had strong results. They organised 10 events with a total attendance of 680, gained 700 Facebook likes, caused two people to sign the Giving What We Can pledge, caused an estimated ten significant plan changes, and handed over to a strong committee. The Oxford student group also had good results, though it caused fewer plan changes, and has been handed over to two strong volunteers.

We also had a meeting at the UK Prime Minister’s office, became even more transparent by releasing public weekly updates, did a complete rebrand and redesign of the website, helped to foster the Global Priorities Project, continued to learn about how to manage the organisation (see below for more), developed a method to externally evaluate our research blog posts, made it much easier to find old research by categorising our blog posts and creating a tag tree at the top of the blog page, and published over 40 research blog posts.

How did we perform relative to our stated plans?

Previous plans:

Overall, we made progress on almost all our main priorities from the last review, and achieved most of what we set out to achieve. The main discrepancy is that progress took longer than intended. In particular, we set a goal to complete 30 case studies by the end of December, but we only completed 18 case studies and 18 short one-on-ones by the end of February. We also haven’t yet expanded our research pages or experimented with paid-for coaching as intended. Overall, we’re a couple of months behind where we planned to be. This was for many reasons, including:

  • We raised our standards for hiring interns and staff, which contributed to shrinking the team to only three staff and two interns (including our share of central CEA). This slowed down immediate progress, but we expect it will yield more rapid progress in the long term.
  • We had some bad luck with staffing. One intern had to leave early and one staff member did about one month less work than planned.
  • We spent longer on the careers and causes lists and the annual review than planned, but made them higher quality.
  • We didn’t leave enough slack in the plan for unanticipated opportunities, like Y Combinator and the visit to the UK Prime Minister’s office.
  • We didn’t initially have a sufficiently structured process for doing the case studies and didn’t leave slack in the plan for abandoned studies or significant delays from our coachees.

Our mistakes and ways things could have gone better

Team too large and not sufficiently focused on strategic progress

We think the biggest way things could have gone better would have been keeping the team smaller, more permanent, higher quality and more focused. This would probably have resulted in less immediate impact, in the form of changed careers, research and outreach, but it would have likely accelerated fundamental strategic progress, such as developing product plans or prototypes, testing the impact of our programs, recruiting staff and raising funding. Ultimately, it’s strategic progress that’s important for our chances of becoming many times more influential in the future.

In the past, we maintained a large team with fairly high turnover, especially by taking on a large number of interns. This larger team meant we had more immediate impact, but it took time from our core staff, who are best placed to make strategic progress. It also decreased focus and made us less strategically nimble. This was made worse by having relatively complicated plans (such as the team plan made in November 2013), which involved working on several different types of projects at once.

We analysed in depth the issue of how many interns to hire in our last six month review, concluding that we should aim to have fewer in the future. In hindsight, we should have been even more aggressive in reducing the number, in order to keep the team even more focused.

We think we’ve already corrected this mistake. First, we dropped down to just one intern (Ozzie) working on tech and one intern on central CEA. Now that Ozzie has left, we’ll aim to have only contractors on tech over the next year, who will work only when needed. Ozzie also significantly simplified the website, making it easier to maintain, and taught us more about how to edit it ourselves. There’s more detail in the website review. In addition, we’ve found an oDesk editor, a volunteer editor, a virtual assistant and a contract researcher, who can provide capacity as needed. We intend to have only one or two interns over the next year, and they will be people we are strongly considering hiring, or who can help with our strategic priorities.

Besides reducing the number of interns, we’ve raised the bar on hiring, and are now very focused on building a team of staff who are around to stay, and can take 80,000 Hours to scale. We’ve attempted to make our team plans more focused, by working on fewer activities at once and always having a clear top priority. As explained above, we’ve also increased our use of contract workers, which also increases focus, because we only use their abilities when needed.

Some other mistakes over the period included (not in order):

  1. Poor team structure around September 2013.
    • Problem: Roman was managing Ozzie on the website and Sameer on outreach, while I managed Thomas and Collin on research and coaching. This resulted in me managing Ozzie and Sameer through Roman, which was inefficient.
    • Solution: We solved this by swapping Roman and me in the management structure. Roman became head of coaching, where he could be independent, and I managed Ozzie and Sameer. We also increased team communication by setting up HipChat, and adapting communication methods to fit the preferences of the new team.
  2. Our handling of coaching requests.
    • Problem: In the survey, three people mentioned that we hadn’t responded to a coaching request and two mentioned that we should turn fewer people down. We’ve struggled to cope with the fivefold growth in coaching applications since 2012. This has meant missing several applications and in some months turning down 90% of applicants.
    • Solution: We’ve been dealing with this by improving our systems for processing applications, clarifying our offering on the website in an effort to discourage unsuitable applications, altering the rejection letters and developing the short one-on-ones. In the future, we intend to increase the percentage of people we talk to by delivering the short one-on-ones on an on-going basis. It’s likely we’ll bring back some networking features, to help the people we can’t talk to gain more value from the community.
  3. Our written content should have been more focused on high quality research.
    • Problem: Past written content has often been aimed more towards outreach than research. This has increased our immediate impact, but has meant that over the long-term our body of content is not as impressive as it could have been. We’ve also received feedback about the research being too confident, not concrete enough and not providing enough hard data. Some of the blog posts also contained too many typos.
    • Solution: Over the period, we shifted towards providing more research-style content on the blog. This was aided by developing a research quality rubric and external review system. Since then, we’ve received only positive feedback about the new direction of the content. We also found a volunteer editor and paid editor on oDesk, who edit most of our blog posts. Over the next year, we intend to make developing the research our key priority, and would like to hire a talented researcher.
  4. Our code base was a mess, and had a large ‘technical debt’.
    • Problem: In the past, we developed too many features, weren’t disciplined about removing lower-priority features, and had several programmers work on the site. The result was a maintenance burden, which required having a software developer on staff at all times, and a site that was difficult to change. Find more detail in this presentation.
    • Solution: As part of the website redesign, we removed many of the features and simplified the site. We’re now attempting to build a relationship with a part-time web developer who can maintain the website until we want to invest more in web development. We’re also considering migrating from Rails to WordPress.
  5. We didn’t leave enough slack in the plan for unanticipated opportunities, resulting in missed goals.
    • Solution: We increased our levels of time-tracking, in order to make better estimates of time allocation. Going forward, we’ll allocate about 25% of time to unanticipated opportunities in our plans.
  6. Case study workflow was not clear enough, and the expected time investment was too large.
    • Problem: We attempted to delegate case studies before the process was sufficiently clear, leading to delays.
    • Solution: I did several case studies personally, and prepared a detailed description of the process, including template documents for write ups. In the future, we intend to do mainly short coaching sessions, and upgrade a small portion of these to case studies. We’ll aim to only invest ten hours of research into a case study, rather than about 25.