This is the fifth post (in a series of six) on our six-month evaluation.
This report outlines our key priorities for the next six months.
Summary and Discussion
We continue to see the top priority as further investment in developing our business model, our content and the organisation's robustness.
The flagship goal in this area is carrying out 30 case studies, which will form the centrepiece of our content and evaluation efforts.
Also important are: (i) more impact evaluation in general; (ii) fundraising enough to keep up with our expanding budget; (iii) increasing our talent pool through training and outreach; (iv) increasing the appeal of our content to successful young professionals through rebranding; and (v) producing high-quality research on our key ideas to support the case studies on our blog.
Outreach is a lower priority, except insofar as we do enough to ensure a strong stream of candidates for case studies and internships, and to build up credibility (e.g. receiving press coverage and gaining impressive affiliations).
We’re also not yet focused on scaling up delivery, because we think it will be faster overall to spend more time developing our content at this stage.
- Carry out and evaluate the impact of 30 high-quality case studies with people who have high potential to make a difference.
- Improve our data collection and storage, in particular by setting up a central CRM database.
- Aim to be in a position to carry out and analyse an impact survey of members before the end of the first quarter of 2014, in order to improve our evaluation of the extent to which our online content causes career or donation changes.
- Perform an evaluation of our research to date.
- Update our understanding of customer problems and our target market based on recent user data.
- Use all of this information to improve our business strategy.
- Fundraise an additional £150,000 by the end of the first quarter of 2014, so that we have 12 months of reserves by the end of the period.
- Experiment with at least one alternative fundraising model.
Professionalising our brand
Adapt our marketing to appeal more to successful young professionals than to students, increase our credibility, and update it based on our improved understanding of our target market. This will include the following sub-priorities:
(i) Rebrand our visual identity, including updating the website design.
(ii) Update our key online copy.
(iii) Alter how we pitch 80,000 Hours.
(iv) Seek to build up more ways to prove credibility, e.g. (a) starting to apply for Matrix accreditation (or similar) and (b) posting our self-evaluations.
Improve our marketing towards prospective staff by articulating why people should work at 80,000 Hours and improving how we communicate what we do.
Content (over and above case studies)
- Complete high-quality content explaining our core frameworks and best guesses for the Our Findings page.
Other organisation building
Hire another full-time member of the careers research team, increasing the team to four full-time staff, and hire at least five long-term interns, to maintain our target of at least six interns (counting 80,000 Hours front-facing interns and 50% of CEA ops interns). We want to maintain the ability of these staff at least at current levels, which will require better promotion of ourselves as a job opportunity.
Increase the independence of central CEA by creating the Executive Director (ED) of CEA role.
Increase transparency about our plans and their justification by posting our evaluations and the story of our research process on the blog.
Reach out to high-potential networks to ensure we have a strong enough supply of case study candidates and interns.
Run a promotional push to ‘launch’ our new content in October, with the aim of finding strong case study and staff candidates.
Receive press coverage for the organisation and our best content.
Coach our student group leaders to run successful outreach programmes in our university hubs.
Continue low-time-cost outreach activities, e.g. posting to social media, SEO and AdWords.
How will we carry out the case studies?
What follows is our tentative initial guess at how to run the case studies.
We start with an initial exploratory session of 1-2 hours, in which we gather information and decide the best next step for resolving uncertainty. This might be defining a research question that we then spend some time on, but it might also be for them to gather more information about a specific area by speaking to people, or something else.
We then meet for a second time to present and discuss our findings, and decide on the next best step.
We iterate this process a number of times, depending on the length of each iteration and what’s helpful, but spend no more than ~4 days of research in total.
We have a final meeting to discuss findings and conclusions etc.
We write up the most important and generalisable parts for the blog.
How do we select the people?
Our goal is to pick the people who will get the most out of our service.
We think this means looking for the following traits:
Overall excellence: Is there evidence that we can expect them to be high achieving in the future?
Cause neutrality: Are they happy to consider a wide range of ways to make a difference and decide based on evidence?
Altruism: How motivated are they to do whatever has most impact?
Good questions: How well do their questions match our research question criteria 3 and 4 below?
Right now, this typically means we talk to motivated graduates of elite universities, with a variety of high-potential options, who are familiar with the ideas of effective altruism and care strongly about positive impact.
We select research questions based on:
1. Decision relevance to the individual: the extent to which the question could swing the balance of their career decision
2. Tractability: the extent to which we can say something decisive about the question in our research time
3. High impact options: relevance to the causes and careers that we think are high priority
4. Generalisability: the extent to which the question is relevant to other members
What data will we collect for evaluation?
Before or after the first meeting, we’ll collect their current career plan, broken down into their options for (i) cause, (ii) mission and (iii) next steps, with subjective probabilities attached to each.
We take notes at all the meetings and on our research recommendations. We can use these to track what new information we presented them with.
We collect an exit career plan at the end of the last meeting. We can use this to track how the process changed their plans.
We follow up to see how they acted on this plan.
In addition, for the purposes of research evaluation: (i) we track the number of hours we spend doing research, and the sources and methods used; (ii) we normally speak to coachees in pairs so that the second person can give feedback to the lead; and (iii) we plan to record most coaching sessions so that we can get further feedback.
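As a rough illustration of the data described above (the field names and comparison measure here are our own sketch, not part of our actual system), an entry and exit career plan could be stored and compared like this:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: one option within a career plan, together with the
# coachee's subjective probability of pursuing it (0.0-1.0).
@dataclass
class Option:
    description: str
    probability: float

# A career plan broken down into cause, mission and next-step options,
# as collected before the first meeting and at the end of the last one.
@dataclass
class CareerPlan:
    causes: list = field(default_factory=list)
    missions: list = field(default_factory=list)
    next_steps: list = field(default_factory=list)

def plan_shift(entry: CareerPlan, exit_plan: CareerPlan) -> dict:
    """Change in subjective probability for each next-step option,
    a crude measure of how much the process changed the plan."""
    before = {o.description: o.probability for o in entry.next_steps}
    after = {o.description: o.probability for o in exit_plan.next_steps}
    options = set(before) | set(after)
    return {name: after.get(name, 0.0) - before.get(name, 0.0) for name in options}

entry = CareerPlan(next_steps=[Option("PhD", 0.6), Option("Consulting", 0.4)])
exit_ = CareerPlan(next_steps=[Option("PhD", 0.3), Option("Consulting", 0.5),
                               Option("Founding a charity", 0.2)])
print(plan_shift(entry, exit_))
```

Comparing the two snapshots this way would let us quantify how far the coachee's stated plan moved over the case study, which is the change we are trying to evaluate.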
How did we take account of the recommendations of the trustee performance evaluation?
Nick recommended the following measures:
1. Obtain a stronger understanding of how we change career plans.
2. Provide stronger evidence that we’re changing career plans and that these changes are improvements.
3. Integrate our data collection into a single database.
4. Increase the standardisation of our data collection.
We are acting on all of these recommendations in this plan. (1) and (2) are addressed primarily through the case study model, though we also plan to survey our other members within the next 12 months. In addition, we’ll address (2) by adding a research evaluation to our next six month review. (3) is addressed via setting up a CRM. This will also help (4), though it will be difficult to improve this dramatically in the short term as we continue to rapidly adjust our monitoring and evaluation processes.
What are our long-term plans?
For more detail on our business model and long-term vision, see our Business Strategy document.