The results of our annual impact survey are in, so we can give an update on the number of significant plan changes we’ve caused as of the end of April 2015.
“Significant plan change” is the key metric we use to track our impact. See a definition.
Results of the annual impact survey
The survey was open from November 2014 to early March this year. We promoted it throughout February via our newsletter, blog and social media, as well as in person. To encourage responses, we offered a $200 prize to one randomly selected respondent.
This resulted in 218 responses, of which 85 people answered “yes” to the following question:
“Has your engagement with 80,000 Hours caused you to significantly change your career plans?”
How many of the people who said “yes” count as having made significant plan changes according to our criteria?
First, we removed 13 people who were already captured in last year’s count (though it’s worth noting that five of these people seem to have made a second significant shift in plans due to us).
Second, we removed a further 18 people who didn’t seem to meet our criteria. The main reason was that they didn’t list a specific shift in plans, and instead listed an intention to shift in the future or only gave a very vague response. It’s likely that some of these people would count as significant plan changes if we asked for more detail, but we’ve left them out at this stage to be conservative.
This leaves a total of 54 new significant plan changes.
| Measure | Number |
|---|---|
| Total # of responses | 218 |
| # who said they had significantly changed their plans due to us | 85 |
| # of people who said “yes”, but whom we’d already recorded last year | 13 |
| # of people who said “yes”, but who didn’t meet our criteria for having made a *significant plan change* | 18 |
| **Total # of new significant plan changes** | **54** |
Note that many of the 150 or so people who responded to the survey but whom we didn’t count as having made significant plan changes mentioned getting significant value from 80,000 Hours. Many reported using our framework to make decisions, considering new options they would not otherwise have considered, or being affirmed in their existing plans.
Sources other than the annual survey
- We also had an on-going survey on our website between April and November 2014, which was linked to on our blog and in our newsletter. It received 25 responses, of which we counted eight as new significant plan changes.
- The on-going survey also ran from March to April 2015, receiving 28 responses, of which we counted six as new significant plan changes.
- We also found a further nine people through coaching feedback forms.
- We found one person through our own knowledge.
- None of these were duplicated on the annual survey.
As of our previous (and first) plan change impact evaluation, we’d recorded a total of 107 significant plan changes. Adding all the new sources brings the total to 188, growth of 76%.
| Source | Total new significant plan changes |
|---|---|
| Total in April 2014 impact evaluation | 107 |
| 2015 annual impact survey | 54 |
| 2014 on-going survey | 8 |
| 2015 on-going survey | 6 |
| Coaching feedback forms | 9 |
| Own knowledge | 1 |
| **New total as of April 2015** | **188** |
What did the changes consist of?
At first glance, the changes seem similar to the last evaluation.
We’ve also added four new studies to our plan change page and created a list of new organisations founded by plan changers.
How did the changes come about?
In the annual impact survey, we asked:
“What was most significant in triggering these plan changes?”
We gave the following options: one-on-one coaching; speaking to someone in our community; reading our research.
Here are the results:
| Method | Number in survey | Percentage |
|---|---|---|
| One-on-one coaching | 3 | 6% |
| Speaking to someone in our community | 12 | 22% |
| Reading our online research | 39 | 72% |
Adding in the other sources (the on-going survey, coaching feedback forms, own knowledge), the complete totals are:
| Method | Total new significant plan changes | Percentage |
|---|---|---|
| Speaking to someone in our community | 15 | 19% |
| Reading our online research | 52 | 64% |
Note that the coaching figure is low compared to last year because we invested about one-third as much time in coaching in 2014 as we did in 2013. We also invested several times as much time in online content as in coaching, so one can’t easily infer that the research is more cost-effective than coaching.