Evaluation of our one-on-one coaching service

Introduction

This document evaluates our most recent round of career coaching from June 2013 – January 2014.

Summary

  • Our latest round of coaching caused changes in career plans, and exceeded our targets for the percentage of people who significantly changed plans. 56% of people (10 out of 18) participating in case studies made significant plan changes. 39% of people (7 out of 18) participating in short one-on-one coaching sessions also made significant plan changes.

  • The cost of the time put into coaching from July to February was £9,198. With 17 significant plan changes, the average cost of a change was around £541. The time put into coaching led to additional benefits, like the value of the research involved in the case studies.

  • The coaching taught us more about which questions are relevant to decision-making, and which are commonly faced by people wanting to make a difference through their career. The most common kind of question involved comparing career options. The careers most commonly asked about were finance, entrepreneurship, working in effective altruist organisations, software and consulting.

  • We also learned how to further improve our coaching through the feedback we received, and by analysing who didn’t change plans and why. We have subsequently made several changes to our coaching process.

Did our coaching change careers?

Our coaching

We did two types of coaching over the last 6 months: case studies and short one-on-ones.

Case studies

The process works as follows:

  • The coachee fills out a career plan, containing their cause, mission and next steps.
  • We meet to clarify their plan and greatest uncertainties.
  • We do around ten hours of original research and prepare a short report giving our current best answer to their career questions, and linking to the most relevant additional resources.
  • We also aim to introduce them to several coaching alumni.
  • We meet again to discuss the findings and create an action plan.
  • We finish by asking for feedback and collecting information on how their plans have changed.
  • We may follow up further by email.

You can see a sample case study here (with the attached research report), and our full list of write-ups.

Short one-on-ones

The process works as follows:

  • The coachee fills out a career plan, containing their cause, mission and next steps.
  • We meet for 45 minutes to clarify their plan and greatest uncertainties, and tell them as much as possible that’s relevant to their questions.
  • We link them to the most relevant additional resources.
  • We also aim to introduce them to several coaching alumni.
  • We finish by asking for feedback and collecting information on how their plans have changed.
  • We may follow up further by email.

Our process for testing whether our coaching changed careers

Our process was as follows:

  1. We recorded people’s career plans before their coaching began.
  2. After the coaching, we asked them again for their career plans and what new information they gained.
  3. We then asked them to confirm if any of the changes were due to us, checking whether their changes corresponded to new information that they gained from coaching.

What counts as a significant plan change?

We define a significant plan change as follows:

An individual has made a significant plan change if they say they have changed their credence in pursuing a certain mission, cause or next step by 20% or more, they attribute this change to 80,000 Hours, and there’s a plausible story about how engaging with 80,000 Hours caused this change.

For instance, if someone says they anticipate going to medical school with probability 55% and law school with probability 45%, then they read an article on the 80,000 Hours blog and switch to 75% medical school and 25% law school, that would count as a significant plan change.
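To make the threshold concrete, here is a minimal sketch of that check in Python (our own illustration, not part of the coaching process; we read the "20%" threshold as 20 percentage points, in line with the example above):

```python
def is_significant_change(before, after, threshold=20.0):
    """Return True if any option's credence (in %) shifted by at least
    `threshold` percentage points between the before and after plans."""
    options = set(before) | set(after)
    return any(abs(after.get(o, 0) - before.get(o, 0)) >= threshold
               for o in options)

# The example above: 55%/45% shifts to 75%/25% after reading the blog
before = {"medical school": 55, "law school": 45}
after = {"medical school": 75, "law school": 25}
print(is_significant_change(before, after))  # True: a 20-point shift
```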

We count the plan change as plausibly due to us if:

  • The person tells us that they changed their plans and they think it was due to us.
  • We can identify the new information we gave them that changed their mind.

This metric doesn’t fully capture our impact, but we think it’s a useful indication. If we’re not causing any significant plan changes, we’re probably not having much impact. But if we’re causing lots of significant plan changes, then we’re probably having significant impact.

How did we do?

In November 2013 we set the following target for our case studies:

  • Good: 30%+ make significant plan changes
  • OK: 10-30% make significant plan changes
  • Worrying: Less than 10% make significant plan changes

We ended up with 56% of case studies (10 out of 18) making significant plan changes, which exceeded our target. If we include an extra five case studies which were started but were discontinued, the figure goes down to 43%, which still exceeds our target. However, we spent little time on these discontinued case studies.

We also set the following targets for short one-on-ones:

  • Good: 10%+ make significant plan changes
  • OK: 5-10% make significant plan changes
  • Worrying: Less than 5% make significant plan changes

We ended up with 39% of short one-on-ones (7 out of 18) making significant plan changes, which also exceeded our target. There was one extra short one-on-one where we never heard back from the coachee after the session. If we assume they didn’t make a significant plan change, the figure goes down to 37%, still significantly above our target.

Note that there’s a reasonable chance that these people would have changed their careers without coaching, for example due to other 80,000 Hours content or through talking to other people in the effective altruist community. We would like to investigate this in the future with a randomised controlled trial.

What did these plan changes involve, how valuable were they and were they really due to 80,000 Hours?

Because these questions are so crucial to us, we’re doing a separate analysis of all the plan changes we know about, including from channels other than our coaching, which will be posted on the blog in the next two months. In this further analysis, we’ll summarise what the plan changes typically involved and analyse 20 changes in depth.

Meanwhile, below is a quick list of the types of changes that our coaching caused. The changes can roughly be categorised as people becoming more likely to pursue careers in the following areas:

  • Earning to give – 6 people
  • Working at or founding effective altruist organisations – 4 people
  • Tech entrepreneurship – 2 people
  • UK party politics – 1 person
  • Finish degree – 1 person
  • Doctoral study – 1 person

And to change to the following causes:

  • Donating to meta-charities – 2 people
  • Prioritisation research for improving the far future – 1 person
  • Reducing risk of extinction – 1 person
  • Evaluating aid effectiveness – 1 person
  • Promoting Effective Altruism – 1 person

List of significant plan changes

# | Change | Followed through?
1 | Substantial update in favor of earning to give (from 20% to 50%), from medical research; in favor of supporting meta-charities (from 0% to 20%), from global health. Next steps changed to include applying for more finance roles immediately and applying for research roles to learn more about them. | Yes (for next steps)
2 | Reduced chance of doing a Philosophy PhD/Masters from 70% to 10%; increased working for EA organisations from less than 10% to 30%; increased finance from 20% to 35%. | Yes
3 | Changed cause to prioritisation research for helping the far future, from development of sustainable finance. | Not enough time passed
4 | Decided to finish degree; plans to spend 2 months learning about finance part-time; more in favor of finance and consulting and more in favor of earning to give, from favouring direct work. | Not enough time passed
5 | Added causes of reducing extinction risk and cause prioritisation; increased confidence that doing the BPhil was the best next step. | Yes (for cause)
6 | Changed to UK party politics, from EA org or law. | Yes
7 | Changed from starting a psychology masters to working at 80k. | Yes
8 | Changed from starting a maths post-doc to going into finance. | Yes
9 | Reduced plan of doing a Masters in Management by 20%. Changed top option to earning to give through a career in consulting. | Not enough time passed
10 | Added aid effectiveness evaluation as a cause at 40%; added researching two non-profit startup accelerators to decide whether to apply with their startup; added looking into further study on aid effectiveness to next steps. | Not enough time passed
11 | Causes: added promoting effective altruism as top cause (at 50%), from global poverty. Starting an EA org increased to 50% chance, from 20% (campaigner was top choice before). | Yes (for cause)
12 | Changed plan to earning to give through a corporate career, from aerospace engineering. | Not enough time passed
13 | Added to their plan working in effective altruist organisations, pursuing finance, and building a startup. | Not enough time passed
14 | More likely to pursue entrepreneurship in the medium term and politics in the long term. | Not enough time passed
15 | Increased likelihood of doing a PhD in logic (to work on safety in artificial intelligence), from doing finance. | Not enough time passed
16 | Reduced plan to pursue consulting or software engineering. Increased plan to join a tech incubator or a tech startup. | Not enough time passed
17 | Changed plan to take the highest earning finance job, rather than one with the best lifestyle. Increased plan to donate to meta-charities, from global poverty. | Yes (for taking job)

How many have followed through?

7 out of the 17 plan changers (41%) have already followed through. For the rest, not enough time has passed yet. We will follow up with them to check whether they end up acting on their plan changes.

How valuable was the case study research?

Here is a list of the research that we produced for the case studies, listed in order of blog quality rating where available. Note that some of these documents led to further research which isn’t listed here. We link to the research when it is already on the blog. The rest of the research will be online in the coming months.

The average research quality score of the research which has been rated is about the same as our research in general. (We have a research quality score for our research posts, which you can see at the bottom of each post.)

Cost-effectiveness

Total costs in hours

Staff and interns spent 900 hours on coaching from July 2013 – February 2014. Note that this includes the time that was spent on six case studies which were discontinued. This was usually because there wasn’t a question we could help the person with. It also includes the selection, scheduling and writing up time.

There were a total of 17 plan changes from coaching in this time. So the average time cost per plan change was 53 hours.

If we split this up between case studies and short one-on-ones, the breakdown is as follows:

 | Total hours | Average hours per coachee | Average hours per significant plan change
All coaching | 900 | 25 | 53
Case study | 759 | 42 | 76
Short one-on-one | 141 | 8 | 20
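The averages in the table above follow directly from the totals; here is a minimal sketch of the arithmetic, using the figures reported in this document:

```python
# Hours, coachees and significant plan changes, as reported above
groups = {
    "All coaching":     {"hours": 900, "coachees": 36, "plan_changes": 17},
    "Case study":       {"hours": 759, "coachees": 18, "plan_changes": 10},
    "Short one-on-one": {"hours": 141, "coachees": 18, "plan_changes": 7},
}

for name, g in groups.items():
    per_coachee = g["hours"] / g["coachees"]      # e.g. 900 / 36 = 25
    per_change = g["hours"] / g["plan_changes"]   # e.g. 900 / 17 is about 53
    print(f"{name}: {per_coachee:.0f} hours per coachee, "
          f"{per_change:.0f} hours per significant plan change")
```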

Note that we also did direct time tracking of how long we spent on a case study or short one-on-one. When we add those up, the total is 441 hours. This is significantly less than the total estimate above. The reason the total time investment is higher than the direct time tracked is that there were other things which took up time, outside core coaching:

  • Developing the process for coaching
  • Choosing coaching candidates, communicating with applicants
  • Scheduling meetings
  • Pre-meeting email, follow up email
  • Informal coaching of 7 other people
  • 6 discontinued case studies
  • Coaching team meetings
  • Writing up summaries, getting these checked with coachees

Cost per career change, not including value of research produced

The cost of the time put into coaching from July to February was £9,198, when we cost coaching staff time at £12.50 per hour, and research/coaching intern time at £6.25 per hour (which includes all overhead costs from the operations team).

With 17 plan changes, this works out at £541 per plan change.

If we split this up between case studies and short one-on-ones the breakdown is as follows:

 | Total cost | Financial cost per significant plan change
All coaching | £9,198.18 | £541.07
Case study | £7,907.03 | £790.70
Short one-on-one | £1,291.15 | £184.45

If we instead cost staff at £20 per hour, interns at £7.50 per hour and volunteers at £10 per hour, to take opportunity cost into consideration, the total cost of the time put into coaching from July to February comes to £14,275, making the cost per plan change £840.

The breakdown between case studies and short one-on-ones would then be as follows:

 | Total cost | Cost per plan change
All coaching | £14,274.55 | £839.68
Case study | £12,769.27 | £1,276.93
Short one-on-one | £1,505.27 | £215.04
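A minimal sketch of how the cost-per-change figures above are derived: in principle the time cost is hours multiplied by the relevant hourly rate for each role, but since the staff/intern split of hours isn’t broken out here, the sketch starts from the published cost totals and simply divides by the number of significant plan changes:

```python
# Significant plan changes and published cost totals (GBP) from this report
plan_changes = {"All coaching": 17, "Case study": 10, "Short one-on-one": 7}

costings = {
    "Costing at £12.50/hour staff, £6.25/hour interns":
        {"All coaching": 9198.18, "Case study": 7907.03, "Short one-on-one": 1291.15},
    "Opportunity-cost rates (£20 staff, £7.50 interns, £10 volunteers)":
        {"All coaching": 14274.55, "Case study": 12769.27, "Short one-on-one": 1505.27},
}

for label, totals in costings.items():
    print(label)
    for group, total_cost in totals.items():
        # e.g. 9198.18 / 17 is roughly £541 per significant plan change
        print(f"  {group}: £{total_cost / plan_changes[group]:,.2f} per plan change")
```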

The full impact evaluation in our 6 month review will consider our cost effectiveness.

Did the case studies help us find good research questions?

What we were asked about

We collected a list of all the questions that we were asked about through the coaching. In short:

  • The most common type of question involved making an overall comparison between a number of career options.
  • The options most commonly asked about were finance, entrepreneurship, effective altruist organisations, software and consulting.
  • The other main types of questions concerned: what the best opportunities are within a given cause, how to get into different careers, and how to weigh different criteria for comparing options.

Most questions only came up once, but the following were each asked three times, suggesting that they are widely applicable:

  • What are the career options in finance? (3)
  • What are the salaries in finance? (3)
  • What good opportunities might I be missing in my plan, and which next steps do they imply? (3)
  • How do I get into entrepreneurship? (3)

What we worked on

We spent most of our time researching comparisons between options. We also spent some time on issues relating to one particular option, and for one person we spent time researching options within a cause. This is similar to the overall distribution of the questions we were asked. We didn’t do original research on the questions we were asked in short one-on-ones, but they informed our picture of which research questions to work on in the future.

Here is what we worked on, separated into broad categories:

Comparisons

Single issues

Causes

  • A list of people and organisations working on improving decision making, and our thoughts on the different ways of working on this.

What we learned

We learned which careers our coachees most want to know about. The number in brackets refers to the number of people who asked us about the career:

  1. Finance (9)
  2. Entrepreneurship (7)
  3. Politics, policy, campaigning (4)
  4. Effective altruist organisations (4)
  5. PhD (4)
  6. Software (3)
  7. Consulting (3)
  8. Law (2)
  9. Medical / scientific research (2)
  10. Masters (2)
  11. Value of undergraduate degree (1)
  12. Marketing (1)

Key uncertainties relevant to coachees’ decisions about the most popular options

One key uncertainty when comparing finance to other options is salary. A key uncertainty with entrepreneurship is expected earnings. For politics, policy and campaigning, one key uncertainty is the career capital you gain early in your career if you don’t succeed; another is how good these paths are even if you do succeed, given the low chance of success. The key uncertainties with working in effective altruist organisations are the career capital it provides, how to compare impact now against building career capital, and how to compare direct work with earning to give.

For the causes of improving decision making, prioritisation research and animal welfare, coachees wanted to know what the concrete job opportunities were for working on these causes.

Interestingly, no one explicitly asked which jobs they would find most satisfying or be happiest in, so this wasn’t identified as a factor relevant for further research in this round of coaching.

How did the coaching change our research plans?

Our initial guesses at the most high priority research questions were reasonably accurate. For instance, we thought in August 2013 that further work on finance, entrepreneurship and whether to do a PhD was high priority, in that order. So the coaching didn’t significantly change our research priorities. Still, it felt like an effective way to drive our research process for the following reasons:

  • The coaching suggested new topics to us, such as work on biomedical careers, social finance and the value of a degree. It was useful to put these on our radar. However, in this round we probably spent too much time on these topics. Going forward, we’ll wait for a topic to come up several times before researching it.

  • Within broad areas like finance, we acquired a better idea of which precise questions to work on. For instance, the coaching encouraged us to seek out various types of concrete information about specific careers, because people asked for this and found it useful. Examples include information about: expected earnings, what routes lead into certain careers, what experts commonly recommend in the field and what career capital you gain.

  • It also influenced our research methods. For instance, the coaching made us more in favour of doing interviews, since we realised they were a quick way to be highly useful to our coachees. It also made us more inclined to examine the details of individual cases, rather than searching straight away for general models.

  • It was motivating to work on concrete questions for specific individuals.

  • The coaching increased our confidence in some of our key research content. For instance, it further encouraged us to invest in the careers list, since so many people found it useful to hear a basic overview of the options with simple pros and cons, even if the list is highly dependent on judgement calls. It also made us more confident that our list of factors to prioritise in choosing between careers (influence, cause effectiveness, career capital/options open, discovery value, fit, other personal constraints) covers the most important factors, since we found them repeatedly useful and didn’t notice any other major factors.

How will our research process work in the future?

  • We’ll do successive rounds of coaching, each covering around 10 to 20 people over roughly a month, followed by one to two months spent researching the most pressing issues we uncover.

  • We’ll prioritise the research questions as follows. We’ve created a list of questions based on those we have been asked in coaching and our own thoughts about what is useful. We’ll rate each question on (i) how often it has come up as a crucial issue in coaching sessions, (ii) how often we’ve been asked it by coachees, (iii) our best guess as to how tractable it is, and (iv) our best guess as to how useful it would be to answer. We’ll then work on the questions that rank highest overall on these four factors; a rough sketch of such a ranking follows this list.
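As a rough sketch of what that ranking could look like (the questions, 1–5 ratings and equal weighting below are purely illustrative, not our actual list):

```python
# Illustrative only: rate each question 1-5 on the four factors and rank by the sum
questions = [
    {"question": "What are expected earnings in finance?",
     "crucial_in_coaching": 5, "times_asked": 4, "tractability": 4, "usefulness": 5},
    {"question": "How do I get into entrepreneurship?",
     "crucial_in_coaching": 3, "times_asked": 3, "tractability": 3, "usefulness": 4},
    {"question": "Should I finish my undergraduate degree?",
     "crucial_in_coaching": 2, "times_asked": 1, "tractability": 4, "usefulness": 3},
]

def priority(q):
    # Equal weighting of the four factors is an assumption for illustration
    return (q["crucial_in_coaching"] + q["times_asked"]
            + q["tractability"] + q["usefulness"])

for q in sorted(questions, key=priority, reverse=True):
    print(priority(q), q["question"])
```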

Who were our coachees?

Student vs Working

 | Students | Working
Total | 22 (61%) | 14 (39%)
Case studies | 7 (39%) | 11 (61%)
Short one-on-ones | 15 (83%) | 3 (17%)

Ability

The people we did case studies with had exceptionally high levels of achievement and ability. 15 out of 18 went to top ranked universities (Oxbridge, Ivy League, Stanford or Berkeley) and/or work in highly competitive jobs.

For example, one person we did a case study with is 24, went to an Ivy League university and was then working at a top strategy consulting firm.

The same goes for the people we did short one-on-one coaching sessions with. 10 out of 18 went to top ranked universities and/or work in highly competitive jobs.

Did they help us raise donations?

As seen, 22 of our coachees were students (61%) and 14 were working (39%). One coachee is already a donor.

We didn’t ask most of our coachees for donations either because they were students or because they were between jobs, were interns, or were in interim jobs.

We asked four coachees for donations. Two agreed and two didn’t respond. Of the two who agreed, one agreed to give in a few years and the other agreed to give £1k, though we haven’t received the money yet.

One coachee made a donation of £25 before a session without being asked.

We expect most of the donation benefits from coaching to lie in the future, because people starting or changing careers are not normally focusing on donations.

How could our coaching be improved?

What feedback did we get?

We asked our coachees for feedback. Here is a summary of what we learned.

Answers to ‘What would have made the process more useful?’

The answers we received can broadly be categorised into suggestions for information that would have made the process more useful and suggestions for the way in which the coaching is delivered. There were only a few things that were mentioned by more than one person, which are listed below. The number in brackets refers to the number of people who mentioned the point.

Additional information that coachees wanted:

  • More numbers and statistics in our information in general (3)
  • Narrowing of career options by eligibility and chances of success, based on the background of coachees (3)

Suggestions for changes in the way we deliver our coaching:

  • More introductions to people in the careers that coachees are interested in (4)

With short one on ones:

  • Take greater care to clarify what is already known to the coachee (2)
  • More follow up (2)

Answers to ‘What did you find most useful about the process?’

Again, only a few things were mentioned by more than one person:

Case studies

  • Developing a framework for comparing options (2)
  • Getting outside input on possible blindspots (2)
  • Increased confidence in choice of next step, which increased motivation in taking appropriate action (2)

Short one on ones

  • Getting to think through ideas clearly and writing out plans (6)
  • Introductions to other people in careers, domain experts and people making similar decisions (3)
  • Finding out about new career options (2)

Usefulness and recommendation ratings

Usefulness rating

In the feedback form we also asked “How useful did you find the process?” Possible answers ranged from 1 (completely useless) to 7 (really useful). 31 out of 36 people responded. The average score was 5.5. The distribution looks like this:

There were a few coachees we didn’t help, either because they already knew our content very well or because they weren’t motivated to pursue highly altruistic careers. Still, most people found us useful, and 52% found us very useful, giving a score of 6 or 7.

Our case studies had a larger fraction of people rating them very useful (64%) compared with the short one-on-ones (41%).

Recommendation rating

“How likely are you to recommend an 80,000 Hours case study to a friend?” had an average score of 6.3 out of 7, where a score of 1 meant “definitely won’t” and 7 meant “definitely will”.

The distribution of answers looks like this:

The short one-on-ones had slightly lower scores than the case studies. This is probably because the case studies involved more meetings and original research.

Net Promoter Score

The Net Promoter Score is a customer loyalty metric, which is calculated by subtracting the percentage of customers who are detractors (those who give a score between 1 and 4 on the recommendation question) from the percentage of customers who are promoters (those who give a score of 6 or 7 on the recommendation question). The score ranges from −100 (everybody is a detractor) to +100 (everybody is a promoter). A positive score is good and scores over +50 are seen as excellent.
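As a rough illustration, here is that calculation applied to a list of 1–7 recommendation scores (a minimal sketch; the thresholds follow the definition above, and the example scores are made up):

```python
def net_promoter_score(scores):
    """NPS = % promoters (score 6-7) minus % detractors (score 1-4),
    on a scale from -100 to +100, using the 1-7 scale described above."""
    promoters = sum(1 for s in scores if s >= 6)
    detractors = sum(1 for s in scores if s <= 4)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses: 8 promoters, 1 passive, 1 detractor -> +70
print(net_promoter_score([7, 7, 7, 6, 6, 6, 7, 6, 5, 3]))
```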

Our overall Net Promoter Score was +77. For case studies, the score was +93 and for short one-on-ones +65.

How do these scores compare to our previous metrics report?

 | Dec 2012 – May 2013: short one-on-ones | June 2013 – Jan 2014: short one-on-ones | June 2013 – Jan 2014: case studies
Average usefulness score | 5.4 | 5.4 | 5.5
Average recommendation score | 6 | 6.1 | 6.4
Net Promoter Score | +65 | +65 | +93

Our short one-on-one scores are around the same as in our previous round of coaching. Our case studies also have around the same average usefulness score as our previous round, but they have a substantially higher Net Promoter Score.

We are pleased that our Net Promoter Score has gone up with the case studies, but somewhat disappointed that our average usefulness score hasn’t gone up. We will monitor whether this increases in our next round, as we learn from and respond to feedback.

Who didn’t change plans, and why?

In the case studies, the reasons for not changing plans were as follows:

  • Three people didn’t change plans because further research showed that their existing top option was in fact best. We think it was still the right decision to do case studies with these individuals because we weren’t able to predict this outcome in advance.
  • Two people lacked the motivation to invest significant time in making a decision. This is something we can better screen for in our selection of candidates.
  • Two people already knew our research and ideas very well, so there was little we could add. For such people in the future, we may want to let them lead on research into their decisions, and just give them any extra information we might have that they aren’t already aware of. We could also make introductions to people in our network who may help them.
  • One person didn’t gain decisive information about their options. We should work harder to deliver decision-relevant information, though in some cases this may not be possible.
  • One person’s main question wasn’t tractable. This was difficult to foresee, so we think it was still the right decision to do this case study.

The coachees who did make significant plan changes all did so because we gave them new decision-relevant information. Individuals we were closer to were no more likely to make significant plan changes.

In the short one-on-ones, the reasons for not changing plans were as follows:

  • For three people, there wasn’t enough time to find out which information was most relevant to their decisions.
  • For three people, we couldn’t give them new decision-relevant information because we hadn’t previously researched the relevant questions.
  • Two people confirmed their existing plans with our advice. Again, we think it was the right decision to coach these individuals because we weren’t able to predict in advance that their plans would be confirmed.
  • Two people weren’t facing an immediate decision, so they were less likely to change their confidence in what they will do later.
  • One person faced a highly time sensitive decision after speaking with us, so they didn’t explore the other options we suggested.
  • For one person, we only helped with how to succeed in a given path, as opposed to whether to switch careers.

The coachees who did make significant plan changes also did so because we gave them new decision-relevant information. Two coachees gained this information from people we introduced them to.

How can we improve coaching going forward?

  • Screen candidates better for (i) their motivation, (ii) whether they are facing an immediate career decision, (iii) how familiar they are with our material (those who are very familiar with our research should lead their own studies, with only initial input from us.)
  • Work even harder in coaching sessions to identify what information will be decisive between options.
  • Take care to check what coachees already know.
  • Give more introductions to people in careers that coachees are interested in. As we coach more people and develop our network, this becomes easier.
  • The short one-on-ones were too short for some coachees, so if necessary we will spend longer than 45 minutes searching for decision-relevant information.
  • We will offer a follow-up session to those who would find it useful.
  • Now that we have them, we will distribute our lists of best careers and causes for people to read before our coaching sessions, to help coachees discover new options.
  • Where possible, we will try to include more useful numbers and statistics in our research.

Conclusion and further work

By monitoring people’s career plans, we learned that we can change them through coaching. We’ve learned a lot about how to do coaching better. We’ve also learned more about which research is most useful for changing people’s careers.

As we move into our next round of coaching, we will also follow up with the last cohort to see how many follow through on their plan changes, so we can determine how many career changes the coaching really caused.

Author: Roman Duda

Roman did a fully-funded Masters in Philosophy at Oxford, during which he also consulted for a major energy company on their business strategy, whilst also studying Mandarin Chinese. With a strong interest in Quantified Self, he has worked at the start-ups Memrise and Self Spark.