Anonymous answers: Are there myths you feel obliged to support publicly? And five other questions.
It’s alarming whenever someone says “this is obviously the best thing to do”, when in reality we have very little information in so many spaces.
Anonymous
The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it’s valuable to showcase the range of views on difficult topics where reasonable people might disagree.
The advice is particularly targeted at people whose approach to doing good aligns with the values of the effective altruism (EA) community, but we expect most of it is more broadly useful.
This is the fifteenth and final entry in this series of posts with anonymous answers. You can find the complete collection here.
We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.
Did you just land on our site for the first time? After this you might like to read about 80,000 Hours’ key ideas.
In April 2019 we posted some anonymous career advice from someone who wasn’t able to go on the record with their opinions. It was well received, so we thought we’d try a second round, this time interviewing a larger number of people we think have had impressive careers so far.
It seems like a lot of successful people have interesting thoughts that they’d rather not share with their names attached, on sensitive and mundane topics alike, and for a variety of reasons. For example, they might be reluctant to share personal opinions if some readers would interpret them as “officially” representing their organizations.
As a result we think it’s valuable to provide a platform for people to share their ideas without attribution.
The other main goal is to showcase a diversity of opinions on these topics. This collection includes advice that members of the 80,000 Hours team disagree with (sometimes very strongly). But we think our readers need to keep in mind that reasonable people can disagree on many of these difficult questions.
We chose these interviewees because we admire their work. Many (but not all) share our views on the importance of the long-term future, and some work on problems we think are particularly important.
This advice was given during spoken interviews, usually without preparation, and transcribed by us. We have sometimes altered the tone or specific word choice of the original answers, and then checked that with the original speaker.
As always, we don’t think you should ever put much weight on any single piece of advice. The views of 80,000 Hours, and of our interviewees, will often turn out to be mistaken.
Table of Contents
- 1 What’s the worst advice you commonly see people give?
- 2 Do you have any philosophical views that you’d be hesitant to state publicly?
- 3 Are there any myths that you feel obligated to support publicly?
- 4 What are the biggest flaws of academia?
- 5 Let’s say the EA community doesn’t exist in 20 years (but there hasn’t been a major global disaster or anything). What’s your best guess for what happened?
- 6 How should people in EA think about having children?
- 7 Learn more
What’s the worst advice you commonly see people give?
‘You should obviously do this’
People give advice with near 100% confidence. They say “you should obviously do this”. And once someone respected says that “the best thing to do in this field is X” — impressionable people have a tendency to commit to that path, even if they have no reason to think they have the relevant background/skills to succeed.
It’s alarming whenever someone says “this is obviously the best thing to do”, when in reality we have very little information in so many spaces.
A lot of people are both smart and overconfident — and the smarter they are, the more easily they can lead other people astray. The thing they’re working on might not be obviously bad, but if they’re saying it’s obviously the only right approach, and they sound convincing, that can do a lot of damage.
‘You need to get into a priority career track’
I think the framing of trying to get you to focus so heavily on getting into a priority career track is probably overplayed. I don’t think it should be given zero weight — there are many reasons why it’s a good idea to go into a priority career track. If you have two opportunities that look about as good as each other, one in a priority career track and one not, you should go with the priority one. And for many people that will often be the case.
But every time I hear someone say “I don’t really think I’d be good at this”, or “I don’t really think I’d like it, but I’m going to do it anyway because this is the priority career track that we’re supposed to do”, I think that person is usually making a mistake.
‘What have you achieved this week?’
A lot of people seem to think about career success with very short time horizons — if they haven’t done anything huge or impressive in the last couple of weeks, they feel like a failure. So a lot of people are going around constantly feeling like failures, when in reality their performance is perfectly good and their overall career trajectory is perfectly good. I think it would be a good idea to focus less on whether you’ve had a good week, and more on whether you’re having a good year.
‘Only do things you love’
Overall you want to get to a point where you love what you’re doing — otherwise you’ll just give up — but there may be specific times where for 2 or 3 years it’s better to accept a kinda horrible project and just muscle through it.
For academia, for example, the key thing is to get tenure. And if that requires sucking it up, and doing things that are less rewarding for a few years, that’s probably the wiser thing to do.
‘Say yes to everything, work at an EA org, don’t worry about networks’
To say yes to everything. Don’t do that — you’ll spread yourself too thinly. Be selective.
Advice that doesn’t include building a strong network.
Within EA — to try to work at an EA org.
Follow your passion!
Do you have any philosophical views that you’d be hesitant to state publicly?
We should cut people more slack
I honestly think that we should cut people more slack than we do. I often watch as an internet mob destroys someone — someone who undeniably did or said something bad — and basically think “There but for the grace of God, go I”. Not because I go around doing or saying things that are bad, but because I’m human and I get frustrated or don’t sleep or whatever, and I snap at people or I say something dumb for no obvious reason. Maybe not as bad as what they did, but certainly something that isn’t a reflection of who I am. But we often seem to think that when we do bad things, it’s usually at least partly because of something outside of our control, like stress, but if someone else does a bad thing, it must be because they’re a bad person. If you take this attitude plus norms around social exclusion that are designed for small groups of people and add the internet to it, I think you get a system that is often overly punitive.
You might think I’m basically saying something like “stop being so mean to people who are caught on camera doing bad stuff and mobbed by the internet”, but I think cutting people more slack implies a lot more than that. It implies that we should stop wanting victims to be perfect before we take them seriously, for example. But it’s hard to make the point that, even if someone has done a bad thing, maybe that doesn’t mean their life should be ruined over it, without just causing a bunch of people to turn their anger on you instead.
Feminism ought to focus more on practical concerns
I think that movements like feminism tend to focus on the edge of what is possible: like we’ve won so many victories for women, what might it look like to achieve something like true equality for even a small number of women?
But there are so many countries where women have nothing like those kinds of rights. And there aren’t that many abstract, intellectual questions about what needs to happen there to improve equality — they need better healthcare, and more money, and more freedom about whether they have a partner, when they have a partner, whether they have kids, when they have kids. And I worry that when people get so interested in the theory, these practical concerns get kind of lost.
I want people thinking about the cutting edge of feminism, but I don’t want everyone to think that’s where you should necessarily be putting all the resources. Maybe a compromise would be: I think we should put some intellectual resources there, but put most of our money somewhere else.
Factory farming is a moral abomination
It depends on the audience. I think factory farming is a moral abomination, but I won’t always mention that if I’m talking to people who are new to EA. It can just turn someone off completely — you can see them stop listening to you as soon as you mention animals. I’d always mention it if asked directly, but unfortunately it can be better to wait a while before you bring it up yourself.
You should be free to join any tribe you want
I think the ideal world would have no identity you are born with. No pre-assigned culture or race. You would be free to join any tribe you want to.
Also, I believe science is uniquely qualified to describe reality. Other types of knowledge (religious, cultural, revelation) are not as accurate at creating a model.
Some technically true things shouldn’t be said
I think there are things that are technically true — but that people still shouldn’t say because it’s impossible to say them without giving the wrong impression.
Suppose someone says “curing TB is more important than helping hurricane victims”. That might be technically true just given the number of people affected by each. But instead of hearing the message “we both agree about how bad hurricanes are, and this shows how bad TB is”, people will just think that you don’t think helping hurricane victims is important.
We naturally think that the importance of things is zero sum — if you’re saying one thing is really important, you must think the other thing isn’t important. But I don’t think helping hurricane victims is any less important than most people do — I just have to make these decisions to triage where I can do the most good. So even though I think a lot of comparisons between causes are technically true, I think it’s hard to make them in a way that doesn’t do more harm than good.
I think it’s good to focus a little less on whether you’re technically right, and a little more on what your stated views convey to others.
Are there any myths that you feel obligated to support publicly?
‘We have free will’
I don’t go around saying that free will is an illusion, even though that’s what I believe.
‘We should prioritise climate change interventions at the expense of other long-run future causes’
I feel a lot of pressure to say that climate change is an important global catastrophic risk. And if you think that the most important way to do good is to positively influence the long-run future, it’s quite hard to make the case for prioritising climate change interventions. There are some people that say this publicly, but a lot more that think it privately. I don’t support the claim that longtermist EAs should be paying more attention to climate change, but I dispute it much less vigorously than what I think is right.
Speculation
I’m cautious with speculation. There are several things where I think I know what’s going on, and I haven’t said it publicly because there’s a chance I’d be wrong — and it’d be pretty damaging if I were wrong.
‘My field is doing very valuable work’
I certainly tend to talk up my own field — I don’t think I could come and say “well, this is kind of valuable work, but…” You should check in every now and then and think “is the work I’m doing actually valuable?”, but of course you can’t easily tell the people around you that you’re considering switching career paths.
‘We should follow the conventions of society’
I think ‘myths’ is too strong a word, it implies that something is false. But I think it can be important to support some norms, even when you don’t think they’re fundamentally true.
There are a lot of norms that are just really good given the way the world is, and ones that I’d want to perpetuate.
For example, a norm where people generally follow the conventions of society seems like a good one — even if you disagree with some of those conventions. So, there’s a point at which laws become unjust, very seriously unjust, and then at that point maybe you want to deviate from them. But there’s a tricky zone where you should probably follow rules that you disagree with, just because there is group consensus around them.
What are the biggest flaws of academia?
Not enough focus on the most important questions
Academia is really good at selecting for bright, hard-working people — but doesn’t systematically get them working on the things that are most important. This is partly because there are incentives for them to work on things where they can demonstrate that they’re good and impressive. And that often means making progress on things that other people have been stuck on, which means working on the kinds of problems that people have already worked on a lot. There just isn’t necessarily a market for people to do work which falls between the cracks.
You see this at large scale within academia where work between disciplines is often neglected. But even in specialised fields, you get trendy topics and they attract a lot of work — and more important topics fall through the cracks.
There just isn’t much overlap between the questions tackled by academia and the most important questions. Theoretically, there could be, but — at least until you get tenure — there are strong disincentives to work on the most interesting and useful problems. You’re incentivised towards working on the kinds of things that will get you tenure. You can try to find things that overlap, but basically you need to at least partially sell out for a few years. And that’s hard.
Not valuing effectiveness enough
Being too siloed: You hear lots of people in academia say “we’re open to interdisciplinary collaboration”. And yet when you work in a variety of areas, you get feedback like “so… what do you focus on?” It still seems weird to most people in academia for someone to have diverse research interests, and that’s frustrating.
Not valuing effectiveness: The pressure to raise money generally makes sense — if you can bring in money, you can do more high-impact research. The problem is that the mainstream paths to raising money don’t usually value effectiveness nearly as much as they should.
When I was applying for academic jobs, I didn’t talk about what I really care about. I talked about my mainstream research — and then I introduced what I was hoping to work on after I started the job.
I don’t think the pressure to publish is a flaw. Even though it’s easier to write EA Forum posts, actually going through peer review brings significant credibility to one’s work, both inside and outside of EA.
Let’s say the EA community doesn’t exist in 20 years (but there hasn’t been a major global disaster or anything). What’s your best guess for what happened?
A lack of diversity
I think the lack of diversity in the EA community is a big risk. It could lead to things being said that blow up the movement, making it toxic to associate with — or even career suicide.
A damaging scandal
My first guess is that there was some really damaging scandal — such that almost no one could afford to be openly associated with EA. But I think it would still exist, just in private.
Causes splintering off
I’d imagine there might be a splintering off. Maybe one of the cause areas, like reducing catastrophic risks, becomes “the thing”. I’m sympathetic to longtermism, but it’s important to be able to check in and also consider “is this actually the right thing we should be working on?”
There’s also a risk in losing influence. No one’s going to listen to the EA movement if it’s just become “all x-risk, all the time” — that just sounds too crazy to the general public.
Different main cause areas get absorbed into their larger fields. Lack of cohesion. Community drama. Decrease in funding. All of that might just mean that there’s no longer this “EA community”.
A gradual degrading in the quality of the people
Probably just a gradual degrading in the quality of the people it attracts. The draw of EA for me was that it seemed like a magnet for all these super smart, thoughtful people who also cared about the world. If it becomes harder for the most talented people to meet each other — if they’re instead meeting people who are just okay — then the community will become less attractive to them, and its “quality” would gradually degrade.
The default is entropic, everything kind of degrades. So my intuition is that you need active, conscious injections of energy and effort to keep things at a great level. It would probably exist in some form, but maybe the best people aren’t interested anymore.
Ideas become mainstream
Probably the ideas became mainstream to the point where it didn’t make sense to have a ‘community’.
Lack of sustained motivation
I’d guess a lack of sustained motivation to participate over longer time horizons. The community hasn’t really built many tools for long-term pledges or public accountability.
Not going to happen
Assuming no one comes up with an argument that EA doesn’t make sense, I can’t imagine it going away.
How should people in EA think about having children?
People shouldn’t deny themselves their deepest desires
I think that people shouldn’t deny themselves their deepest desires, in most cases. And so if having a child is a deep desire, I think it’s unhelpful and unhealthy to feel any guilt about having that child.
It’s unlikely that having a child is going to increase your impact, but if you do something that you really care about — like having a kid, or writing a novel — and then feel really guilty about it, you’re encouraging other people to feel the same way.
You can imagine a world where that’s correct — where you want a norm of deep self-sacrifice. But I think that is the wrong norm for EA, and instead we should have norms where maybe being an EA is one of three deep desires — and you just try to do a pretty good job of taking care of all three things.
I think the version of EA where you’re not allowed to pursue any other costly goal is a version that will rightly turn off a lot of the most important people — people who are passionate about a lot of things.
Man, it’s complicated though. Saying this, I notice that I don’t plan to have children so I can focus on having more of a positive impact, and there’s at least a part of me that deeply wants to.
When thinking about this topic, I think it’s important to think about what our goal is as a movement. While a strong utilitarian case can be made against having children, I think the value of seeming as normal as possible outweighs it. Most people want to have children — and it doesn’t seem worth the costs to encourage norms against such a natural desire.
Kids are a way of creating something positive in the world
Children are nice, if EA doesn’t work out at least you’ve created something positive in the world. Having a healthy family life seems potentially really valuable. If you think the future is likely to be good, do you want to be the only one of your lineage who didn’t get things together?
It’s hard to make the case for having more children
If you’re someone who doesn’t have an inclination towards having kids — embrace that. There can be huge benefits. The time saved can be extremely valuable.
That said, if someone wants to have children, they shouldn’t feel obliged to prioritise the pure efficiency of not having any. Setting that need aside could backfire, causing the person to resent their involvement in EA and perhaps making them even less effective overall. On the other hand, it’s hard to make the case that EAs should be making an effort to have more children. Children are time consuming and expensive.
Because of mean reversion, I think it’s likely that the children of EAs will tend not to be as effective as EAs themselves, so I think EAs should have fewer children to maximise their impact now. As for intelligent people in general, the Flynn effect (IQ increasing over time, though maybe that has stalled?) is larger than the effect of the lower birth rates of highly educated people, so I don’t think it’s a huge deal — especially because I think things are going to change radically in the next 100 years anyway.
It doesn’t matter that much
I’m not sure it matters as much as people seem to think. If you want to have kids, it seems fine to have them. It also seems fine to choose to sacrifice that part of your life by delaying having kids, say, in order to strengthen parts of your career. Or to say “if I wasn’t doing important work I would want to have children — but because I think I am doing important work, I’m happy to make that sacrifice”. You get to make these choices about what you’re willing to sacrifice, and what you’re not.
That’s not to say there isn’t some correct answer to the question if you’re trying to maximize your impact. There’s probably an answer to the question about which hobbies EAs should have too. “Should this particular EA take up woodworking?” — there’s probably an answer to that, but I think we all think there needs to be space for people to make these kinds of decisions based on how it will affect them, rather than the impact it will have on the world.
I think you should probably view having kids as you would other parts of your life that would take up a similar amount of time and money. You shouldn’t assume your kids will have a positive impact, they’re probably not going to go out into the world and spread views about doing the most good (if that’s important to you, you could achieve the goals of future generations sharing your views by writing etc). But that doesn’t matter. I think we shouldn’t try to give an argument for having kids based on impact, because it kind of presupposes that this is how we should make all decisions. People need to have well-rounded lives.
Learn more
Other relevant articles
- Your career can help solve the world’s most pressing problems
- All the evidence-based advice we found on how to be successful in any job
- Find a high impact job on our job board
- Career advice I wish I’d been given when I was young
All entries in this series
- What’s good career advice you wouldn’t want to have your name on?
- How have you seen talented people fail in their work?
- What’s the thing people most overrate in their career?
- If you were at the start of your career again, what would you do differently this time?
- If you’re a talented young person how risk averse should you be?
- Among people trying to improve the world, what are the bad habits you see most often?
- What mistakes do people most often make when deciding what work to do?
- What’s one way to be successful you don’t think people talk about enough?
- How honest & candid should high-profile people really be?
- What’s some underrated general life advice?
- Should the effective altruism community grow faster or slower? And should it be broader, or narrower?
- What are the biggest flaws of 80,000 Hours?
- What are the biggest flaws of the effective altruism community?
- How should the effective altruism community think about diversity?
- Are there any myths that you feel obligated to support publicly? And five other questions.