#2 – David Spiegelhalter on risk, statistics, and improving the public understanding of science
By Robert Wiblin · Published June 21st, 2017
My colleague Jess Whittlestone and I spoke with Prof David Spiegelhalter, the Winton Professor of the Public Understanding of Risk at the University of Cambridge.
Prof Spiegelhalter tries to help people prioritise and respond to the many hazards we face, like getting cancer or dying in a car crash. To make the vagaries of life more intuitive he has had to invent concepts like the microlife, or a 30-minute change in life expectancy. He’s regularly in the UK media explaining the numbers that appear in the news, trying to assist both ordinary people and politicians to make sensible decisions based on the best evidence available.
We wanted to learn whether he thought a lifetime of work communicating science had actually had much impact on the world, and what advice he might have for people planning their careers today.
Highlights
…What do we hear in the news? We hear about Ebola, we hear about terrorism, we hear about the latest threat that might be in what we eat and the way we travel, and we get very concerned about this, whether it’s a plane crash or whatever. Because that’s what’s in the news, that’s what is available to us. That’s what’s so prominent, but of course, so many of these risks are actually very small indeed…
…
What I am proud of is being part of a general community that’s very strong in Britain, to do with public engagement in science. I’m just a small part of that, because it covers material on the radio, stuff on television, stuff in some newspapers, and in various agencies, for example the Statistics Authority, which is just trying to take a much more critical attitude to the way that numbers and evidence are used in society. I think it works. In Britain, we’re rather good compared with most people: we don’t have these massive fears of vaccinations and nuclear power, or even GMOs. I think this is a sign that we in this country have developed quite a good public engagement with science community.
Transcript
Jess W: Here at the Centre for Effective Altruism, we’re interested in finding ways to compare what it means to do good, and to figure out which ways of doing good do the most good. We ask questions like, “Which charity should you donate to if you want to help as many people as possible? Which careers should you follow if you want to improve the world? Which cause areas have the largest impact?” These are the sorts of questions that we think it’s really important to get clear on, so you know how you can make a real difference.
Robert W: To help us answer some of these questions, we’re joined by Professor David Spiegelhalter, the Winton Professor of the Public Understanding of Risk at Cambridge University. Professor Spiegelhalter has spent much of his life trying to improve public understanding of statistics, science, and risk in ordinary life. He regularly appears in the UK media and writes on his blog, Understanding Uncertainty. It’s great to have you with us today, David.
David S: Hi.
Robert W: We’ve got so many things we’d like to talk about. We’ll see how far we get, but first, tell us a bit about who you are and your position.
David S: Yeah, I’ve got a strange job, really. I’m in the maths department at Cambridge, and I teach statistics to undergraduates, but I’m actually funded by a hedge fund, Winton Capital Management. I’ve received my chair in order to improve the public understanding of statistics and risk. What I have done since I’ve had that job, for the last eight years, is I suppose to try to join the general community of people who are trying to improve the way numbers in particular are discussed in society.
Robert W: What kinds of questions do you research and what outreach do you do to help the public understand numbers and uncertainty better?
David S: Well, I get asked to do a huge amount of stuff. I suppose helping various agencies communicate risk: about cancer risk, and about the risks and benefits of screening, for example. My background is in medical statistics, so that’s what I get to do quite a lot, but I’ve done TV programs about climate change, and radio stuff about all sorts of threats to society, from Fukushima and so on. Everything to do with trying to get a handle on what the important threats to us individually are, and how we might go about making comparisons between them.
Rather than the great global existential risks, I’m more interested in the things that affect us all individually: how we eat, our exercise, transport, and so on.
Jess W: One of the things that I find really interesting in this area is that there’s a lot of evidence that people tend to be very poor at estimating lots of different probabilities and so end up overestimating the scale of some problems or risks that they face, and underestimating others. From your experience, what kinds of problems do you think that people tend to be most prone to overestimate or underestimate and get wrong in very harmful ways?
David S: Well, there’s been a lot of research on this by psychologists as you know, and of course when we talk about people, I always include myself in this. I’m a subject to the winds of my emotional gut reactions as much as anybody else. I’m not making any claim about some superior knowledge and rationality compared with everybody else, but we just know that the way we respond to things, to simplify, we can think in two different ways. We can respond with our guts or we can try to engage our brain and think slowly about stuff. This is so relevant when it comes to risk.
Particularly this idea of what’s available to us, the availability heuristic, is very strong. What do we hear in the news? We hear about Ebola, we hear about terrorism, we hear about the latest threat that might be in what we eat and the way we travel, and we get very concerned about this, whether it’s a plane crash or whatever. Because that’s what’s in the news, that’s what is available to us. That’s what’s so prominent, but of course, so many of these risks are actually very small indeed.
These are not particularly a threat to us at all. The things that are much more familiar, and that we don’t hear much about, for example heart disease and cancer, all the sort of stuff that we get, a large amount of that is because of the way we live: our lack of exercise, our crummy diet and so on. Of course, people get a bit bored with that, and don’t get so concerned about it.
Robert W: Do you feel you’ve had any success improving public understanding of risks and getting people to focus on the stuff that really matters?
David S: Oh, I don’t know. It’s very [inaudible 00:04:23]. I mean, it’d be great to think so. I’ve been involved in some good projects I’ve been very proud of. For example, the redesign of the cancer screening leaflets in the UK which present the benefits and risks of cancer screening in a very balanced way. They’re hugely innovative in that they don’t actually recommend people go for screening. They just say, “Well, these are the possible benefits, these are the possible harms, make up your own mind.” I believe that’s the right way to go about communicating risk, is not to say, “Oh, you’ve got to watch, you’ve got to watch out, this is terrible.” Say, “Well, if you do this, this might happen, or it might not happen, and weigh it up,” and actually give people credit for some intelligence which I think people basically are. Don’t think people are so stupid.
What I am proud of is being part of a general community that’s very strong in Britain, to do with public engagement in science. I’m just a small part of that, because it covers material on the radio, stuff on television, stuff in some newspapers, and in various agencies, for example the Statistics Authority, which is just trying to take a much more critical attitude to the way that numbers and evidence are used in society. I think it works. In Britain, we’re rather good compared with most people: we don’t have these massive fears of vaccinations and nuclear power, or even GMOs. I think this is a sign that we in this country have developed quite a good public engagement with science community.
Jess W: Yeah, I was actually going to ask you what potential you think there is for young people who want to have an impact in their career to go into a career path similar to your own, getting into public outreach and engagement to have a broader impact, rather than just doing research in academia. How hard do you think it is to do this kind of thing? It sounds like this is definitely something you wish more academics worked on.
David S: Absolutely. I mean, I really came to it quite late as a real part of my career. As I said, that’s because a philanthropist at a hedge fund provided the funding to be able to do this full time, but I was doing some before as part of my job. I increasingly feel that it’s actually a duty of academics, who after all are publicly funded, for a certain proportion of them to really spend some time on public engagement. It doesn’t suit everybody, it’s not for everybody at all, although I think everybody should have a website explaining what they do. They should also be supported, and there should be incentives within the career structure for academics to do this.
I’m quite pleased that in Cambridge, for example, when we’re looking at the promotion of people, their public engagement is taken very seriously indeed. It is something I strongly support. It’s not for everybody, but I think it is very important for academics to do: as they’re doing their work, in whatever area, it has a relevance to society and it can potentially improve society, so they should be working on it. I personally think statistics is a particularly important area for the improvement of society. I feel this very strongly, and many statisticians do. In terms of public engagement, the Royal Statistical Society has now got an initiative of training up statistics ambassadors: young statisticians who want to do this, who really want to get out there and communicate the importance of their work and try to improve the way things are done with numbers and evidence. I think this is so exciting, such fun.
Robert W: A lot of young people we meet are thinking of going into academia. Do you feel like that’s a good place to be, to have a big impact in the world? Of course, there are also some drawbacks that you might be aware of from your experience of being in academia?
David S: Yes, yeah. I wouldn’t necessarily recommend, “Yes, you really should go into academia.” It doesn’t suit everybody and frankly, [inaudible 00:08:17] lifestyle is it [inaudible 00:08:18]. It’s very tough now. In most careers, there seem to be a lot more pressures than there used to be, so it can be quite a tough call, I think. But there is potential to do a lot of good stuff, and the people I work with and know in other areas, whether they’re working on natural threats and natural disasters, whether they’re working in botany, plant sciences, whatever area, are really dedicated to trying to improve the lot of the world. I’m so impressed by the dedication.
But it is a tough job, and in the push to publish work, you’ve got to go through all this business in order to build one’s career, so it doesn’t necessarily appear at first that you’re doing a great job in what you do. Very little of it has a direct impact you can see. However, what’s very good about it is that it builds a lot of transferable skills that you can use. I find that my statistical skills, for example, are in massive demand by all sorts of agencies. I could point in particular to the importance of the Millennium Development Goals, and the Sustainable Development Goals which are replacing them: the monitoring of the state of welfare in different countries around the world has become absolutely vital.
Just being able to measure things and know what’s going on, and to develop good ways to do surveys, to actually look at what works and how you can improve these measures, has become enormously important. I think, well, again, I’m coming back to it: I think statistics is a particularly valuable area to go into.
Jess W: Yeah, absolutely. Just a little more broadly, the key idea behind effective altruism is that we want to try to figure out which altruistic activities, which charities, which career paths, which causes do the most good. We think that being able to measure things, or at least trying to measure things, is incredibly valuable in making progress on those questions. But obviously many of these questions are incredibly difficult to answer, so a common criticism we often get is people saying something like, “Oh, it’s impossible to figure out which charities or careers do the most good, so you’re wasting your time trying, and you should just go with your gut or help personal causes.” What do you think of this objection, and do you think there’s anything to it? The concern is that there’s too much uncertainty.
We often respond by saying that the difficulty doesn’t mean we shouldn’t try, and that careful estimation is still the best we have. But what are your thoughts on this challenge and how to respond to it?
David S: I mean, this is not a comment that’s restricted to the area you’re talking about, it happens all the time. It used to happen in healthcare. People used to say, “Oh, you can’t put a value on a human life. We’ve just got to do all we can to do good and we have to just go with our guts, essentially.” Now that view, certainly in the UK, has been completely discredited. You can put a value on a human life. We do all the time.
Robert W: I’m not sure everyone would agree.
David S: Not sure everyone can agree, and of course …
Jess W: But you have to.
David S: … But NICE, for example, has been doing this for over a decade: putting a value on the marginal benefit of particular healthcare interventions and assessing what should be supported under the National Health Service according to whether it reaches the criteria. Now, we know this is not perfect. We know this does not measure everything, but it’s explicit, it’s transparent, and they’ve done their best, and I think it’s a massive success, and a huge example to other countries of how you can go about it. Now, we know it’s never perfect, but people are then trying to take that approach, which I think has been extremely beneficial in healthcare, and move it into other areas.
Now, how can you do it? So people are trying to do it with the environment. “How can we measure the value of a tree?” Well, people are trying to measure the value of a tree, and I know what a tree’s worth: it’s quite a lot. But working out all these different things, trying to measure the benefit of a, or how, [inaudible 00:12:31] sustainable environment, et cetera. What’s the value of a species? What’s the value of this? I mean, it seems too easy to sit back and say, “Oh, you can’t do it. You can’t measure that.” Well, you can have a good go. As you say, it’s always a balance between trying your best, realizing you’re never going to be able to do it perfectly, but not being put off having a good go.
It’s like a lot of things we do in statistics, trying to measure things, trying to model things: we always know what we do is inadequate, but that doesn’t mean we can’t do useful things.
Robert W: Can you tell us a bit more about NICE and how they prioritize health treatments within the United Kingdom?
David S: Yeah. I mean, that’s something I’ve worked on as a medical statistician. I’ve been a huge supporter of NICE, and essentially what they do is, when they decide whether to have a new vaccine, or whether to recommend a new drug to be paid for under the NHS, for example, they will look at what’s the expected benefit of that intervention and what’s the expected cost. Then, they look at how much it’s going to cost to provide an extra, as they call it, “QALY”: a quality adjusted life year. One year of life is discounted if it’s of poor quality, so it won’t be worth a whole year.
Then, essentially, they can make the comparison. If something comes in at less than 20,000 pounds per QALY, it just gets paid for: “Yep, fine, we’ll pay for it under the NHS.” If it’s more than about 30,000, then they really try to say, “Well, we don’t want to pay for this,” and try to go back to the drug company to renegotiate a price, for example. Between 20,000 and 30,000, well, that’s more in the gray zone. This has been enormously beneficial, partly in order to go back to drug companies and get them to reduce their prices, and also to see that some interventions, for example cancer screening, come in extremely cheap. They’re just worth doing.
It also means you have to be explicit, for example, in how much you value the future relative to the present, because they put in a discount rate, currently three and a half percent. Now, that means that a year of life in 25 years’ time, for example, is only worth about half what a year of life now is worth. That’s taken into account. Now, that of course is controversial, and if we were talking about big policy decisions, for example about climate change, that’s far too big a discount rate. We would hardly care at all about the world in 100 years’ time if we used that discount rate, so in different areas you might want to use a different discount rate. But these are taking economic ideas and moving them into very human decisions, and I think they are enormously helpful and illuminating, but they’re never perfect.
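To make that arithmetic concrete, here is a minimal sketch in Python, purely illustrative: the £20,000 to £30,000 per-QALY thresholds and the 3.5% discount rate are the figures David mentions above, while the function names and the example drug are hypothetical.

```python
# Illustrative sketch of the cost-per-QALY reasoning described above.
# The £20,000 and £30,000 thresholds and the 3.5% discount rate are the
# figures mentioned in the conversation; everything else is made up.

LOWER_THRESHOLD = 20_000   # £ per QALY: below this, usually just funded
UPPER_THRESHOLD = 30_000   # £ per QALY: above this, refuse or renegotiate
DISCOUNT_RATE = 0.035      # annual discount rate applied to future health

def present_value_of_life_year(years_from_now, rate=DISCOUNT_RATE):
    """How much one future year of healthy life counts for today."""
    return 1 / (1 + rate) ** years_from_now

def appraise(extra_cost, extra_qalys):
    """Rough three-way verdict based on incremental cost per QALY gained."""
    cost_per_qaly = extra_cost / extra_qalys
    if cost_per_qaly <= LOWER_THRESHOLD:
        return cost_per_qaly, "fund"
    if cost_per_qaly <= UPPER_THRESHOLD:
        return cost_per_qaly, "gray zone"
    return cost_per_qaly, "refuse, or renegotiate the price"

# A year of life 25 years from now counts for a bit less than half a year now:
print(round(present_value_of_life_year(25), 2))   # 0.42

# Hypothetical drug: £45,000 extra cost for 2 extra QALYs -> £22,500 per QALY.
print(appraise(45_000, 2))                        # (22500.0, 'gray zone')
```

Real NICE appraisals weigh far more than this, of course; the sketch only captures the threshold comparison and the discounting arithmetic David refers to.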
Robert W: Yeah. I’ve actually written an article with Professor Toby Ord about why we shouldn’t discount health, even though we should discount financial returns: about why it’s appropriate in one case and not in the other. We’ll put up a link to that on the website. A lot of people who are planning out their careers feel overwhelmed by uncertainty. They don’t know how much they’re going to earn in one career, or what the chances are of getting into academia, or of being elected to parliament. How do you think people ought to make decisions when they’re surrounded by so much uncertainty?
David S: Well, I mean, you could say we’re always surrounded by it … Again, this is common to everything we do. We never know what’s going to happen in the future, which is great. People don’t want to know what’s going to happen in the future. People don’t even want to know what they’re going to get for Christmas, let alone what’s going to happen for the rest of their lives. So this is just a common problem, and we need to distinguish what we might call risk and uncertainty.
Risk is when it’s a very well defined problem: short-term issues like buying lottery tickets. You know what the chances are, and then we can take a rational approach. When we’re dealing with deeper uncertainty, when we really are not even sure what the options are, of how our lives might develop, then it’s very difficult to take a completely formal approach. I wouldn’t try to say, “Oh, we should be able to do all this mathematically” at all. However, the basic, qualitative ideas of thinking through a rational decision are still very valuable, because you think, “What are the options available? What are the possibilities? What are the possible consequences of what I might do?”
Now, that’s always inadequate. You’re never going to be able to think of everything. However, by thinking it through, some things might become immediately apparent; some immediate rankings of what is preferable and not preferable might become apparent. Otherwise, you might need to fall back on broader strategies for making decisions in the face of what we call “deeper uncertainty.” Now, this is a deeply contested area. When governments make policies, very often they’re in these situations where they can’t even think of all the things that might happen, so they’re in a situation of deeper uncertainty, and people have suggested various ways to go about it.
The first thing is not to think that you can optimize. You cannot be perfect. The sort of suggestions people make, in terms of general decision-making in these contexts, are to do with flexibility and resilience. You don’t want to commit yourself to something you’re not going to be able to change, because you can’t predict everything that’s going to happen. You want to build flexibility and adaptivity into your decision, and that’s the same in any business, any government, any project at all. The other thing is building in resilience, so that you can essentially cope with the unexpected.
That means both the possibility of really good things happening and the possibility of really bad things happening: you are making sure that you haven’t put all your eggs in one basket, essentially. So if something goes wrong, you haven’t gone up a total blind alley that you can’t renegotiate, which means operating from as robust a basis as possible. It means not trying to optimize a single path, but building in robustness, which in terms of a career means, I think, building transferable skills you’re going to be able to use in a variety of circumstances. These are standard tactics for responding to deeper uncertainty, which can be used in any situation.
Jess W: Yeah, that’s definitely something that the organization [inaudible 00:18:46], which advises people on how to have a big impact with their careers, really focuses on: especially early in your career, keep your options open and choose things that are robust under lots of different possible things happening. One other question that we come up against a fair amount in thinking about careers and charities and all kinds of things is: how do you go about choosing between a small chance of some really huge outcome, like saving the world from a pandemic that kills millions, and a high probability of a more moderate outcome, like helping people in your community?
This seems like a very difficult thing to deal with. Do you have any thoughts on how people can go about making these kinds of decisions?
David S: Yeah, it’s very difficult, but again, these are common decisions in our lives. Do we go for the high-risk option or the safer option? I personally think that a mixed strategy is something that businesses would recommend. Although, that’s not what I do; I’m not involved in any investment decisions. But often people take the idea of a portfolio of risk taking, because these are all risks. You can’t guarantee anything that you do. These are all risks, and so you might have a certain amount invested in rather safe things with a safe return, where you’re pretty sure that what you’re doing is going to have a reasonable return, and where you really can predict the consequences of what you’re using your career or your money for.
On the other hand, some goes into more speculative things, where you might be supporting something that actually might not come to anything: either because the solution it’s promising won’t work, or because the threat won’t arise in the first place. I think a mixed strategy in these circumstances is appropriate; again, I wouldn’t want to put all the eggs in one basket. I also personally feel, and this is completely my personal reaction, that I quite like a mixed strategy in which some things are the big [crosstalk 00:20:54] ones, slightly anonymous, where you’re giving money to large organizations doing something that contributes in a probably fairly predictable way to improving things for people, versus the more personal ones where you do have the personal contact.
You might get slightly more emotional feedback from them, partly because they might encourage you to perhaps take on a more … Feedback from the altruism might encourage you to take on a bigger [inaudible 00:21:24], a bigger commitment in the future.
Robert W: There are various ways of thinking about how to quantify the risks or possible gains associated with an activity. Can you explain what a micromort or a microlife is?
David S: Oh yeah, yeah. These are units that … Well, we invented the microlife; someone else invented the micromort. It’s just a way of trying to get a common scale. A micromort is for acute risks, things that might kill you on the spot but otherwise leave you healthy: it’s a one in a million chance of dying because of an action you might take. Skydiving or something is approximately 7-10 micromorts. The microlife is to do with chronic risks, which I think [inaudible 00:22:07] quite interesting: bad lifestyle. Lack of exercise, or smoking, or drinking, or bad diet, whatever, which isn’t going to kill you on the spot but is likely to shorten your life. Not definitely at all, maybe not, but probably will.
It’s a reduction of half an hour in your life expectancy, and I find these quite useful, because they enable you to compare all these different things like diet and smoking and exercise. It shows me, for example, that smoking is obviously where you get the biggest return, or rather, not smoking gives the biggest return. Drinking at low doses, well actually, it doesn’t make much difference. [inaudible 00:22:53] it’s a rather huge concern, unless you’re really swigging it down. But exercise, again, I think is extremely important, and diet’s important. It just gives you a feeling of where the priorities are, which of course is all a bit obvious. They’re the priorities that people in public health work on, but it makes me slightly unconcerned about other things that people might be really worried about.
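As a rough illustration of how these two units work, here is a minimal Python sketch. It assumes only the definitions given above (a micromort as a one-in-a-million chance of death, a microlife as roughly 30 minutes of life expectancy, and skydiving at about 7-10 micromorts per jump); the function names are hypothetical and nothing here is an official risk table.

```python
# Illustrative conversions for the two units described above.
# A micromort = a one-in-a-million chance of sudden death from an activity.
# A microlife = roughly a 30-minute change in life expectancy (chronic risk).
# The skydiving figure (~7-10 micromorts per jump) is from the conversation;
# the example habit below is a made-up placeholder.

MICROMORT_PROBABILITY = 1e-6   # chance of death represented by one micromort
MICROLIFE_MINUTES = 30         # minutes of life expectancy per microlife

def acute_risk(micromorts):
    """Convert micromorts into a probability of dying from the activity."""
    return micromorts * MICROMORT_PROBABILITY

def chronic_effect_hours(microlives):
    """Convert microlives (positive or negative) into hours of life expectancy."""
    return microlives * MICROLIFE_MINUTES / 60

print(acute_risk(10))              # one skydive, upper estimate: 1e-05
print(chronic_effect_hours(-4))    # a habit costing 4 microlives: -2.0 hours
```

The point is simply that the two units put acute and chronic risks on scales small enough to compare everyday activities against one another.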
Robert W: These concepts were invented to help people get a better understanding and make better decisions. Do you think that we could invent new concepts to help people think about how they can do more good in the world?
David S: Yeah. A “micro-altru” or something. It’s difficult. I mean, a QALY is a good one. If your cause area is to do with saving lives, improving health, or lethal risks to people, then a QALY’s a good one. It’s already established; you can look at how many years of life you are going to gain from your intervention, and we know that if you’re giving money to some particular charities, which I know you promote, they’re extremely cost effective in that way. It’s more difficult if you take the broader ideas, for example supporting education, preserving the natural environment, and so on. It’s quite hard to put a number on that. Within a particular cause area you might be able to do it, but having metrics that go right across different areas is difficult.
I know that people criticize the attempts to do it. I still think it’s worth trying, and people are struggling to do this within the environment, and so on. I don’t think I’ve got any great idea, to be honest. I mean, one possibility of course is looking at general measures of wellbeing, which again, people are doing. National governments now try to measure wellbeing, in terms of how people feel about their lives. General wellbeing indices which are not just health related, or length of life related, have, I think, got great potential. You’re not going to be able to get the perfect measure with any of them.
Robert W: People are often asked, “How would you rate how well your life is going from 1 to 10?” Typically they give answers between six and eight, at least in the UK.
David S: [crosstalk 00:25:27].
Robert W: Maybe we could have a micro happy, which would be something as good as moving someone from six to seven, or seven to eight on this welfare scale.
David S: Yes, exactly. I think those scales, well actually, I find them very revealing when people do mention them, particularly across cultures, as you said. Everyone says seven, [inaudible 00:25:49]. Maybe because they like the number seven, but it also reflects this idea: “Well, it’s not bad. Mustn’t grumble. Could be better, but mustn’t grumble.” A slightly British, stoic attitude. I think it’s almost predictable what people are going to respond, but I find, for example, the age distribution of these responses, and how that age distribution varies across cultures, extremely interesting.
The idea of whether one could move groups of people on that scale would be very important. When I think of the charities I give to, I don’t think any of them directly make people live longer, but I would hope they improve people’s wellbeing, in the sense of happiness, considerably.
Robert W: Yeah, it is a fascinating area of research, and there are significant international differences. I think in the UK, America, and Australia, people do tend to give seven, or maybe eight if they’re feeling particularly good, but in South America, people are, for some reason, more exuberant and often tend to give eights, even when they’re relatively poor. By contrast, in Eastern Europe and Eastern Asia, like Japan and China, people tend to give relatively low scores relative to how their life seems to be going from the outside.
David S: Yeah. No, no, people in different cultures, I think in Africa they’re even lower. People are very …
Robert W: Yeah. There are some countries where the average is as low as 4 out of 10, which means maybe half of people are giving less than four, which is a bit unfortunate.
David S: Again, I mean, they’re very culturally specific, I think. But what I would look at is relative changes, perhaps compared with the average in that community that one would [inaudible 00:27:41] for. You can’t just get everyone up to eight or whatever. I mean, there are very different cultural attitudes, I think, that are reflected in those questions.
Robert W: Yeah. By trying to estimate risks or the size of problems in this quantitative way, do you think we risk being biased against approaches that are harder to quantify?
David S: Yeah, it is always a problem that people start believing the model, the measure, which is only a very inadequate thing. It starts becoming the thing that you’re focused on, to the detriment of all others. There’s a real problem with over-metricizing any activity, pretending that you can measure the benefits of everything. The academic world, of course, is a real example of that, where people start getting obsessed with impact factors and nonsense like that. There’s a real danger in believing these things too much.
At the same time, I think we should have a damn good go at trying to do it. Just because something is difficult and cannot be done perfectly doesn’t mean that you can’t at least have a stab at it, because you may be … Often these things can become really clear. Great precision might not be required in order to determine that actually this is not an effective use of resources compared with [inaudible 00:29:07].
Jess W: Yeah, and I think that rather than giving us a perfect answer to all questions, using models and looking at different factors can, as you say, just help you to identify clear differences, or areas where you need to get more information because you just have no idea how to [fight 00:29:23] all that in or something, so it can definitely be very useful. Finally, to what extent do you think the average person can benefit from a better understanding of statistics, from learning more statistics? Either in their everyday lives, or in being more effective as altruists. And, relating back to the last question, to what extent could it potentially limit us?
David S: Yeah, of course I’d say this, wouldn’t I? But yes, I do believe that people would benefit from having a greater idea of stats and measurement, and number, and quantity, and the frailties of those as well. Because the point about understanding statistics is understanding both their strengths and their weaknesses. The two go absolutely hand in hand. What people tend to do at the moment, when they’re not confident with numbers and how they are constructed, because numbers are, on the whole, always constructed, someone has chosen what to measure, is either to accept them as if they’re God-given truth, “That’s the number and that’s so vital, and that is it, that’s what we’re looking for,” or to reject them out of hand: “Oh, you can’t put numbers on that. Lies, damned lies, and statistics. This is nonsense.”
Those two extreme views are both equally idiotic. But if you’ve got a slightly more nuanced view, you’ll realize that statistics, numbers, and measurement are an incredibly valuable tool, but they’re just a tool, and they have their frailties and inadequacies. We need to be able to teach and understand both the strengths and the limitations. I’m involved in a number of educational projects in which we’re trying to do just that.
Robert W: It’s really a matter of trying to assign the appropriate weight to each different kind of evidence or each different piece of evidence that comes your way?
David S: I mean, in the end, what statistics is to do with is quantitative evidence. That’s what we’re talking about, and evidence can be good, bad, or anywhere along a whole scale. It’s not true or false. It’s just evidence, and we’re trying to weigh up our evidence, which we do all the time in our lives. Our guts tend to be a bit fallible when it comes to weighing up evidence; we tend to be too influenced by certain salient things that attract our attention. So standing back and trying to be a bit cooler about weighing up evidence is, I think, a really valuable thing to try to do in all areas of life, it doesn’t matter what.
Jess W: Yeah, absolutely, which is not to say that you should completely discount that gut feeling, but often just supplement it with some extra sources of evidence and statistics.
David S: Yeah. I don’t trust them a bit, but I still go with them.
Jess W: Yeah. I think that’s about all we’ve got time for now, but thanks so much, David. This has been very informative and really interesting.
David S: Okay, thanks very much.
Robert W: Right. If you’d like to hear more from Professor Spiegelhalter, then you can find his blog Understanding Uncertainty online, and he’s often on the radio in the UK, in particular on “More or Less” on BBC Radio 4.
About the show
The 80,000 Hours Podcast features unusually in-depth conversations about the world's most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.
The 80,000 Hours Podcast is produced and edited by Keiran Harris. Get in touch with feedback or guest suggestions by emailing [email protected].