
…the former Nazi said, “Opposition? How would anybody know? How would anybody know what somebody else opposes or doesn’t oppose? That a man says he opposes or doesn’t oppose depends on the circumstances, where and when, and to whom…”

Prof Cass Sunstein

It can often feel hopeless to be an activist seeking social change on an obscure issue, where most people seem opposed to your cause or at best indifferent to it. But according to a new book by Professor Cass Sunstein, such activists shouldn’t despair. Large social changes are often abrupt and unexpected, arising in an environment of seeming public opposition.

The Communist Revolution in Russia spread so swiftly it confounded even Lenin. Seventy years later the Soviet Union collapsed just as quickly and unpredictably.

In the modern era we have gay marriage, #metoo and the Arab Spring, as well as nativism, Euroskepticism and Hindu nationalism.

How can a society that so recently seemed to support the status quo bring about change in years, months, or even weeks?

Sunstein — co-author of Nudge, Obama White House official, and by far the most cited legal scholar of the late 2000s — aims to unravel the mystery and figure out the implications in his new book How Change Happens.

He pulls together three phenomena which social scientists have studied in recent decades: preference falsification, variable thresholds for action, and group polarisation. If Sunstein is to be believed, together these are a cocktail for social shifts that are chaotic and fundamentally unpredictable.

In brief, people constantly misrepresent their true views, even to close friends and family. They themselves aren’t quite sure how socially acceptable their feelings would have to become before they revealed them or joined a campaign for change. And a chance meeting between a few strangers can be the spark that radicalises a handful of people who then find a message that can spread their beliefs to millions.
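
To see how variable thresholds can make outcomes this unstable, here is a minimal toy simulation in Python (my own sketch of a Granovetter-style threshold cascade, not code from Sunstein or the book; all the numbers are illustrative). Each person stays silent until they can see enough visible supporters to meet their private threshold, and a small shift in the threshold distribution flips the outcome from a movement that stalls to one that sweeps up nearly everyone.

```python
import random

def cascade(thresholds):
    """Return how many people end up publicly supporting a cause.

    Each person reveals their support only once the number of visible
    supporters meets their private threshold; until then they stay
    silent even if they privately agree (preference falsification).
    """
    visible = 0
    while True:
        new_visible = sum(1 for t in thresholds if t <= visible)
        if new_visible == visible:
            return visible
        visible = new_visible

n = 1000

# Two populations with similar private views. In A nobody is willing to
# move first; in B everyone's threshold is just 50 people lower.
thresholds_a = [random.randint(1, n) for _ in range(n)]
thresholds_b = [max(0, t - 50) for t in thresholds_a]

print(cascade(thresholds_a))  # always 0: no one moves first, so nothing happens
print(cascade(thresholds_b))  # typically cascades to most of the population
```

Because the outcome hinges on an unobservable tail of low-threshold people, two societies that look identical from the outside can behave completely differently, which is the sense in which these shifts are chaotic and hard to predict.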

According to Sunstein, it’s “much, much easier” to create social change when large numbers of people secretly or latently agree with you. But ‘preference falsification’ is so pervasive that it’s no simple matter to figure out when they do.

In today’s interview, we debate with Sunstein whether this model of social change is accurate, and if so, what lessons it has for those who would like to steer the world in a more humane direction. We cover:

  • How much people misrepresent their views in democratic countries.
  • Whether the finding that groups with an existing view tend to shift towards a more extreme position would survive the replication crisis.
  • When it’s justified to encourage your own group to polarise.
  • Sunstein’s difficult experiences as a pioneer of animal rights law.
  • Whether activists can do better by spending half their resources on public opinion surveys.
  • Whether people should be more or less outspoken about their true views.
  • What might be the next social revolution to take off.
  • How we can learn about social movements that failed and disappeared.
  • How to find out what people really think.

Get this episode by subscribing to our podcast on the world’s most pressing problems and how to solve them: type 80,000 Hours into your podcasting app. Or read the transcript below.

The 80,000 Hours Podcast is produced by Keiran Harris.

Highlights

The fact that President Trump got elected, and that Brexit came out the way it did, is a tribute to the existence of pro-Trump and pro-Brexit sentiment that had been stopped by social norms. Once the norm started to shift, something extremely surprising in both cases could happen. Activists can take some lessons from this. First, if you can draw attention to a social norm that people haven’t realized is in place, that is, that people are with you even if you don’t know it, then saying what you think can be extremely effective.

Many activists are, at some level, aware of that. The other thing, which comes from very recent data (I’m not even sure it made it into the book), is that if people see a norm not as already in place but as emerging, they are inspired to join: partly because they then feel freed, and partly because they might not know what they think but want to be on the right side of history. For activists to say, “increasing numbers of people are…”, is smart.

I’ll give you a little study from Saudi Arabia, which is that Saudi men, by custom, have authority over whether their wives work outside the home. Most young Saudi men actually think it’s fine that their wives work outside of the home, but most young Saudi men think that most other young Saudi men think it’s not fine. They think they’re isolated in their openness to wives working outside of the home.

In the experiment, once Saudi men were informed that most other Saudi men actually think like them, the number of Saudi women applying to join the workforce grew dramatically four months later. That was a research study, not a feminist program. But there’s a clue there about programs of all sorts.

Why do groups end up taking a more extreme position in line with their pre-deliberation tendency? Why, if you have a group of people who think that occupational safety is the number one issue and that workers are dying at extremely high rates, do they end up more extreme in that position after they talk with one another? One reason is just information exchange.

Within a group that’s worried about occupational safety, by definition, the number of arguments that support the concern will be greater than the number of arguments that undermine it; so if people are listening to one another, they will hear more arguments in favor of the position, and that’s going to lead to polarization. Now, that has nothing to do with anything invidious; it doesn’t have to do with bounded rationality. It just has to do with information exchange within a group.

Okay, but here’s the kicker in terms of rationality: people in groups are not inclined to discount for their group’s composition. There’s insufficient thinking along the lines of, “Okay, I’m learning from these people, but these people aren’t the only people in the world, and maybe the information flow within the group is unrepresentative of the information that’s out there.”

So even if information is the driver of polarization, it can lead people in not very good directions. What you want, I think, is a group where people can participate in enclaves of like-minded types; that’s part of freedom of association. But you also want to make sure that the social architecture, let’s say, makes it easy or likely that participants within the enclave will be exposed to other stuff too. And if they are exposed to other stuff, it shouldn’t just be to hold it up as if it’s a piece of dirt and say, “Look how ridiculous it is.” This may sound a little bit abstract, but you can think of a newspaper, or the BBC, or Facebook as being able to create enclaves, or to broaden, let’s say, the enclave’s exposure to the other stuff.
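
To make the “failure to discount for group composition” point concrete, here is a rough toy model (my own illustration, not Sunstein’s; every parameter is arbitrary). Members of a group that already leans one way hear arguments drawn only from within the group, so most arguments point the same way; naive updating pushes the average view further from the middle, while updating that discounts for the skew of the pool leaves it roughly where it started.

```python
import random

def deliberate(beliefs, n_rounds=10, discount_for_composition=False, step=0.02):
    """Toy model of group polarisation through a one-sided argument pool.

    `beliefs` holds each member's view in [0, 1] (1 = "occupational safety
    is the top issue"). Each round, every member hears one argument from a
    randomly chosen fellow member, who argues their own side. Because the
    group already leans one way, most arguments heard point that way, and
    naive additive updating drags the whole group further from 0.5.
    Discounting for composition means only updating on the surprise
    relative to what this skewed pool would be expected to say anyway.
    """
    beliefs = list(beliefs)
    for _ in range(n_rounds):
        group_mean = sum(beliefs) / len(beliefs)
        expected_direction = 2 * group_mean - 1  # average lean of the pool
        updates = []
        for _ in beliefs:
            speaker = random.choice(beliefs)
            direction = 1.0 if random.random() < speaker else -1.0
            if discount_for_composition:
                updates.append(step * (direction - expected_direction))
            else:
                updates.append(step * direction)
        beliefs = [min(1.0, max(0.0, b + u)) for b, u in zip(beliefs, updates)]
    return sum(beliefs) / len(beliefs)

group = [random.uniform(0.55, 0.85) for _ in range(20)]  # already leans pro-safety

print(round(sum(group) / len(group), 2))                           # pre-deliberation mean, roughly 0.7
print(round(deliberate(group), 2))                                 # naive updating: drifts further from 0.5
print(round(deliberate(group, discount_for_composition=True), 2))  # stays close to the starting mean
```

The discounting branch is a crude stand-in for the exposure to “other stuff” described above: once members account for how unrepresentative their enclave’s information flow is, the drift towards the extreme largely disappears.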


About the show

The 80,000 Hours Podcast features unusually in-depth conversations about the world's most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.

The 80,000 Hours Podcast is produced and edited by Keiran Harris. Get in touch with feedback or guest suggestions by emailing [email protected].

What should I listen to first?

We've carefully selected 10 episodes we think it could make sense to listen to first, on a separate podcast feed:

Check out 'Effective Altruism: An Introduction'


If you're new, see the podcast homepage for ideas on where to start, or browse our full episode archive.