Human beings have unlocked the laws of nature and dramatically increased life expectancy … yet still swallow fake news, cling to conspiracy quackery and regularly descend into violence.
Are we simply wired to be little more than cavemen?
Absolutely not, argues renowned psychologist Dr. Steven Pinker. We have all the necessary tools for logic and critical thinking. We don’t always use them.
Following the release of his new book, Rationality: What It Is, Why It Seems Scarce, Why It Matters, Pinker discussed, with Sinai and Synapses founder Rabbi Geoffrey A. Mitelman, how the seemingly rational pursuit of individual and group self-interest can lead us to lose our minds — and how we can develop collective rationality to curb such baser instincts.
The Johnstone Family Professor in the Department of Psychology at Harvard, Steven Pinker is the author of 12 books and was one of Foreign Policy’s World’s Top 100 Public Intellectuals and Time’s 100 Most Influential People in the World Today.
This conversation was hosted by the Streicker Center at Temple Emanu-El in New York City.
Gady Levy: Good morning and welcome to the Streicker Center. I’m Gady Levy, executive director of the Temple Emanu-El Streicker Center, and I’m super excited to welcome you to the start of our fall semester. During the past 18 months, over 500,000 of you have joined us virtually, making our Streicker community stronger. This semester, I invite you to continue doing so, virtually or in person, as major writers and thinkers take on questions about race, education, American culture, anti-Semitism, politics, and Jewish life. As we welcome Hillary Clinton and Henry Kissinger, Ibram X. Kendi, George Will, Chris Wallace, Anderson Cooper, and Katie Couric. As we remember Leonard Cohen, eat some latkes together, and go back to the Heights with Lin-Manuel Miranda. I hope you will join us regularly.
Now, to a discussion that could not be more timely. In a recent CNN poll, people were asked how they felt about the state of affairs in our nation. 75% responded that they are pissed off. And we are seeing that anger explode all over the country. Airplane passengers lash out for almost no reason. Fist fights break out over masks, and at Carmine’s on the Upper West Side. And innocent people are pushed off the subway platform in Times Square. We are flooded with conspiracy theories about vaccines carrying microchips, and boxes of ballots mysteriously disappearing. Are these signs that we are losing our collective minds? Have we lost our ability to think rationally?
There is probably no one better to answer those questions than Dr. Steven Pinker, Professor of Psychology at Harvard, one of Foreign Policy magazine’s top 100 public intellectuals and one of Time’s 100 most influential people in the world today, and the author of 12 books, the latest being Rationality – I hope you can see it. He joins us tonight for a conversation with Rabbi Geoff Mitelman, to talk about the tools we have to be rational, why we often fail to use them, and what we can do about that state of affairs. If you have questions for our guests, please submit them at any time using the chat function. If you have yet to purchase a copy of the book and are interested, you can do so using the link posted now and throughout the program in the chat window. Thank you for joining us tonight, and please join me in welcoming Dr. Steven Pinker and Rabbi Geoff Mitelman.
Geoff Mitelman: Thank you, Gady. Wonderful to be with you all. Steve, nice to be with you here. I have loved your books and your writing for many many years and just finished reading Rationality — a fascinating book and an exploration of how and why we act rationally and why we act irrationally.
And so, since we’re here through a synagogue, through the Temple Emanu-El Streicker Center, I want to frame this question, lead into a question, through a well-known story in Rabbinic literature called the Oven of Akhnai. Two rabbis are arguing over this minute piece of Jewish law, and they’re going back and forth. And one, Rabbi Eliezer, keeps saying, “If I’m right, let this miracle prove it,” so the stream flows backwards, and the walls of the house of study fall in. And each time, the other rabbi says, “We don’t find any evidence for miracles. We don’t look to miracles for evidence here.” And finally, Rabbi Eliezer says, “Look, if I’m right, let God Himself come down.” And the text says God comes down and says, “Why are you arguing? Rabbi Eliezer’s always right.” And Rabbi Joshua essentially says, “God, shut up. You’re not part of this conversation anymore. Ever since the Torah was given to us, it’s on us as human beings to interpret these questions and how we’re supposed to act.”
And so, the reason the story is so wonderful is that it makes this claim that only reasoned arguments using evidence and interpretation are accepted. And as Gady mentioned, and as I know you’ve done a lot of work and research on, that hasn’t always been the case throughout human history. It’s certainly not the case today. Often, it’s the loudest voices that are heard. So, I want to start by asking: why is rationality seemingly so challenging in today’s society right now?
Steven Pinker: I always resist the leap from “things are bad today” to “things used to be better yesterday.” Franklin Pierce Adams said, “the best explanation for the good old days is a bad memory.” So, yes, there’s a lot of nonsense going on today, but it kind of comes with being human. I don’t want to say that humans are an irrational species. I deliberately chose, as part of the subtitle of my book, “why rationality seems scarce,” not necessarily why it is scarce. But the kinds of flaws and, indeed, dangers that we know today, like conspiracy theories and fake news and belief in paranormal phenomena, have been with us for as long as we’ve been human. The Protocols of the Elders of Zion and the Illuminati show that conspiracy theories go way back; they’ve triggered pogroms throughout history. So it is a constant danger that we humans deal with. I think a better question is “What accounts for the rationality that we see?” But they are two sides of the same coin, and I’m not willing to write off our species, and I’m not willing to write off our era.
Geoff Mitelman: Well, you know, that raises a great question, because we sometimes think, and you talk about this, that rationality gets a bad rap. We think of this unemotional, Mr. Spock view. And what I think we often forget is that our brains didn’t really evolve to help us find truth; they evolved to help us survive. Our brains evolved on the African savanna several hundred thousand years ago. And so rationality is relative to a goal, which is something I know you’ve said. And so, what is the job that rationality gets done, and what are its limitations?
Steven Pinker: Yeah. So, you brought up several points. One of them is that we should move beyond the dichotomy, or the tension, between reason and emotion, between rationality and, you know, love and beauty and all of the valuable things in life. Because they’re not alternatives. You do not have to be a joyless drone to be rational. Rationality is the use of knowledge to pursue a goal. And there’s nothing that says that goal can’t be love, and good works, and beauty, and richness. It’s just that if that’s what you want, you use rationality to get it. Also, it is true that the irrationality we see is easy to blame on our hunter-gatherer ancestors. And as an evolutionary psychologist, I hear all too often talk [like], “Well, what do you expect of a species that had to chuck spears at antelopes on the savanna? That’s what our intelligence was biologically adapted to.”
But I start off the book, actually, with a discussion of how hunter-gatherers deploy their rationality, and they’re pretty rational. And I spent some time with an expert on the San, formerly called the Bushmen of the Kalahari Desert, who survive in a pretty unforgiving desert and have for hundreds of thousands of years. That’s probably one of the world’s oldest cultures. And they are highly cerebral. The way that they make a living is that they engage in pursuit hunting. That means you chase an animal until it keels over from exhaustion or heatstroke and you bash it on the head, or you dispatch it with a spear or a bow and arrow. Now, you know, animals are faster than we are. You can’t outrun an antelope. On the other hand, we have an advantage: we’re naked. We have bare skin. We’re not covered in fur. We sweat. And so in the desert, we can run marathons without getting heatstroke. Antelopes and kudus and springboks and so on can’t. However, because they can run faster than us, they’re soon kind of over the horizon. The San’s weapon is their intelligence. They look at fragmentary tracks, and they can infer, both from the shape of the tracks and from background knowledge, what species it’s likely to be, how old it is, what sex, how tired, and that allows them to figure out where to go to find it next, which is absolutely crucial to success in the hunt.
I give a lot of examples of how they distinguish correlation from causation. They use what we call Bayesian reasoning, that is, they implicitly appeal to the rule of the Reverend Thomas Bayes, the optimal way to update a hypothesis based on evidence. They engage in critical thinking. They argue, and they don’t accept arguments from authority. So, if the hotshot tribal elder says one thing, and a young upstart thinks that he’s full of it, he’ll point it out.
So, when it comes to human irrationality, I say, don’t blame hunter-gatherers. We have to blame ourselves. What’s crucial, though, of course, is that the San are applying their intelligence to the world, the part of the world they depend on for survival. And when we do that, we’re perfectly rational. The people who believe in crackpot conspiracy theories, you know, a lot of them, some of them, they hold a job, they get gas in the car, they get the kids clothed and fed and off to school on time. They keep food in the fridge. So, it’s not as if they’re stark raving mad. We deploy our rationality in different zones in different regards. Now, in the reality zone, just the stuff around you, the people that you actually interact with, I think we all are pretty rational, because we have to be: reality is what doesn’t go away when you stop believing in it. It determines our survival, back then and now. But the kind of remote, faraway worlds: what is the origin of the universe? What happens in the White House behind closed doors? What’s the ultimate cause of fortune and misfortune? Why do bad things happen to good people? You know, all of those, for most of our history, you kind of couldn’t find out.
And so, whether your beliefs about those things are true or false is kind of irrelevant, almost academic. You hold them because they’re a good story, they’re entertaining, they’re emboldening, they increase solidarity in your own tribe, your political party, your religion.
Now, here’s what I think is the key: since the Enlightenment, many educated people have had the conviction that all of our beliefs should be based on the best evidence and arguments. I think it’s a commendable attitude. Namely, you shouldn’t settle for any old creation myth; you should really look at what cosmology has to say. And likewise, for what happens among the rich and famous and powerful, we should look to the historians and the journalists, and demand access to records, etcetera, etcetera.
But that’s kind of a peculiar and very recent conviction, and a lot of people haven’t really signed on to it. They’re rational in their day-to-day lives, their bread and butter, but when it comes to these big questions, it’s, you know, is it a good story? And it’s not clear how deep in the bones people actually believe the truth of a number of these crackpot ideas, such as the more outlandish conspiracy theories. You kind of think, “Well, my enemies, they’re capable of these nefarious deeds, say, running a ring of cannibalistic Satan-worshiping pedophiles. Whether they do or not, those Democrats are capable of it, and that’s good enough for me.” That’s the kind of psychology, I think, that’s behind a lot of those beliefs. Sorry, long answer to, well, a multi-part question.
Geoff Mitelman: It’s a multi-part question, and there are a few different ways that I want to move. But one thing that is striking me is that when we talk or think about rationality, I think very often it’s framed almost like a computer, a 0 or 1 binary: “Is this on? Is this off?” And part of what you were saying about hunter-gatherer societies, the thing that we often forget, is that we are a social species, and we’ve got to balance our needs against other people’s needs. And when you were talking about the hunting, [I’m reminded of] one line that someone said that I love, which is “The best place to store food 200,000 years ago was in other people.” And you don’t know: “Is my hunt going to work? Am I going to be able to find it, or are they going to be able to find it? If I give you food now, are you going to give it back to me later? Am I going to shirk my responsibility?” The prisoner’s dilemma: “Am I going to cheat or am I going to cooperate?” And so much of, I think, politics and religion and law is designed to answer, “How do I achieve my particular goals while also being part of a larger society where each individual person has their own particular goals, goals that may be the same as mine or in conflict with mine, in cooperation and in conflict at the same time?”
Steven Pinker: So once again, I think you packed at least four questions into that question. And as it happens, I talk about all of those in Rationality. So, let me try to answer them succinctly without monopolizing the pixels too long. So, yes, logic. First of all, does rationality deal only in propositions that are either true or false? Well, that’s what classical logic has dealt with since Aristotle, and one might wonder: is being perfectly rational the same as being perfectly logical, in the sense of thinking in syllogisms?
And the answer is: it isn’t. Logic is one of the tools of rationality. And the middle part of the book is six chapters, each one of which tries to explain a different tool of rationality. Logic is one of them, and we really need it. But the reason that logic isn’t the same as rationality is that the whole point of logic is to zero in on the propositions listed on a page and to deduce the necessary consequences of them. Now, that’s a powerful thing to do. That’s how our computers work. But the reason it’s not the same as rationality is that logic requires forgetting everything you know and basing your inference only on what’s stated on the page.
In the real world, rationality often involves lots and lots of probabilistic, uncertain knowledge, which we have to aggregate. We sometimes call it common sense, or common sense refined. But I mean, just as an example, the syllogism “All things derived from plants are healthful. Tobacco is derived from plants. Therefore tobacco is healthful.” Now, that is a valid logical deduction. It would not be rational to endorse the conclusion, because we know, from our knowledge of all the different kinds of plant products, that the first premise, “all plant products are healthful,” is largely, but not exactly, true. It has some important exceptions. So all of these exceptions to the rule — and little considerations pushing you this way or that way — are what is hard to capture in logic.
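[To make that contrast concrete, here is a minimal sketch in Python. It is an editorial illustration, not from the book: pure deduction from the stated premises licenses the conclusion about tobacco, and it takes background knowledge that the premises leave out, the exceptions, to block it.]

```python
# A minimal sketch (editorial illustration, not from the book) of why
# deduction alone isn't rationality: it looks only at the stated premises,
# so a valid argument can yield a conclusion it would be irrational to accept.

premises = {
    "all": ("derived from plants", "healthful"),  # "All things derived from plants are healthful."
    "fact": ("tobacco", "derived from plants"),   # "Tobacco is derived from plants."
}

def deduce(premises):
    """From 'all P are Q' and 'x is P', conclude 'x is Q' -- using the premises alone."""
    p, q = premises["all"]
    x, category = premises["fact"]
    if category == p:
        return f"{x} is {q}"

print(deduce(premises))  # -> "tobacco is healthful": logically valid, not rational to endorse

# Rationality brings in the background knowledge the first premise omits:
exceptions = {"tobacco", "hemlock", "deadly nightshade"}  # hypothetical exception list
x = premises["fact"][0]
print("endorse the conclusion?", x not in exceptions)     # -> False
```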
That’s one of the reasons that I have a chapter on Bayesian reasoning. I mentioned it earlier because the San do it. Namely, you adjust your credence in a hypothesis, your degree of confidence, up or down, depending on the evidence you just perceived. And that’s another tool of rationality.
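[To make the updating rule concrete, here is a minimal sketch in Python; it is an editorial illustration, not part of the conversation, and the tracking scenario and all the numbers are hypothetical, chosen only to show how a piece of evidence raises or lowers a credence via Bayes’ rule.]

```python
# A minimal sketch of the Bayesian update described above: credence in a
# hypothesis is revised in light of new evidence. All numbers are hypothetical.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | E) given the prior P(H) and the two likelihoods."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Hypothesis: "the animal that left these tracks is a kudu."
prior = 0.30                 # credence before inspecting the tracks
p_tracks_if_kudu = 0.80      # chance of tracks this shape if it is a kudu
p_tracks_if_not_kudu = 0.10  # chance of such tracks otherwise

posterior = bayes_update(prior, p_tracks_if_kudu, p_tracks_if_not_kudu)
print(f"credence after seeing the tracks: {posterior:.2f}")  # ~0.77, up from 0.30
```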
The second thing, Rabbi, that you brought up was that we are a social species, and that too is critical to what makes us rational, when we are rational. Now, none of us is infallible. None of us is omniscient. None of us has been vouchsafed the truth by Hashem, by the Almighty. We are fallible humans, and we are indeed saddled with all of the biases and fallacies and errors that cognitive psychologists, my colleagues, have documented for more than 50 years, most famously Amos Tversky and Daniel Kahneman. Kahneman won a Nobel Prize for this research, which is probably familiar to many of you from his bestseller Thinking, Fast and Slow. Tversky tragically died before the Nobel Prize could be awarded.
But, so, we are, every one of us, subject to all of these fallacies. Nonetheless, you know, we’ve accomplished great things. We discovered the Big Bang and DNA and neural networks, and invented vaccines and smartphones, and walked on the moon. So, we can’t be that stupid either. And the reason that we fallible creatures do attain rationality, when we do, is that we cooperate in communities of reason. And crucially, those communities allow the voicing of ideas, many of which are going to be wrong, because we are fallible, and the ability to criticize them, to keep the ones that survive attempts to refute them, to aggregate them, to compose bigger ideas out of smaller ideas. That’s one of the reasons, probably the main reason, that we evolved language: so that we could pool our ideas, so we could criticize other people’s ideas. And it’s only in arenas, or games that we choose to play, where we allow feedback and editing and fact-checking and peer review and free speech, that we have any hope of becoming more rational. One person’s rationality can make up for another’s irrationality.
You brought up the Prisoner’s Dilemma. This is a game-theoretic dilemma. I have a chapter in the book on game theory, which is how we make sense of situations in which the best rational option for one person depends on what the other guy chooses to do, and vice versa. It can make your head spin, but there are interesting results, including the paradox that sometimes everyone doing what is rational for himself or herself can end up with a situation that’s irrational for everyone put together. And indeed, that happens with rationality. Call it the tragedy of the rationality commons, a play on the classic tragedy of the commons, the situation where every shepherd has an incentive to graze his sheep on the town commons, because it’s better to have the sheep munch there than not. But when everyone does it, they can strip the commons bare, and everyone ends up worse off. So it’s good for one. It’s not necessarily good for everyone.
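[As an aside for readers, here is a minimal sketch in Python of the paradox just described; it is an editorial illustration using the standard hypothetical Prisoner’s Dilemma payoffs, not code from the book. Each player’s best reply is to defect no matter what the other does, yet mutual defection leaves both worse off than mutual cooperation.]

```python
# A minimal sketch of how individually rational choices can be collectively
# irrational. Payoffs are the standard hypothetical Prisoner's Dilemma values.

PAYOFFS = {  # (my move, other's move) -> (my payoff, other's payoff)
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def best_reply(other_move):
    """Pick the move that maximizes my own payoff, given the other's move."""
    return max(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, other_move)][0])

# Defecting is each player's best reply no matter what the other player does...
assert best_reply("cooperate") == "defect"
assert best_reply("defect") == "defect"

# ...yet mutual defection (1, 1) is worse for both than mutual cooperation (3, 3).
print(PAYOFFS[("defect", "defect")], "versus", PAYOFFS[("cooperate", "cooperate")])
```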
In the case of the rationality commons, it can often make perfect sense if your goal is to be a hero of your tribe, your team, your coalition, your political party. If you prosecute the fiercest possible battle for your team against the other team, if you own the libs, if you show why the Republicans are stupid or evil, if you muster all the ammunition you can to do it, then, you know, you’re a hero within your tribe. And conversely, if you were to express some doubt about the sacred beliefs in your clique, in your social circle, you could be ostracized and, you know, socially excommunicated. So, it’s rational for everyone to advance the sacred myths, the dogmas, of their tribe. It’s not rational for all of us put together when you’ve got two warring cliques, each of which is doing everything in its power to win the debate, as opposed to determine the truth.
Finally, you talked about conflicts among goals. Now, getting back to our definition of rationality, it’s the use of knowledge to pursue a goal. Now, different people can have different goals, and the goals may be in conflict. And a lot of what we call ethics, or morality, is how we can reason about what to do when people’s goals come into conflict, as they necessarily often do. Okay, that’s at least four answers to your four questions packed together.
Geoff Mitelman: There’s a lot in there. There may be a lot in this next one too. I think one of the things that’s important is recognizing that morality also has a rational basis. In one of your other books, you talk about how the prisoner’s dilemma, and tit-for-tat, and this balance of justice and mercy, and the individual and the collective, can help us understand almost every part of these moral emotions of justice and forgiveness and shame and guilt, which become huge, at least in Judaism, and obviously in other religions as well, but we’re here in a synagogue, so we’ll talk about it from a Jewish perspective. And, you know, there’s the line about someone who wants to convert to Judaism and goes to Rabbi Hillel and says, “Teach me the whole Torah while standing on one foot.” And Hillel says, “What’s hateful to you, do not do to another; that’s the whole Torah, now go and learn it.” And what most people say is, “Oh, that’s great, you know, what’s hateful to you, don’t do to another. That’s the whole Torah.” And what we forget is the “Now the rest is commentary, go and learn it.” You’ve got to know the edge cases, how I am going to manifest these kinds of questions in these specific situations, where, you know, I think it was Paul Root Wolpe who said, “Ethics is not a question of right versus wrong. It’s right versus right.” And it’s often these questions of morality and competing values where we can then start to think, “This is what I need to do in this case where it’s on the border. I’m not quite sure what to do here.”
Steven Pinker: Yes, so, let’s see. Well, the story about Rabbi Hillel I actually have in Rationality, for the excellent reason that it does capture the core of morality, which is some version of the Golden Rule, the categorical imperative, Spinoza’s viewpoint of eternity. I like Rabbi Hillel’s formulation, because it skirts George Bernard Shaw’s objection to the Golden Rule: “Do not do unto others as you would have them do unto you; they may have different tastes.”
Now, if it’s “What’s hateful to you, don’t do unto others,” you’re probably on firmer ground, because there are more things that people have in common that they don’t want done unto them than things that they do want. But putting that aside (call this the Silver Rule), it is true. And this is part of the answer to the challenge of whether rationality can actually get you morality. Is morality in some separate sphere where reason has nothing to say, where you just have to look at what’s carved onto the tablets that were handed down by an old merciful God, or do we humans get to figure it out?
Really, the cliche is “You can’t get an ‘ought’ from an ‘is.'” It’s often attributed to David Hume and the series of philosophers he inspired. And it is technically true, but on pretty narrow grounds. It’s true that, as Hume put it, there is no logical reason to prefer one outcome over the other if the choice is between a scratch on your finger and saving a million people from genocide. Logically, there’s no argument for one or the other. Now, it’s not that Hume was a psychopath. He himself knew you really should accept the scratch on your finger. He just pointed out that it’s not a matter of logic. Well, okay, and it’s also not a matter of logic that you should prefer to be happy rather than sad, or healthy rather than sick, or comfortable rather than poor. And Hume wrote about that as well.
But if you combine the not-logical but inescapable preference of self-interest, that good things happen to you rather than bad things, number one, with the fact that we are a social species, as we are, that none of us is a galactic overlord who can just impose our will on the universe, that we depend on the actions of other people, well, that changes everything, and a lot follows from it. As soon as I say, “Hey, I don’t want you stepping on my foot or harming my children or stealing my food,” I can’t, by the same token, say, “Yeah, but it’s okay if I steal your food or if I step on your foot.” There’s got to be reciprocity, because there is nothing about the pronouns “me” versus “you” that carries any special weight. I can’t say, “Do what’s good for me because I’m me and you’re not.” You’ll just walk away. If I want to persuade you of anything, and granted, I don’t have to, but as soon as I start playing the game, certain things follow. Like, if we want to affect each other’s behavior (which we do, because we can hurt each other), then we’ve got to agree on a set of rules that works regardless of whose interests they serve. And hence you get Rabbi Hillel’s version of the Golden Rule, and the categorical imperative, and the veil of ignorance, and a lot of other statements of morality.
Granted, the rest is commentary. So you’re certainly right that there are edge cases where principles come into conflict, and that’s the stuff of moral argumentation, as generations of Yeshiva students and rabbis and their secular descendants know all too well.
I won’t talk, for now, about the moral emotions. You brought those up as well: why do we feel shame and guilt and righteous anger and so on? Maybe a topic for later. I’ll turn the conversation back to you.
Geoff Mitelman: Well, I want to come back to something that you mentioned, this idea of not coming from a privileged place of “just because I’m me and you’re you”; if you were to flip that around, it’s like a categorical imperative. And the rabbis actually talk about this. There’s a debate about “What’s the greatest commandment in the Torah?” And one says “Love your neighbor as yourself.” They don’t go with Hillel, they go with Leviticus: “Love your neighbor as yourself.” And the other says, “No, it’s actually that we’re all created in the image of God,” that “we are all the descendants of Adam,” is how it’s phrased. And depending on people’s theology and reading of the Torah, we could say that everyone has value as a human being, simply by virtue of being a human being. There are values and rights there, and that draws a distinction from the relative nature of “What I love may not be what you like.” If you love your neighbor as yourself, and I don’t really love myself, then I may not be treating my neighbor very well. Versus what Paul Bloom sometimes talks about in Against Empathy, right? You’ve got the decision between empathy and compassion, of “I need to be compassionate towards people because of who they are.” And I think that’s a large part of rationality, being able to say, “I’m going to come at it from a 30,000-foot perspective rather than from the ground right now.”
Steven Pinker: Yes, so again, a number of issues have been brought up. You know, one of the problems, of course, with the image-of-God argument, or with simply saying that we have rights because we are human, is that while it’s a very good starting point, the people who defend the interests of animals will come at you if that’s the only basis of your morality, because it would say, “Well, we can do whatever we want with animals. We can, you know, torture them and eat them for our entertainment or do with them as we will.” And of course, it has been a criticism of the Judeo-Christian tradition that it does not carve out enough of a space for the interests of animals. Perhaps the higher-order principle is that other animals are sentient; they can feel pleasure or pain, they can flourish, and their interests don’t count interchangeably with those of humans, but they don’t count for zero either. So the ultimate basis of morality is that, by the same token that I don’t think it’s okay to torture me, it can’t be okay to torture some other sentient creature. It is also true that empathy is a word with a number of meanings, and I distinguish them, and kind of just take Paul Bloom’s argument, in my book The Better Angels of Our Nature. The discussion goes back to Adam Smith. Empathy can mean putting yourself in someone’s shoes and seeing the world from their point of view, feeling their pain. It can also mean just caring about their well-being: even if you have no idea what it’s like to be them, you know at a kind of cognitive level that they’re capable of pain, and you know that that would be a bad thing. And indeed, you don’t always want empathy if your ultimate concern is the well-being of others. So, if a dog lunges at your five-year-old child and the child is, you know, howling in terror, what you should do is not howl in terror to feel what the child is feeling. You should protect and comfort the child. That’s a case that shows that empathy and compassion, as Bloom brilliantly explains, are not the same thing.
Geoff Mitelman: I want to shift to a slightly different question and look at another word that comes up, and the word is “trust.” In Hebrew the word is emunah, which is often translated as “faith,” but I think a better word is “trust.” And I think that trust and rationality go hand in hand in a lot of ways, because so much of what we understand today is wisdom that’s been handed down over thousands of years, and particularly over the last few hundred years, right? You know, the vaccines for COVID-19 have happened in the last year, year and a half. And this question has really arisen in our society: do we trust our leaders? Do we trust institutions? There’s a lack of trust now in religious institutions. There’s a lack of trust in scientific institutions. There’s a lack of trust in journalistic institutions. And sometimes trust allows us a kind of cognitive offload. When people say, “I don’t trust the vaccines, I’m going to do my own research,” whatever that means, that’s a lot more cognitive work than saying, “I’m going to trust the people who have used the tools that we’ve had for thousands of years to be able to advance human flourishing.”
So how do we regain a level of trust, and what’s the connection between a level of trust and a level of rationality, of being able to believe in the institutions that we want to believe in?
Steven Pinker: Indeed, and I guess it’s fitting that we’re talking about trust, emunah, at an event organized by Temple Emanu-El, which I assume means trust in God. It is a pointed issue, because we are living right now in an era in which lack of trust in science is killing people, the people who are irrationally resisting vaccines. But you know, on the other hand, trust has to be earned. And the thing about science, and here I’m going to try to represent my scientific colleagues, and of course, Geoff, Rabbi, you are involved in Sinai and Synapses.
Geoff Mitelman: Yeah.
Steven Pinker: Religion and science together. But the crucial thing is: why should we trust science? And the answer is not, “Well, they are another white-robed priesthood, or white-coated priesthood, and what they say goes.” Because that does violence to the concept of science, and to the reason that we should trust science, when we should trust it. And the reason is that that trust is earned. And it is earned not just by a track record, although that counts. I mean, it really was science that decimated the rate of infant mortality and death from infectious disease, and that got us to the Moon, and all the other accomplishments. But it’s not just that they’re kind of wizards who have a really powerful form of magic.
No, the real reason that we should trust the institutions we should trust is that they can show their work. When we peer behind the curtain, we don’t see more curtains, but rather methods that actually are designed to sift truth from falsehood, such as empirical testing in science, such as fact-checking and source verification in journalism, and so on. It is a mistake to just say, “Whatever the people in the white coats say, I’m gonna do it.” I mean, I think in general we should, not because science is just a new priesthood, but because if we ever press them on “What is the basis for your recommendations?”, they can tell us, and if we’re prepared to put in the work, we can retrace their steps and understand it.
I think scientists make a big mistake when they deliver authoritative pronouncements as if they are oracles, or as if they have been vouchsafed a revelation. It’s even worse when scientists, as they sometimes do, tell the critics to just, you know, shut up, to go away, to cancel them. That is tantamount to saying that our authority comes from raw power. The authority of scientists, to the extent they have it, comes from a track record and from transparent methods that anyone can check for themselves.
And we know that it’s trust in science that makes the difference in a lot of scientifically contentious beliefs. Because surveys show that a lot of what we call science denial, say, on climate change or evolution, does not come from scientific ignorance. And indeed, it’s a rather depressing finding that if you give tests of scientific literacy to believers in human-made climate change and to deniers, to creationists and to people who endorse the conclusion that humans evolved from primates, neither group, on average, knows more science than the other. And in fact, if you probe your typical person who has the correct belief in climate change about what actually causes it, you’ll get all kinds of comical beliefs like, “Oh, doesn’t it have something to do with the ozone hole, or, you know, toxic waste dumps? Or plastic straws in the ocean?”
Most people, even when they’re on the right side, are kind of clueless as to the actual science. They have a general sense of, you know, “green is good and pollution is bad,” which may have nothing to do with climate change. The thing that predicts climate change denial is your politics. The farther you are to the right, the more you deny climate change, which goes with skepticism about the scientific establishment, not helped by the fact that many scientists and scientific societies kind of brand themselves as wings of leftist politics. I think science does itself a disservice when it aligns itself with a political faction. It should cultivate, and should actually have, a deserved reputation for political neutrality, to the extent that it can manage that, and should always be able to provide reasons for its recommendations in the form of data and arguments.
Geoff Mitelman: And you know, the word that drives me nuts, from both a scientific perspective and a religious perspective, is “believe,” when it comes in on the scientific side of it. Like, “Do you believe in evolution?” “Do you believe in climate change?” And I think that’s a terrible question, because –
Steven Pinker: Well, I cite the great Jewish sage Fran Lebowitz, who said, “I don’t believe in anything you have to believe in.”
Geoff Mitelman: Right. But what’s interesting is that on the Jewish side, I also don’t like the question “Do you believe in God?” Because it leads to the same kind of question as “Do you believe in evolution or climate change?”, a yes-or-no question, or “Do you believe in God?” as a yes or a no. But we should be able to open up these kinds of questions to probe, and to use, I think, a level of rationality, and also a level of what I might call charity, of being able to say, “I am going to assume that you have a reason for the way that you’re acting. I am going to come at this thinking that you are acting rationally within your own worldview. And let me at least explore: what’s driving this belief? What are the implications of this belief? How is it going to manifest itself in day-to-day life and on a societal level as well?”
Steven Pinker: Indeed, and since we’ve been on belief, and in the great rabbinical tradition of always bringing in an anecdote or a story that touches on the topic of conversation, I’ll have to quote my late colleague, the great sociologist Daniel Bell, who said that when he was 13, preparing for his Bar Mitzvah, he went to his rabbi and said, “Rabbi, I’ve given it deep thought. I’ve pondered it. I’ve searched my soul, and I just completely… I don’t believe that God exists.” And the Rabbi said, “You think God cares?”
Geoff Mitelman: Right.
Steven Pinker: But you raise another point when it comes to, kind of, you know, respect and a presumption of rationality in the people who disagree with us. It is essential that there be two rules of this game of discourse that we engage in when we do science or legal argumentation or political argumentation, matters of factual record. One is that there is a truth. That is, it’s not that everyone has a right to their own opinion, and all opinions count equally, and it’s just a matter of power whose opinion prevails. I think rational discussions have to proceed from the assumption that the truth exists.
But equally crucially, they have to proceed from the assumption that no one knows it, at least not for sure. People have different degrees of warrant in their beliefs, depending on how much evidence they adduce, depending on the cogency of their arguments. But no one can stake a claim to having the truth. They can do their best to persuade other people of the truth. This is the reason that freedom of speech, freedom of the press, and academic inquiry are so vital: no one has enough of a claim to truth to impose it on others and shut down debate.
At the same time, we can’t accept any old cockamamie belief. We have to demand the evidence, always being tentative and provisional, always searching for more evidence, but always trying to refine our beliefs so they’re as close to the truth as we humans can make them, given the evidence we have.
Geoff Mitelman: The line that I love is “Science is helping us become progressively less wrong.”
So, there were some wonderful questions that came up. I want to shift to some of the Q&A here. And this actually came from several people: “We all watched these hundreds of thousands of people die from the current pandemic, and yet people are avoiding getting vaccinated due to their own rationalization.” And I like that the word is “rationalization,” because the line between “rational” and “rationalization” can be a fine one sometimes. “So, how can the country be so split about helping ourselves and others?”
Steven Pinker: Yeah, even as someone who has written a lot about rationality and especially irrationality, I’ve got to say that the degree of vaccine resistance is surprising. Not that it exists: people have resisted vaccines for as long as there have been vaccines. Jenner faced massive ridicule and denunciation in his time. The original inoculation with cowpox pustules led to editorial cartoons where, you know, cows were growing out of people’s arms and shoulders and thighs. It was considered preposterous that you would actually take little bits of germ, little bits of the thing that actually gives you the disease, and inject it into people’s arms. There’s something deeply unintuitive about vaccines, so it isn’t surprising that there should be some squeamishness.
What is surprising is that 250 years later, it not only persists but is so consequential. It’s, I think, a multiplication of our intuitive, primitive squeamishness about the concept of injecting a contaminant, a pollutant, an adulterant into the tissues of our own body. So the intuitive odds are stacked against vaccination in the first place. Most of us unlearn that primitive intuition, because we do trust the people in the white coats: they’ve extinguished smallpox, they’ve drastically reduced the terror of polio from even when I was an infant. So we see their track record. We trust what they have to say. The resistance shows that an awful lot of people don’t trust what scientists have to say.
Plus, in a strange turn of events, the Republican Party, the American Right, adopted vaccine resistance as one of their sacred beliefs, firing up perhaps the most powerful of all of the biases documented by cognitive psychologists, that being the my-side bias. Namely, you believe what is accepted, prestigious, sacred in your own coalition — what differentiates you from the other side’s coalition. And we’re seeing that explode in the resistance to vaccines among the American Right. They’re not the only ones who resist it, and there is a kind of Left-wing green anti-vax campaign, pushed, among others, by Robert Kennedy Junior, of all people. But the strongest resistance comes from the populist Right right now.
Geoff Mitelman: And you know, when you were talking about the question of sacred values, that was a question I really wanted to explore: how do we talk about trade-offs versus sacred values? But I want to get to a couple of these other questions that are here. Another one that’s very linked to our politics right now: “The military claim they used a drone to kill ISIS leaders in Afghanistan, claiming they acted rationally in basing this action on facts, yet they were totally wrong. How do you deal with such failures in claiming your opponents are not rational?”
Steven Pinker: Yeah, a claim of rationality itself may or may not be rational, and ultimately it’s rationality that is the way we adjudicate or settle it. I hear it as a criticism of rationality that so-and-so said they were rational and did monstrous things, or were flat wrong. And indeed, that can happen; the fact that someone claims to be rational doesn’t mean that they are rational. And it’s not an indictment of rationality if someone dons the mantle of rationality but doesn’t have the goods. And the reason it’s not an indictment of rationality is that ultimately it’s only rationality that allows us to say they were mistaken when they claimed to be rational.
What allows us to say it? Well, we’re applying rationality to their beliefs and showing that they fall short. The power of rationality is that it can always step back and look at instances of itself. We can get better and better at applying rationality. We can come to discover that what we thought was rational wasn’t so rational. But it’s nothing but rationality that allows us to do that. Rationality is recursive. It can take itself as the topic of analysis.
Geoff Mitelman: Yeah, I mean, that’s what’s so challenging but also so powerful, being able to use that tool to be able to look at itself and use that as the yardstick.
There’s another question that came up, which is asking how your work connects to the work of behavioral economists. And I know that Kahneman and Tversky have done a lot of this work. “How does your work relate to the work of behavioral economists, which could be read to indicate that striving to be perfectly rational may be inferior to acting reasonably?”
Steven Pinker: Yes, well, as a cognitive psychologist, I’d rather claim that work for my field, because a lot of behavioral economics is done by, and extends the work of, cognitive psychology, including Daniel Kahneman, one of the founders of behavioral economics himself, who was trained as a psychologist. My book very much discusses the ideas that are called behavioral economics: how we estimate probability, decision-making under uncertainty and risk, whether the axioms of rational choice actually describe human behavior, whether and how people behave in the situations described by game theory. So really, behavioral economics and cognitive psychology are quite enmeshed. And of course, behavioral economics is not itself a theory. So, it’s not that I, you know, endorse or criticize behavioral economics, because it’s just a topic, a subject matter. It overlaps with what psychologists call judgment and decision making, but “behavioral economics” is the label that has often brought it to public attention. Also, there’s a Nobel Prize in Economics; there’s no Nobel Prize in Psychology. So, for a lot of psychologists like Kahneman and others, their best hope for a Nobel is calling themselves behavioral economists.
Geoff Mitelman: And being able to get that wonderful prize.
So, there’s another question that I think is interesting, leading back to what we talked about at the beginning, with the hunter-gatherers. “In what way is looking at hunter-gatherers, and the world as a whole, somewhat of a Rorschach test for us? There’s enough ambiguity in all these different things that we can unknowingly project aspects of our own psychology onto them, like ‘nature, red in tooth and claw.’”
Steven Pinker: Well, indeed there is that danger. And so it’s critical to have the broadest possible view of the lifestyle of hunter-gatherers, so we aren’t cherry-picking the aspect that ratifies whatever we believed in the first place. It’s critical also to look, to the extent that we can get it, at data. That is, you know, was Rousseau right in talking about the Noble Savage? Was Hobbes right in talking about a war of all against all, and life in a state of nature being nasty, brutish, and short?
Well, it’s not that either one is simply right or wrong. You really have to count, to, say, compare rates of death from violence in their societies to ours, and to look at the whole diversity of hunter-gatherer and, more generally, non-state societies, including horticulturalists and pastoralists, so that we can be a little more precise and therefore escape the trap of using them to ratify whatever we want to be true of our nature.
We also have to ask whether the most conspicuous hunter-gatherers, like the San, like the Hadza of Tanzania, really do represent the environment of evolutionary adaptedness, that is, the kind of lifestyle in which our genes were largely selected, because we are getting a biased sample when we look at surviving hunter-gatherers. Namely, they’re the ones that have been pushed into land that no one else wants, basically land you can’t farm. And so the egalitarianism, the cooperation, the pacifism may themselves be something of an artifact of who gets to survive as a hunter-gatherer, as opposed to being absorbed or conquered or evolving into an agricultural society.
Geoff Mitelman: Yeah. And thinking also of the shift from hunter-gatherer life to settled societies and cities and agriculture, and how much the world changed from 200,000 years ago. It always kind of blows my mind that Homo sapiens arose in the African savanna 200,000, 250,000 years ago, but the end of the Ice Age and the rise of cities and agriculture are only 10,000 or 12,000 years old. So much of human history was before written language, before agriculture, before these large-scale societies. We were able to connect on a smaller level, and you could know, “Who can I trust? Who can I not trust? Who am I interacting with?” And that’s so much of our history as Homo sapiens. It always kind of blows my mind when I think about that.
Steven Pinker: It is, although we have to add, and I think we’re in the midst of a change in our understanding of the environment of evolutionary adaptedness here. The traditional story is almost like a fall from Eden. In fact, there’s a theory that the story of Eden is a historical memory of the transition from small-scale, mobile, egalitarian hunter-gatherer bands to more settled and stratified societies with the rise of agriculture. It turns out that dichotomy is probably too simplistic. Because of the survivorship bias, the sampling bias, of the hunter-gatherers alive today, we have to look at the archaeological and ethnographic record, which shows that there were actually some pretty large societies, going back tens of thousands of years, that were settled, that had quite rich and varied diets, that engaged in cultivation and horticulture, that were stratified. Some of them kept slaves. Anthropology students have long been told about the Kwakiutl and the natives of the Pacific Northwest, which contradict the picture of ancestral humans as small, mobile, egalitarian hunter-gatherer bands, and probably a lot more of our ancestors were like the Kwakiutl than like the San.
This is going to be the topic of a forthcoming book by a former student of mine, the anthropologist Manvir Singh. But anyway, it’s a transition in understanding that we may be living through right now.
Geoff Mitelman: That’s fascinating. I was not aware of that. That’s interesting, and you know, when you talk about the fall from Eden, I mean, I’m thinking of the story of Cain and Abel, in which Cain is a farmer and Abel is a shepherd and Cain kills Abel. And it’s maybe a representation of the agriculture that ultimately drove out the shepherds and the hunter-gatherers, that one type of lifestyle tended to die out more than the other.
Steven Pinker: And Esau and Jacob is probably even more pointed, because with Cain and Abel it was kind of pastoralist versus farmer. But you know, with Jacob and his brother, we’re talking about a hunter.
Geoff Mitelman: Right. Well then, this leads to what I think will be the last question here, which is a very interesting and challenging one: “How does one rationally demonstrate that life is an entitlement of transcendent value?”
Steven Pinker: Yeah, I’m tempted to step back and say “divine transcendent value.” But there are certain things that are just inherent to even being able to ask the question. You’ve got to be alive and not dead. You’ve got to have health and not be sick. You’ve got to have knowledge and not be ignorant. You’ve got to have rational faculties. These are things that make it possible even for us to ask the question, to think the thought, to conceive of the discussion. And that makes them transcendent; that means that they can rise above our particular embodiment. And it makes you think that there are values, like life, and knowledge, and flourishing, that are beyond what happens to be good, you know, just for me or just for you. And once you say that knowledge and understanding and reason are inherent, essential, inescapable, that of course opens up a universe of ideas to explore.
Geoff Mitelman: And rationality is this wonderful tool – it’s almost like a Swiss Army knife – that can be used in so many different ways to enhance material well-being, an increase of rights, an increase of flourishing, an increase even of our lifespan. And I think that’s such a critical thing that we really need to reclaim.
So thank you so much for both taking the time here this evening, and for writing such an excellent and important book for everybody to be able to read.
Steven Pinker: Thank you, Geoff, Rabbi Mitelman, for a delightful conversation, and thanks to Temple Emanu-El. Thank you all.