Now, maybe more than ever before, it is time to learn the art of skepticism. Amidst compounding, complex crises, humankind must also navigate a swelling tidal wave of outright lies, clever misdirections, and well-meant but dangerously mistaken claims… in other words, bullshit. Why is the 21st century such a hotbed of fake news? How can we structure our networks and their incentives to mitigate disinformation and encourage speaking truth to power? And whose responsibility is it to inform the public and other experts about scientific research, when those insights require training to understand?
Welcome to COMPLEXITY, the official podcast of the Santa Fe Institute. I’m your host, Michael Garfield, and in each episode we’ll bring you with us for far-ranging conversations with our worldwide network of rigorous researchers developing new frameworks to explain the deepest mysteries of the universe.
This week, we talk to former SFI External Professor Carl Bergstrom and Jevin West, both at the University of Washington, who recently translated their landmark undergraduate course, Calling Bullshit, into an eminently readable and illuminating book from Penguin Random House. In this episode, we discuss their backgrounds and ongoing work in the evolutionary dynamics and information theory of communication, how to stage a strong defense against disinformation, and the role of scientists and laypeople alike in helping restore the reasoned discourse we all so desperately need.
If you value our research and communication efforts, please consider making a recurring monthly donation at santafe.edu/give, or joining our Applied Complexity Network at santafe.edu/action. Also, please consider rating and reviewing us at Apple Podcasts. Thank you for listening!
Related Links & Resources:
Carl Bergstrom’s Website & Twitter.
Jevin West’s Website & Twitter.
Cost and conflict in animal signals and human language
by Michael Lachmann, Szabolcs Számadó, and Carl T. Bergstrom in PNAS
The physical limits of communication or Why any sufficiently advanced technology is indistinguishable from noise
by Michael Lachmann, M. E. J. Newman, and Cris Moore in the American Journal of Physics
Deepfakes and the Epistemic Backstop
by Regina Rini in Philosophers' Imprint
Hunger Game: Is Honesty Between Animals Always the Best Policy?
by Natalie Wolchover in Scientific American
Visit our website for more information or to support our science and communication efforts.
Join our Facebook discussion group to meet like minds and talk about each episode.
Podcast Theme Music by Mitch Mignano.
Follow us on social media:
Twitter • YouTube • Facebook • Instagram • LinkedIn
Transcript produced by Podscribe and edited by Shirley Bekins.
MICHAEL GARFIELD: Well, this is going to be fun. I've got to say, this is the first time that I have ever actually prepared a full hour's worth of questions in advance for a conversation, rather than just sketching something out and then sort of allowing it to go.
CARL BERGSTROM: Right, right.
MICHAEL GARFIELD: But I really feel like there's so much here. And I hope that what I've gathered is going to give you a unique conversation, substantially fresh and different from what I'm sure will be a punishing round of interviews for this book.
CARL BERGSTROM: You know, this is the thing, we are doing a lot of these and this is the one that I've been looking forward to. I just told my son, “Oh, I've got to go do a podcast for Santa Fe Institute.” He's like … Santa Fe Institute, tell them I miss them!
MICHAEL GARFIELD: Oh yeah, we miss ourselves right now. You know, it's nuts, I think we're starting a staged campus reopening here next week. And it's going to feel like the end of Big Fish, just down at the river and all your family's there. And it's like, did I die?
CARL BERGSTROM: Right, so New Mexico had been doing really well. Are they still doing really well in terms of COVID?
MICHAEL GARFIELD: We are overall. I mean, obviously the reservations and certain other areas are being hit really hard, but this area, particularly Santa Fe County, is doing pretty well, and our governor has been very proactive and responsible through the whole process. Well, do you have any questions before we dive into this, or…?
JEVIN WEST: I have to leave right at 1:00 because I'm giving a talk for a thing. I think there's going to be 400 people waiting to hear and I'm supposed to be there at maybe even 12:55 PM. So, I guess it would be 1:55 your time, Michael.
MICHAEL GARFIELD: Right, right, right. If you need to bounce a little early, that's fine. I'm sure we can…
JEVIN WEST: Carl can hold the fort.
CARL BERGSTROM: Well, I'll tell you what, he runs. You asked me what it's like having to work with Jevin.
MICHAEL GARFIELD: Perfect. All right. Well, let's just dive in. Carl Bergstrom, Jevin West, it's absolutely wonderful to have you on Complexity Podcast. I know that our time together today is limited because both of you are fighting the good fight nonstop these days. So, let's just dive in. I know neither of you started out in bullshit science, so when and why did this become such a big deal for you? Why did you decide to let it command so much of your time and attention? Please talk a little bit about the work you were doing before, the work that you continue to do in related disciplines, and how they're connected to your work on the dynamics of misinformation and how to navigate our very, very confusing society with due amounts of skepticism.
CARL BERGSTROM: I had kind of a journey that I think really set me up precisely for this, which is that I wrote a PhD on animal communication, on deception in animal communication, and on the strategic issues and the game theory of what keeps communication honest. And we know animal communication is honest to some degree, because if it wasn't honest, then no one would bother to listen; and if no one bothered to listen, then no one would bother to send signals. But instead you walk outside, and a huge fraction of the sensory experience you're having out there, whether it's the colors of the autumn leaves or the birdsong or the tail of a deer as it goes running by… all of this is signals that evolved for the sake of conveying information.
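A minimal sketch of the logic Carl describes here, the handicap-style idea that differential signal costs can keep communication honest. The payoff numbers are illustrative assumptions, not taken from his papers:

```python
# Toy costly-signaling check: honesty is an equilibrium when the signal
# is cheap for genuinely high-quality senders and expensive for low-quality ones.
BENEFIT = 1.0            # reward for being believed to be high quality (assumed)
COST = {"high": 0.4,     # signaling is cheap if you really are high quality
        "low": 1.5}      # ...and costly to fake if you are not

def payoff(sender_type: str, signals: bool) -> float:
    """Net payoff: the benefit of being believed, minus the cost of signaling."""
    return (BENEFIT - COST[sender_type]) if signals else 0.0

# Honest signaling is stable when high types gain by signaling and low types lose.
stable = payoff("high", True) > payoff("high", False) and \
         payoff("low", True) < payoff("low", False)
print(f"high-quality sender signals: {payoff('high', True):+.2f}")
print(f"low-quality sender signals:  {payoff('low', True):+.2f}")
print("honest signaling stable:", stable)  # True under these assumed costs
```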
So, somehow nature has found ways to keep information honest; animals have evolved honest mechanisms of signaling. That was what I worked on as a PhD student, but I went from there and started working on epidemiology, studied epidemiology as a postdoc, and then, as a faculty member, spent a decade working on emerging infectious diseases and trying to prepare for situations exactly like the one we're facing right now. And that got me very, very interested in thinking about how things spread on networks, because diseases spread on networks.
And that turns out to be really central to understanding disease transmission. Then we saw social media come along, and all of a sudden it was the same networks again, and things were spreading on networks, but what was spreading was bullshit. That really struck me as something that was going to be a huge problem, something we would have to face. It brought together deception and network spread with the sort of zeitgeist of what was happening in our society around that time. And that was my path. Jevin, you want to…?
JEVIN WEST: Yeah, for me, it's both personal and professional. So, I grew up in a community that wasn't very good at calling bullshit, partly because it really was taboo to question some of the organizations and cultural leaders within my community. But also, I don't think that we're very well trained more generally. I don't think it's unique to where I grew up; I think it's common in some cultures. In Korea, for example, you don't question your elders. That's certainly something that doesn't work well if you're a copilot in a plane and you need to say, "Actually, I think you're wrong about what you're reading and where we're landing." But it's also professional for me, because growing up in that kind of environment really drew me to science, because science provides a mechanism, a structure, for going about calling bullshit. And so through the work that I did with Carl and others, and just thinking a lot about the role of technology in science and society, it's very much what my department does and where I live.
It just seemed like technology was making it worse too, and not just culturally; the technology itself was making it worse. And then there was the rise of Big Data, and all this excitement around it. I was one of them, although Carl was highly skeptical, as eventually was I, even while I was teaching it and selling it to students to go into, because there were good-paying jobs. But there was this confluence of technology trends, this rise in looking to technology for answers to everything: we don't need anything else, we just need lots of data and lots of algorithms and lots of computers. I think Carl and I were kind of getting fed up, and we did have a lot of practice as professionals; reviewing papers is a great way of working on these skills. And Carl and I thought, this is the most important thing we could be teaching the students. Then, after we created the class, there was this thing going on in the world too, this rise of misinformation after 2016. And now we're finding, holy cow, it's linked to absolutely everything. We're seeing it with the pandemic. We're seeing it with the election. And so, both Carl and I are sort of just devoting our careers and our personal time to this.
MICHAEL GARFIELD: Okay. So both of you headed directly into two of the big questions I had for you, so let's try to tackle them one at a time. One of them, I'm just now realizing, Carl, this is insane: I ended up at SFI instead of doing paleontological research because I was, in effect, kicked out of academia. Some of the questions I developed in an animal communication seminar in my senior year of undergraduate study led me into profound investigations for which I could find no place in graduate school in 2005. One of those threads was David Krakauer's work with Martin Nowak on the evolution of syntax. And the other was your work with Michael Lachmann on costly signaling and evolutionarily sustainable strategies of lying and truth-telling.
CARL BERGSTROM: Yeah, absolutely. Wow, that's amazing so I ruined your career.
MICHAEL GARFIELD: Yeah, you did. Thanks!
CARL BERGSTROM: I’m sorry.
MICHAEL GARFIELD: Indirectly, you know, eventually I drunken-walked my way back here. But so here's this question: lying is, in some sense, an evolutionarily sustainable strategy. We have the boy-who-cried-wolf dynamics; false signals flip over into their opposites; sarcasm becomes meta-ironic. We allow lying leaders to keep at it because we "know better." There's this issue of fake news traveling faster than debunking, because debunking takes time and bullshit doesn't. So is there any way to actually rid the world of fake news, or is that beside the point? Is our job to keep nibbling away at the edges of an ever-growing amoeba forever? How do we at least get our priorities straight around this? And where are the thresholds at which we need to change our tactics, from how we respond as the recipients of bullshit to how we punish cheaters at the source?
CARL BERGSTROM: You want to take that one, Jevin? You’re the optimist.
JEVIN WEST: Well, I'll tell you, I am becoming less of an optimist nowadays. Because look at the techniques of those professionals, and I'm going to call them professionals, who spend their days trying to sow confusion, to foment an emotional reaction to everything that you read online. I wouldn't say people are giving up, but look at interviews with the people posting these things online. I'll give you an example.
There was a chiropractor here in Washington, in Woodinville, who pushed out a Facebook post around the decline, or the lower numbers, that the CDC was reporting. And they weren't the only ones talking about this; there were other posts that went viral. This particular post claimed that there were really only about 9,000 deaths, that the CDC was over-counting because of comorbidities. They didn't really understand what comorbidity means: if you die of a heart attack because of COVID, it's still COVID that caused it.
This reached all the way up to the leader of the free world, via a tweet. And when this person was interviewed, they basically said, well, it doesn't matter whether it's true; I have my freedom and my constitutional right to be saying these things. There was just no allegiance at all to truth. So, it's those kinds of things that make me more of a pessimist. But I'll end quickly, before I turn it over to Carl, with why I'm still more of an optimist overall. That's partly because we've been through this before. Carl and I like to look back at old quotes from times when there were big jumps in information technology: when the printing press, radio, and various other forms of information technology came along, there were worries at those times, just like there are worries now.
And actually, it's only been a short amount of time since we've had this powerful tool of social media 2.0 to play around with. So I think we can sort these things out, and over just the last four years there have been tremendous efforts to start to push back. You're even seeing the platforms pushing back a little bit. They're tagging tweets of the president, which is surprising. They now have policies that are much more specific about what they allow and don't allow, and having those allows them to tag and take things down. I think we're moving in that direction.
Of course, it worries me and that's why I work on it, but I still stick to my optimism. I have more optimism than I do pessimism.
CARL BERGSTROM: Yeah, I'm going to counter that. So, I've been talking with Joe Bak-Coleman, who's a postdoc here at the Center for an Informed Public and works with Jevin and me. And Joe and I, not completely tongue in cheek, have been thinking about the Fermi paradox. The Fermi paradox is, you know: if there's intelligent life everywhere in the universe, why have we not heard from it? What happened to it? Where did it go? And one solution that people have proposed to the Fermi paradox is to say, well, intelligent life eventually develops atomic power and atomic weapons and destroys itself. And Joe and I think, no, they developed social media.
MICHAEL GARFIELD: You know, that's not entirely off the mark from another… Michael Lachmann, Cris Moore, and Mark Newman posted that piece on how optimally encoded signals are indistinguishable from blackbody radiation, which is something I know…
CARL BERGSTROM: Okay, that's one of my favorite papers of all time. And as you probably know, it even made its way into a science fiction novel.
MICHAEL GARFIELD: Yeah, and then Edward Snowden was talking about it when he was on a podcast with Neil deGrasse Tyson. This idea that maybe one solution to the problem you're posing, the first Fermi issue, is an incoherent society; but then a second Fermi issue is that we get so good at encryption that our communication is just indistinguishable from nature. But that's kind of a tangent, maybe…
CARL BERGSTROM: That's a lovely tangent. No, I mean, I think the problem you pointed out is an enormous problem. I think that social media is extremely dangerous. I'm not saying ban it or something like that, because that's not how we operate as a society. I'm just saying it's an existential threat to democracy, and we need to figure out what we're going to do about it.
Because social media is basically designed to amplify content that is engaging, not content that's accurate. Everything about the user experience that's built in; the rules of how things spread; and, as Jevin always points out, the way the algorithms are designed to find content that is likely to keep you engaged rather than content that's going to keep you well-informed…
All of these things contribute to a situation where you can have the rise of movements like QAnon at unprecedented levels. And this is an enormous problem, in my view. It also leaves a fantastic opportunity for adversarial attacks, right? It leaves you open to injection of disinformation from foreign adversaries and the like, and we're extremely vulnerable to that. We don't understand how humans make collective decisions once you give them social media. We have a lot of thinking, in theory, about how humans make collective decisions in a democratic society, about voting paradoxes and all of these things. But we don't really have any adequate theory of what happens once you take all the computers in the world, link them together, and people start sending dumb s**t to each other and resending it and resending it. All of a sudden, who's controlling what I hear about the world? It's no longer editors and producers who have the background to do this; it's my conspiracy-theory uncle. And that's the fundamental challenge that we're facing.
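To make Carl's point about engagement-ranked feeds concrete, here is a toy simulation, entirely my own sketch, assuming an inverse relationship between a post's accuracy and its expected engagement. The numbers are arbitrary; the point is only that ranking by engagement systematically surfaces the least accurate content:

```python
import random

random.seed(1)

# Hypothetical posts: accuracy is uniform; engagement is higher, on average,
# for less accurate posts (the assumed "outrage sells" effect).
posts = []
for i in range(1000):
    accuracy = random.random()
    engagement = random.random() * (1.2 - 0.4 * accuracy)
    posts.append({"id": i, "accuracy": accuracy, "engagement": engagement})

def top_feed_accuracy(feed, k=50):
    """Mean accuracy of the k most engaging posts, i.e., the top of the feed."""
    top = sorted(feed, key=lambda p: p["engagement"], reverse=True)[:k]
    return sum(p["accuracy"] for p in top) / k

overall = sum(p["accuracy"] for p in posts) / len(posts)
print(f"mean accuracy, all posts:         {overall:.2f}")
print(f"mean accuracy, engagement top 50: {top_feed_accuracy(posts):.2f}")  # lower
```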
MICHAEL GARFIELD: That points to the follow-up question I had about this, which is that maybe there is a kind of equilibrium point at which the unifying rhetoric of the web of the nineties is challenged by this kind of thing. And we end up seeing what we're seeing now, which is that the web is splintering, and people are circling up their wagons a little bit. So there's this question about how the incentive landscape shifts in order to accommodate these challenges.
So, I know that one of the animating questions you mention in your biography on the website, in your study of evolutionary dynamics, is: why don't cheaters exploit and undermine communication by sending deceptive signals? What are the cases in which they're not doing this? If we look at the natural world, how can we look to prior evolutionary solutions for examples of how we can structure our networks, and the incentives in those networks, in such a way as to mitigate bullshit rather than just deal with it?
CARL BERGSTROM: That's a fantastic question. I don't know if it's a bio-inspired solution or not, but I think one of the places where we can really get a lot of leverage is by thinking carefully about reputation and the role of reputation. Reputation has actually been increasing in importance over the last 10 years or so, rather than decreasing, which is perhaps a bit of a surprise given some of the other features of an anonymous net and crowdsourcing and all those other things. But during the COVID pandemic in particular, as we try to figure out where to get information, reputation plays this extremely important role. I have a friend, John Clippinger, who in the early…
JEVIN WEST: I was just going to mention Clippinger.
CARL BERGSTROM: Yeah, in the early 2000s John Clippinger was at the Harvard Berkman Center, and he was obsessed with the notion of the importance of reputation in online communities, and with how this was going to be absolutely essential to structuring them. He was also very interested in preserving privacy and anonymity. So, he wanted ways for reputation to flow obliquely between your different online identities, so that you could be a member of different communities, for example, and have these non-overlapping, non-traceable circles, but have your identity nevertheless cryptographically verified, so that you had reputational incentives to be a good actor from the moment you logged onto a site under an anonymous name. He thought that could make a real difference in the world.
And, you know, John was just always talking about this, and I kind of thought he had a one-track mind about it. I didn't quite understand why he was so obsessed with it. And every single damn thing the man said was right, which is really frustrating to admit.
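A rough sketch of Clippinger's idea as described in this conversation: one master secret yields unlinkable per-community pseudonyms, while a reputation service attests to a score that travels with each pseudonym. This is purely illustrative; a real design would use public-key signatures or zero-knowledge proofs rather than the symmetric HMAC below, which only the service itself could verify:

```python
import hashlib
import hmac
import secrets

master_secret = secrets.token_bytes(32)   # held only by the user

def pseudonym(community: str) -> str:
    """Derive a per-community identity; communities cannot link them to each other."""
    return hmac.new(master_secret, community.encode(), hashlib.sha256).hexdigest()[:16]

SERVICE_KEY = secrets.token_bytes(32)     # held by the reputation service

def attest(pseudonym_id: str, score: int) -> str:
    """The service cryptographically binds a reputation score to a pseudonym."""
    return hmac.new(SERVICE_KEY, f"{pseudonym_id}:{score}".encode(),
                    hashlib.sha256).hexdigest()

def verify(pseudonym_id: str, score: int, tag: str) -> bool:
    return hmac.compare_digest(attest(pseudonym_id, score), tag)

alice_on_forum = pseudonym("some-forum")
tag = attest(alice_on_forum, 87)          # reputation follows the pseudonym
print(verify(alice_on_forum, 87, tag))    # True
print(verify(alice_on_forum, 99, tag))    # False: you can't inflate your own score
```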
JEVIN WEST: Well, here are a couple of things we're seeing right now where that's kind of playing out. First, we're seeing a little of it in the academic world. You see a bit of a drop in the role that journals play in pushing out papers: we recently did a study looking at places like arXiv, where you'll see top-rated papers by top authors sometimes just not going into journals anymore, because the authors' reputation is enough to carry them. There are examples on the internet like the Stack Exchange model, the Stack Overflow model, where individuals get reputational credit for answering questions thoroughly. Those individuals are getting good jobs because of it, too; a lot of the technology companies will recruit individuals who do well and have high reputation on Stack Overflow. So, I do think this idea is important. But on the thing you mentioned about circling the wagons, there are some concerning developments too. There's the rise of new platforms like Parler, which is essentially, I don't want to say fully, because that's not the case yet, but basically Twitter for one political party. That is an example of splintering not only within platforms but among platforms as well.
MICHAEL GARFIELD: Oh, wow. Okay, you're both just doing this for me, this whole thing. You touched on two points, and I want to anchor deeper into this question, because we're in a hall of mirrors here. If you want to look one layer of reflection deeper: one issue is network-based reputation and new systems for that at scale, and the other is politically motivated bias and politically motivated reasoning. You mention very early in the book that sometimes bullshit, both what you call the old-school kind and the new-school kind, isn't precisely false so much as it is nonsense disguised as sense, as many listeners might remember from the hilarious New-Age Bullshit Generator website.
So, this isn't a direct lie so much as it is the presentation of a facade of believability in order to acquire some kind of social reward.
CARL BERGSTROM: It's a very good definition of bullshit, by the way.
MICHAEL GARFIELD: But this is what fiction is, explicitly. I'm reminded of the expert level of verisimilitude in the work of authors like Michael Crichton. And it seems like, on some level, we want to be sold on bullshit. Why is this? What purpose does a nutritional requirement of bullshit serve for us, evolutionarily speaking? I'm thinking of the placebo effect, but then also of the work of Mirta Galesic and others at SFI who have looked at politically motivated reasoning as a way of saving us time and energy, a kind of compression algorithm that lets us collaborate efficiently with others in our social units without having to verify absolutely everything we hear secondhand. And then you have people like the philosopher Regina Rini at York University, who wrote a paper on deepfakes and the epistemic backstop, arguing that the ability to convincingly counterfeit video and audio recordings brings us back to a pre-modern state of dependency on hearsay we acquire from individuals we consider trustworthy. That's a situation not unlike the one in the documentary Mirage Men on Amazon Prime.
It's absolutely terrifying. It's about military disinformation campaigns to convince Americans that they are privy to secrets about government contact with UFOs, in order to cover up much more mundane secret projects like illegal nuclear testing. One consequence is that everyone in the military hierarchy, all of them in a rigid system on a need-to-know basis, ends up believing counterfactuals about UFOs in both positive and negative directions, that they do or don't exist, because they're implicitly trusting their commanding officers, who are themselves often completely mistaken.
So, what the hell do we do here? It seems as if the solutions we're coming up with are themselves vulnerable to the same exact exploits that were the problem to begin with.
CARL BERGSTROM: I think this gets at an issue. You asked why we are evolutionarily vulnerable to a lot of this kind of thinking, and one thing is that we're very, very good pattern detectors. That's how we as generalists make a living in the world. We observe patterns in nature, and we use those patterns, whether to find food or to manage our social interactions or anything else. So, we are constantly observing patterns, generating hypotheses, verifying those hypotheses. That's what it's like to think as a human. And the problem is, of course, that rules of thumb that tend to work very, very well sometimes don't. One of the ones we write about a lot in the book is this: I observe A, and then later I observe B, and so I infer that A causes B. This can easily get you wrong, because if you observe that the geese migrate and then a few weeks later the salmon come up the rivers, you might say, "Oh, well, the geese are calling the salmon up the rivers," and so on. I mean, there are actually interesting nutrient-cycling stories there, but we'll leave that aside. So the problem is that we have these sets of biases that can cause us to self-deceive.
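Carl's geese-and-salmon example is easy to reproduce numerically. In this sketch, mine, with made-up numbers, a hidden common cause (the timing of spring) drives both events, producing a strong correlation with no causal link between them:

```python
import random

random.seed(42)

records = []
for _ in range(200):                        # 200 simulated years
    spring_onset = random.gauss(0, 1)       # the hidden common cause
    geese = spring_onset + random.gauss(0, 0.3)    # geese respond to the season
    salmon = spring_onset + random.gauss(0, 0.3)   # so do the salmon
    records.append((geese, salmon))

def correlation(pairs):
    xs, ys = zip(*pairs)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A strong correlation, yet the geese cause nothing about the salmon.
print(f"corr(geese, salmon) = {correlation(records):.2f}")
```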
The other thing is that when you take organisms with those sets of biases and drop them into a very different kind of communicative environment, one that involves the potential for amplified spread and the sort of exponential explosion of ideas going viral on the internet, those biases may be dangerous ones to have. So the problem is exacerbated there. And then you get these ideas around tribal signaling and tribal identity. These are Judith Donath's ideas at the MIT Media Lab. Judith writes about what's really going on when you get on the internet and spread information. Maybe I really don't like Trump, and I see some story that says he sold the Washington Monument to a Russian oligarch. So I retweet that thing, and I don't give a damn whether you believe it or not. I'm not trying to inform you. What I'm trying to do is signal: hey, look, I'm one of you; I'm one of the people with this set of political beliefs. And that's bullshit in the sense that we talk about, because I'm telling you about myself while pretending to tell you something about the world, and I don't care whether the thing I'm telling you about the world is true or not. That's irrelevant to the message I'm actually trying to send. But the problem is that the message takes off on the social media system, has this life of its own, and then becomes these alternative facts in these alternative worlds we were talking about earlier, where you get this fracturing of systems and so forth.
So, I think that's another piece of what happens there. We're also very, very good with these reasoning skills; we're very, very good at talking ourselves into believing things that we want to believe. We are able to use confirmation bias and some of these other tools. If it's COVID, we can cherry-pick the studies that support the stories we held in the first place and convince ourselves that we've been right all along. So, that's another piece of the puzzle, I think.
JEVIN WEST: So, I'll just say a couple of things about the efficiency side. There are certainly times when bullshit has utility in human communication, and maybe in broader animal communication. If my wife shows up with a new sweater and it looks terrible but I tell her that it looks good, I don't have to waste time dealing with the repercussions of telling her the actual truth. And she's fine. Maybe she knows it's a terrible sweater, and she can move on, and we don't have to waste effort dealing with all of that.
But I also think that if there were punishments for the egregious bullshitters out there, that would be helpful. There's no punishment right now. In fact, if you bullshit enough, people just say, "Well, they're a bullshitter." Thinking about some of our political leaders: it doesn't really matter. "Well, it's fine. I know they lie a lot, and that could be a lie this time, but sure, whatever." So, think about what these strategies do, where you give people mutually conflicting information at large volumes and just constantly contradict yourself: people don't know what's true, so they have to retreat into local information, and local information can only give you so much. If I only trust my neighbors, my friends, and my very small digital neighborhood, it's really difficult for me to rely on more global information, things from institutions and leaders and gatekeepers. And really, our system, democratic society, depends on our ability to access this global information as well as local information. Right now, global information is playing a smaller and smaller role; you can't really trust it, so you have to retreat to local information. What the repercussions of that are, at a system level, remains to be seen, I guess.
MICHAEL GARFIELD: Yeah. This is something that I talked about with David Krakauer in our Transmission series, when we were talking about mass extinctions and market collapses and so on. If you look at the Chacoans here in New Mexico, they didn't just disappear; their society broke apart into smaller units that were able to manage the latencies, that had less latency in the communications network. So, I'm thinking about this in terms of everything you've just said. It links to the conversation I had with Geoffrey West in episode 35, talking about cities as social reactors with a ratcheting crisis-innovation loop.
And then in episode seven with Rajiv Sethi, on how we end up falling back on stereotypes and other cognitive crutches. Heuristics, as you were just talking about, Carl, are a way to save time in situations like meeting somebody in a dark alley, when you need to act quickly without the opportunity to really think things through. You end up relying on a stereotype because you don't want to have to check the chair every time you sit down. So, this is the world we have because of social media, and social media, in that sense, is basically a dark alley where none of us has time to sit down, get to know one another, and break bread.
And it's preventing us from properly establishing common priors. So, in this way, it's understandable why a Twitter-for-Republicans or whatever would emerge out of this situation: it's less insane-making in the short term, but it's also more dangerous in the long term. So, there's a balance here. I guess my question is: how do we slow down enough in an accelerating world to manage both adapting to the rapid pace of things at one level and the network latencies that occur when we try to address things at scale?
How do we actually reason about what we're reading when the news cycle seems to want us to respond right away to something that it might take three days to actually sit down and fact check?
CARL BERGSTROM: The first thing to recognize is that the news cycle itself is bullshit. Things are not happening as fast as the news cycle wants to tell you. The 24-hour news cycle, of course, is a creation of a marketing system that's trying to keep you glued to the television so you'll watch ads, which is a familiar problem today; it's just not a television set anymore. I remember seeing tickers running across CNN that say things like "breaking news: jury selection to start in two days." And it's like, okay, first of all, no one gives a s**t; second of all, it's in two days; and third of all, it's been scheduled for three months. Why are you running this as a breaking ticker? It's to give me that little dopamine hit so that I stay sitting on the couch. So, things don't happen that fast, and we don't have to accept that pace. One really important thing is to recognize that we're being sold a pace; it's not that the world is happening at that pace. And we see that so strongly with COVID. This has been a real problem with the pandemic. People are starting to adjust now, but especially in March, COVID triggered our schema for crisis, right?
We've got this crisis schema where things break very, very fast, the world changes radically, and we need to know what's going on up to the minute. We're thinking about things like natural disasters, whether the hurricane comes on land or not; about terrorist attacks, where the whole world changed when the second plane hit the tower; about acts of war, where you don't know what's going to happen in the first Gulf War, and then all of a sudden the bombs are falling over Baghdad and the world changes. COVID triggers our schema like that. And so people become glued to the media, and they try to get fresher and fresher information.
You know, yesterday's newspaper coverage is not sufficient; you need an unsourced rumor that someone named Joe Mega1234567 tweeted 15 minutes ago. And what happens is, as people get pulled toward fresher and fresher information, they end up massively degrading the quality of their information, because it hasn't been adequately vetted. It's subject to disinformation injection and all of these other kinds of problems. With something like COVID, this is particularly problematic, because nothing changes rapidly on that timescale.
Occasionally a government will do something stupid and put a stupid policy into place, but our understanding of the disease is unfolding at a very different pace. The pandemic itself is unfolding at a very different pace. It's all statistical, and good information from yesterday is infinitely more valuable than garbage from right now. So, I think one of the things we have to do is step back and break away from this notion that there is an increasing pace of information we need to keep up with. There may be an increasing volume, and that's a separate issue, something Jevin has thought about a lot: how do we deal with the fact that we've got more people doing science than ever, and channels that allow us to communicate that science at all stages of the process? We don't have to filter everything down and package it as a single final published paper; we have open science. How do communities work within that?
But I think that's part of my answer to your question: simply to recognize, hey, wait, we don't have to accept this premise of velocity, if you will.
JEVIN WEST: And Michael, I hate to sound too simplistic, but I think we need to improve media literacy for a generation, for all these generations that have never really had any real training in this. I'll give you a specific example of what just a little bit of media literacy, or civic online reasoning, could do. I actually like the term "civic online reasoning" better, because it's about civics, it's about reasoning, and it's certainly online. So, this idea of deepfakes, or synthetic media, is really scary, for good reason.
And it scares people when you tell them about it for the first time. But just by telling them, now they're aware, and then we can have a discussion at a collective level to decide what we're going to do about it. It's the stage when they don't know that's dangerous. Carl and I talk about this: there's a transition period where a technology all of a sudden lands in society and most of society doesn't know about it. I think it's that that scares me. Most of society doesn't know about the repercussions of this new technology, and I think we're all included in that. So, we need to spend more time on education. We need to spend more time on research. Policymakers need to come to the table, researchers, podcasters, everyone; we need to talk about it. That's one big thing we can do. And one of the things I've been thinking about recently, and it's nothing new or profound, is that the online world, especially this web 2.0 world we live in, has sort of dissolved geographic boundaries. Before web 2.0 and the internet, you would break into groups, neighborhoods, counties, or countries; you were constrained by those geographic boundaries, and you sort of made teams, or you joined a team.
Maybe you didn't want to be on a team, but you were on their team. And then there were maybe fights, literally fights, between them. But now those boundaries don't really matter. There are these new fiefdoms being formed online, and how they battle in these information wars, and how that can be buffered by the actual physical world, is interesting to me. With new policies coming, mostly from the European Union, around the control of these platforms, the control of people's data, the ways in which we can communicate across these platforms, and what the platforms can do, all those things happening right now will sort of determine how these online boundaries either match up or don't match up at all with our cultural similarities in the geographic, physical world.
CARL BERGSTROM: That's sort of what you're doing full time as the director of the Center for an Informed Public.
JEVIN WEST: Yeah, no pretty much. Yeah.
CARL BERGSTROM: Can you tell us a little bit about that?
JEVIN WEST: Yeah. So, what we have going at the University of Washington, where Carl and I are, is part of something pretty special. We received funding from the Knight Foundation to start a center devoted to this very issue of misinformation and disinformation, approached from multiple pillars: not just research, but education, community engagement, policy, and likely journalism as well. The way I've seen it for a while, and this was inspired by a person named Sam Gill, who's one of the head people at the Knight Foundation, is that a hundred years ago, in the early 1900s, there were no public health departments. A few started popping up, and people thought, oh, that's a good idea, maybe we should have them. Can you imagine universities and medical schools now without a department devoted to public health? So I really think that in the future many university campuses may have centers devoted to this issue of information quality, misinformation, and disinformation, the kinds of things we're talking about right now.
So, when we talk about the health of human populations, we'll also be talking about the health of information in the worlds where we spend most of our time now. We have the center, and Carl and I and several other researchers devoted to this have now developed a team of postdocs and researchers and built collaborations with industry. We're literally a phone call away from many of the big tech companies. So we've been doing some research, for example, looking at the ways in which search engines show different results based on where you are in the country, who you are, your age, and other things.
And then we can almost literally call up Google and say, by the way, there's a bunch of ads on your platform you're not supposed to have, and they make changes. We have this new project called the Election Integrity Partnership, launching on September 7th, working with Stanford University, where we are going to be monitoring misinformation around election integrity. Integrity, meaning not whether Trump said this or Biden said that, but the integrity of the elections themselves. So, it's these kinds of things that I think a center like this can be devoted to: to think deeply about the questions we're talking about right now, but also to be a place for people who have ideas, and they don't have to be academics.
They can be journalists. They can be policymakers. They can be your neighbor who has a great idea about how to deal with this crazy new world. We're trying to make it survive, so we don't go extinct and we can communicate with other planets in the universe.
CARL BERGSTROM: Bad idea but anyway a separate issue…
MICHAEL GARFIELD: Don't go Stephen Hawking on us. So, once again, you've led me into the follow-up questions I had. One is that in the book you discuss how new-school, data-driven bullshit can be particularly effective because many of us don't feel qualified to challenge information that's presented in quantitative form. This is exactly what new-school bullshitters are counting on; to fight back, one must learn when and how to question such statements. And yet, I have heard so many scientists claim that it does not fall upon them to teach laypeople, or even experts in other disciplines, how to understand and check the basis of their claims. The latest time this came up for me, for example, was in a conversation with someone about the machine-generated reconstructions of what news outlets were calling a black hole photograph, and whether this proves anything about prior hypotheses about gravity and relativity.
So whose job is it, anyway, to educate the public about these things so that we can collaborate in the kind of hive minds you're talking about? How do we actually assign that accountability to the people in whose hands it belongs? And then, kind of relatedly, with respect to verifying this stuff: I may or may not have sent you this over email in anticipation of this call, but I just came upon a really interesting project. It's not affiliated with SFI in any way: Public Editor, by Goodly Labs, where they're trying to create an online editing and review system that allows the public to openly assess the credibility of news articles and organizations and share their results with the world.
They're having citizen scientists label points in an article where an author successfully avoided or fell prey to inferential mistakes, psychological biases, or argumentative fallacies, and then Public Editor computes a credibility score for the piece. So they have a compact, data-rich credibility badge alongside a news-article hyperlink that appears in readers' newsfeeds and search results. That sounds like one of the most promising opportunities, but then again, it's a massive project of volunteers who are all arguing, in a kind of Wikipedia way, about what is and is not real.
So, I don't know… that's a big chunk to bite off, but what do you think about all that?
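For readers curious what the aggregation Michael describes might look like, here is a minimal sketch. The label names and penalty weights are hypothetical; nothing here reflects Goodly Labs' actual method:

```python
from collections import Counter

# Hypothetical annotator labels for spans of a single article.
labels = ["sound_inference", "cherry_picking", "sound_inference",
          "ad_hominem", "sound_inference", "causal_overreach"]

# Assumed penalties per flagged problem (illustrative values only).
PENALTY = {"cherry_picking": 0.15, "ad_hominem": 0.10, "causal_overreach": 0.20}

def credibility(labels, base=1.0):
    """Start from full credit and subtract a penalty for each flagged fallacy."""
    score = base
    for label, count in Counter(labels).items():
        score -= PENALTY.get(label, 0.0) * count
    return max(score, 0.0)

print(f"article credibility: {credibility(labels):.2f}")  # 0.55 for these labels
```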
JEVIN WEST: Well, yeah, be careful about the Wikipedia shout-out, Michael. Because, just from my own standpoint, I've been amazed at how well these volunteer efforts can go. I see Wikipedia as one of the few lights of hope on the internet in this world of misinformation; it tends to work amazingly well. If you had asked me 15 years ago, I wouldn't have believed you. But anyway, I just wanted to jump in on that particular point… I think, Carl, you were about to say something.
CARL BERGSTROM: I mean, just: whose job is it? I work at a public university. My job is to serve the people, not only those who are enrolled in my classes, but the people of the State of Washington, and by extension the people of the United States and the people of the world. So, it's absolutely, a hundred percent, my job to do this stuff. It may not be part of my immediate research program on some narrow research question; when Michael Lachmann and I are working on costly signaling and trying to solve some set of equations, that may not involve explaining things to the public.
But if you look at my job from a holistic perspective, it absolutely is my job. And people have to decide for themselves whether that's a job they're taking on when they move into an academic position. One thing we can do, of course… a lot of my research lately has to do with studying what I call the new economics of science. It's basically thinking about the way that the institutions of science, and the norms that we have, create incentives for scientists to behave in certain ways.
And then those incentives influence the things that scientists do, the research questions they choose, what gets published, and ultimately our understanding of the world. Now, one of the issues is that in the past there have not been strong incentives for scientists to engage with the public. This is the Carl Sagan phenomenon, right? It's a cliche, but Carl Sagan was never elected to the National Academy of Sciences, and some people say that's because his colleagues didn't like the fact that he was such a great popularizer; it actually counted against him.
We've got to change that, and it is changing. I think especially the younger generation of scientists understand the power of media and of social media, and they have the desire to do more than advance science in isolation: to actually influence society in various positive ways. And we can change the incentive structure from within, which is a decision we get to make, because we created it. You can't say, oh, scientists are doing these dumb things because they have this incentive structure placed on them from the outside.
We self-govern at all kinds of levels. And so, if we think that's important, we can start rewarding it in a whole bunch of different ways. And maybe it helps to have some people go out and really take their careers in that direction and do this aggressively, which is something I've seen Jevin do, for example.
JEVIN WEST: Yeah, I couldn't agree more. For those scientists who don't think it's their responsibility: they might want to look at some of the public's trust indicators around universities and even scientists. I think it's more important than ever for their jobs and for funding, at least at state institutions. I mean, unless you're Harvard, and you're basically a hedge fund running some classes or whatever, so you don't have to worry about money and support if you don't want to. But for us at a state institution, and for me as a communitarian and as a scientist, someone who loves science, I think it's one of the most important roles I can play.
So, the onus is on us. And I definitely think others should pay attention, because the sentiment right now, and maybe it's transient, I hope it's transient, is that there is a lot of negative assessment of scientists and research, certainly among the political right. It's our job to go out and show how good this work is, how important it is for society, and how people can become participants in it, helpful participants. And about that example you have, you'll have to send it to us, Michael: this badge-marking, this collective marking of when statements are correct or not.
I don't quite fully understand it yet, but it sounds super interesting. So please send it.
CARL BERGSTROM: I mean, SFI has known this for a decade. This has been a core part of SFI's mission for a very, very long time, and it's something that's really admirable. We see the same thing being brought forward with what David is doing with the InterPlanetary Festival and some of the other things he's organizing. I think scientists just have to step up. Yeah.
MICHAEL GARFIELD: Well, once again, you've proven that you're telepathic, because you anticipated my next question, which was to step out a bit and get into this issue of the science of science, the incentives that are shaping the way we actually do research. Sid Redner has that piece on "sleeping beauties": some papers just take decades before they're extensively cited, before they actually land somewhere. Someone has an idea that's ahead of its time. That is a big thing at SFI: how do we create a space, a kind of cradle, a nurturing environment for fundamental research that is not driven by the desire for popularity or by goal-oriented funding? How do we actually create a space for those kinds of questions?
Because ultimately there is the incentive for truth-speaking, which is one thing, and then there's the incentive for truth-seeking, right? The fact that our very understanding of reality is in large part shaped by where everyone else in the herd is directing their gaze, and by where we follow, raises this question. Jevin, I know that part of your work is on developing maps of large citation networks in order to understand the evolution of scholarly ideas. How does that kind of research translate into understanding everything else we've discussed on this call, which is: how do we know what is true, independent of the fact that everyone else is gazing at it? These two issues feel intimately related, and I just want to hear you two take the ball and run with it.
JEVIN WEST: Yeah, they are totally connected, and that's another great question. Carl very much helped create some of the main techniques for doing that mapping of science. And I think the two are completely connected. When you talk about the reward systems and the metrics we use in science, whether it's a citation or a download from SSRN or arXiv or whatever, it's the same thing in broader society, where journalists have to go to their editor and say, I had 4,000 downloads for this article.
I only had 200 for this one, even though I think it was probably more important for the world to hear, but hey, look how many thousand I got for this other one. It's those kinds of metrics that are crowding out the sleeping beauties: the more nuanced discussions, the deeper questions, the more fundamental science. And it's one reason, of many, that I've always been a huge fan of SFI, and I mean that genuinely. The last time I gave a talk at SFI, I kind of embarrassed myself and opened by talking about how, when I was an undergraduate, I drove from Northern Utah down to SFI.
There was a conference on networks there, and I met many people I still talk to to this day. At the conference, I took some time and walked up to the old location of SFI, and I didn't go in, because I was so afraid. It was like looking at a cathedral. It was only a small building at the time...
MICHAEL GARFIELD: The monastery!
JEVIN WEST: Yeah, it was a monastery. It really was, and I still think of it that way. It was because of the way SFI treated big questions, and some of them were crazy questions, really out-of-this-world sorts of questions. It was one of the places that really captured a lot of the science I was interested in, and not even just the questions, but the way it went about them.
And so I think in similar ways we could take that and apply it to some of these issues around social media, reward systems for news, and the way we share information. I think there are clear similarities. But when we talk about the science of science, I will say this, and obviously I'm partial because I work in this area, and so does Carl: I really think we're going through a stage in science that we'll look back on a hundred years from now as one of those big changes in the way we communicate and the way we reward. There are so many examples happening right now, everything from preregistration to open science to the way we share data and improve our replication methods, from step one of the science process.
So many things are changing right now, even the way we fund science; there's a lot of interesting, innovative work going on. So, I think we need to be thinking about the science of science now. And if you're a physicist or a biologist, it is time to contribute to that conversation, because we're setting the stage for science in a digital world in a way that will have long-term effects.
MICHAEL GARFIELD: So, I know that you have to go in a minute, Jevin. If you don't have time to answer this question, maybe we can just throw the ball to Carl. I want to thank you both, in anticipation of a hasty wrap on this call, for taking the time with us and for managing to squeeze two hours of this into one. Everything that you've just said reminds me, and I tend to think of these kinds of things in terms of entailed metaphors and mythology, of a great conversation my friends JF Martel and Phil Ford had on their show, Weird Studies, about the role of the fool in the tarot deck.
You know: who is the fool? Why does this character keep showing up? And you think about being willing to stick your neck out and call bullshit on something that everyone else considers important or considers true. This is like the child calling out the fact that the emperor isn't wearing any clothes. This is why the king had a role for the fool in the court in the first place. And, for what it's worth, I very much feel like the fool at SFI, which is itself sort of a microcosm: SFI is the fool in the larger practice of science. But it's an important role, right?
To be the guy bringing up science fiction stuff in these conversations, and so on. But this really comes down to the question of how we encourage those who are able to approach these questions with a beginner's mind, with fresh eyes. How do we give them a role to speak in this conversation? Part of that, and this is the last question I have for you, is: how early can we realistically start teaching people to call bullshit? Developmentally, young children do need to learn to trust their elders, and many critical thinking faculties don't come online until later anyway.
So what is your most optimistic vision for an effective pedagogy and for skepticism in a world where children are exposed to all kinds of insanity through their family’s devices before they're even in school?
JEVIN WEST: Well, I'll quickly say this, and then I'm going to turn it over to Carl, because I do have to jump off. First of all, this has been so much fun. Michael, anytime you want to talk, on podcasts or off, when we get to the post-COVID world, please come to Seattle and hang out with Carl and me. But quickly, on the pedagogy thing: I think we can go early. I have an experiment going on right now with my eight-year-old, actually, he just turned nine, and my six-year-old.
And they can do this. They really can. It's not pre-registered; it's not approved by an IRB. But there are efforts right now. Just in the last week, I've had three or four conversations with national organizations that are really trying to push this kind of thinking, at least in the digital world, specifically around critical reasoning in kids' digital worlds.
There are slight differences from the physical world, and similarities too. But there are efforts doing work on this right now. Look at the Stanford History Education Group that Sam Wineburg runs; I think there's some really interesting work there. So, I do think we can go earlier than you'd think. And I will say this: you are playing an incredibly important role, because you're opening up the questions, making it fun and making it wonderful, and I mean wonder-full. That's what brings my kids into science and into thinking critically; it's not the rote memorization that we see too much of at universities.
So yes, we can make a difference, but it's because of things like this, Michael. Thanks a lot for having me. Carl, why don't you take us across the finish line? It's been so much fun, Michael. Take care. Bye.
CARL BERGSTROM: Yeah, I agree with everything Jevin was saying. I think we can absolutely start this in middle school as a specific curriculum, a critical thinking curriculum, where you're actually teaching a combination of critical thinking, data reasoning, and media literacy. Those are the pieces I would want to be there.
The same themes can come in at elementary school very, very easily and can be worked through, just to help people learn to think in those ways. Jevin and I have been working with the Pacific Science Center here in Seattle, developing some online exhibits around these kinds of themes. And when I've looked at them with tweens and kids of that age, they love it. It's really, really fun. There are things about misleading data graphics, for example, and 10-year-olds and 11-year-olds can look at these: they know enough about what a bar chart should look like, even if they don't necessarily know what's wrong with a misleading one when they first see it. But then they see one example and, boom, they've got it. Then they see a second example, and the online exhibit asks, what do you think of this? And they say, look, Dad, that's… you know! I think that's a really, really exciting thing for people to be able to do. By the time you hit high school, I think this is just a sure winner. There's nothing more interesting than being able to call bullshit persuasively on adults when you're a later teen, right? Because you know that they're full of s**t.
That's your starting premise, and if you can actually prove it sometimes, that's absolutely delightful. So, we've piloted this in a number of high schools, and it goes amazingly well. I'd love to eventually see a way to expand this and do it on a much broader scale as part of a high school curriculum. It would be super fun.
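The misleading bar charts Carl mentions are easy to demonstrate. This sketch, with made-up values, plots the same two numbers with a truncated y-axis and with a zero-based one:

```python
import matplotlib.pyplot as plt

groups = ["A", "B"]
values = [98, 100]   # nearly identical quantities (invented for illustration)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

ax1.bar(groups, values)
ax1.set_ylim(97, 100.5)        # truncated axis makes B look several times A
ax1.set_title("Misleading: axis starts at 97")

ax2.bar(groups, values)
ax2.set_ylim(0, 110)           # zero-based axis shows the bars as nearly equal
ax2.set_title("Honest: axis starts at 0")

plt.tight_layout()
plt.show()
```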
MICHAEL GARFIELD: Well, I know I have my work cut out for me, because of my 18-month-old daughter… you know, my wife has a stray pack of tarot cards we almost threw out, and she was like, no, just let the kid play with them. And our daughter was walking around with the Fool going "dada, dada." I was like, I'm so screwed by the time she's 13, 14!
Last, and this is a bonus-round question: our communications manager, Jenna Marshall, really wanted to know from you why you think complexity science triggers some people's bullshit detectors.
There are people who have gone kind of out of their way to say "there's no there there," that it's a pseudoscience, and a lot of people seem to have picked up on that. A lot of it seems to be because of the reputation it acquired in the nineties, but what is it? Is it the fact that it's polysemantic, the fact that it's still a sort of open area of research? What do you think is triggering people's skepticism when it comes to complex systems science generally?
CARL BERGSTROM: That's a really good question. I don't have a snap answer. I think we'd have to look and see whether, at any time in our history as a discipline, we have been prone to producing a lot of bullshit; it may be that people are just reasonably good Bayesians. But that said, it does seem to trigger people's bullshit detectors, and it may be one of these things where you're further removed from the immediate ability to test hypotheses because of the inherent complexity of what you're looking at. You're trying to understand emergent phenomena, and it may be more difficult to apply the mental schemas we have for notions of cause and effect. It may be more difficult to generate compelling evidence of the same sort… I'm bull***ting. Dammit!
MICHAEL GARFIELD: Well, I guess we can wrap it there, or "wipe," I guess, or something. Thank you so much for joining us, folks. I can't imagine you not having enjoyed this. Please follow Carl and Jevin on Twitter and let them hold your hand and reveal the insane amount of manure we're all having to wade through on a daily basis. They're both doing great work on this. You can find their Twitter handles in our show notes, as well as links to all of the papers we've discussed, their new book, and their online course, which is freely available. Thank you so much. Anything else you want to add before we call it?
CARL BERGSTROM: No, that was a fantastic conversation. It really could have gone in a million directions at every point. You promised at the start that you were going to do something a bit different from the usual talking-points interview, and my goodness, you certainly delivered. This is why I love SFI: every time I come out, it's conversations like this.
MICHAEL GARFIELD: Thanks, and good luck fighting the good fight.
CARL BERGSTROM: Thanks a lot, Michael.