Disinformation is Sabotaging America: Barbara McQuade

August 8th, 2024

"To love America is to love the truth."

Barbara McQuade is a legal analyst for NBC News and MSNBC, co-host of the podcast #SistersinLaw, and a professor at the University of Michigan Law School. Her first book is Attack From Within: How Disinformation is Sabotaging America. We discuss the dangers of disinformation and how we can defeat it.

Democracy depends on truth, and as Americans we should prize truth over tribe. A lot of disinformation hides behind the First Amendment, with lies framed as an exercise of free speech. We are overrun with disinformation. One of the strategies is to exhaust us by constantly pumping false claims into the media ecosphere so that we become cynical or disengage. That is fertile ground for would-be authoritarians. For example, in response to the Big Lie that the 2020 election was stolen, a number of states have passed laws making it more difficult to vote. Defending truth is vitally important when it is under attack.

Follow Barbara on X: 

https://x.com/BarbMcQuade

Follow Mila on X:

https://x.com/milaatmos

Sponsors:

Thanks to Shopify for supporting Future Hindsight! Sign up for a $1/month trial at shopify.com/hopeful.

Follow Future Hindsight on Instagram:

https://www.instagram.com/futurehindsightpod/

Love Future Hindsight? Take our Listener Survey!

http://survey.podtrac.com/start-survey.aspx?pubid=6tI0Zi1e78vq&ver=standard

Take the Democracy Group’s Listener Survey!

https://www.democracygroup.org/survey

Want to support the show and get it early?

https://patreon.com/futurehindsight

Credits:

Host: Mila Atmos 

Guest: Barbara McQuade

Executive Producer: Mila Atmos

Producer: Zack Travis

  Barbara McQuade Transcript

    Mila Atmos: [00:00:04] Welcome to Future Hindsight, a podcast that takes big ideas about civic life and democracy and turns them into action items for you and me. I'm Mila Atmos.

    It's 2024 and the future of America is in your hands. Democracy is not a spectator sport, so we're here to bring you an independent perspective about the election this year and empower you to change the status quo.

    Truth is central to democracy. Yet we are firmly in a post-truth world, well past shouting "fake news" at each other and swimming ever more deeply in muddy information waters. To learn more about the dangers of disinformation and how we can defeat it, we're joined by Barbara McQuade. She's a legal analyst for NBC News and MSNBC, co-host of the podcast #SistersinLaw, and a professor at the University of Michigan Law School. Her first book, Attack from Within: How Disinformation Is Sabotaging America, is out now.

    Welcome, Barbara. Thank you for joining us.
    Barbara McQuade: [00:01:24] Thank you, Mila, glad to be with you.

    Mila Atmos: [00:01:29] So there is a long history of information manipulation. It's perhaps as old as time, but just so that we're all on the same page for this conversation, how do you define misinformation and disinformation and what's the difference between the two?

    Barbara McQuade: [00:01:44] Yeah. You know, I know a lot of people use those two terms interchangeably, but in my book, I really separated the two out because I wanted to discuss them differently, because I think there's different intent behind them. So disinformation, as I define it, is lies and misleading statements designed to evoke an emotional response, to manipulate people and deceive people deliberately. You know, sometimes people ask, isn't that just a fancy word for lies? It is, but it encompasses more than just lies. It is a term that comes from counter-espionage. It is this idea of deliberately trying to manipulate people through that false message that pushes our buttons in some way. So that's disinformation. Misinformation is kind of its unwitting cousin. And that is when people read something they believe to be true and then they repeat it, and in that way become sort of, you know, a useful idiot for the disinformers, by sharing something they believe to be true that is, in fact, false. I'll tell you about a time, Mila, when I myself fell for some disinformation and became a propagator of misinformation by sharing it and passing it on. There was a story I read online about NFL quarterback Patrick Mahomes saying that he had refused to play another down until the Kansas City Chiefs changed their name to something that was not offensive to Native Americans. I thought, wow, that's a big story. You know, one of the highest paid players in the NFL is going to take a stand on this issue. That's amazing. So I retweeted that, and then later that day I was talking about it with my husband and my son, and they said, I haven't heard anything about that. Are you sure that's true? And it caused me to go back and look at the tweet I had retweeted. And I noticed that it had come from what looked like a credible source. It was an ESPN account. Right. That's a credible sports journalism outlet. But when I looked a little closer, I saw that the account was not their flagship SportsCenter; it actually said SportsCenter with "sports" misspelled, s-p-r-o-t-s, and I realized it was a fake. So I took down my post. But it's a great example, I think, of how disinformation works. It looks for issues that are divisive in society. You know, this idea of using team mascots with Native American names elicits either outrage or excitement and encourages people to promote it. You know, maybe you want it to be true or you don't want it to be true. Whatever it is, that's how disinformation works. And shame on me for passing that on without verifying its authenticity.

    Mila Atmos: [00:04:16] Yeah, thank you for sharing that example, because it just shows how easy it is to manipulate information. And, you know, that is a small typo, so to speak. Obviously that's on purpose, but it's hard to see right away because our eyes are trained to see the correct word. Like if you take one of those spelling quizzes that used to exist on Facebook all the time, can you still read this, even if the words are spelled incorrectly? So of course we are squarely in post-truth times. And like you said, it's difficult to ascertain really what's true anymore, even for those who are well informed, like yourself. And Tim Snyder said that post-truth is pre-fascism precisely because it seems that everything is a lie. All politicians are crooks. Nothing matters anymore. And we know that this is part of the authoritarian disinformation playbook. So how does that playbook work for them to gain power?

    Barbara McQuade: [00:05:11] Yeah, absolutely. And it's such a good point, because one of the strategies really is to just exhaust us by constantly pumping false claims into the media ecosphere. We see these things and we don't know what to think, right? The cries of fake news. And this is something that exists in Putin's Russia. This idea of, I'm going to bombard you with so much conflicting information that you're not going to know what's true and what's not. I'm going to tell you that everybody's corrupt, that there's really no such thing as truth, everything's PR, everything's spin. And so if you can't trust anybody in politics, then you might as well vote for the guy that will give you the greatest prosperity or shares your values in terms of the identity of the country that you want to seek. And at some point, people become cynical about politics or numb or disengaged from politics altogether. And that is, you know, the goal of authoritarians: to get the people to care less about what's going on in the world so that they can take the power for themselves. So that's certainly an essential part of this authoritarian playbook.

    Mila Atmos: [00:06:18] Mhm. Well, even if the authoritarian doesn't win power or gain power effectively -- notably Trump lost the 2020 election -- disinformation is still dangerous, though, right? Like one of the things that you say... Actually, first of all, your book is a beautiful book and very, very comprehensive. So anybody who has not read your book should read it. It's definitely a one-stop shop. Everything is in there. You don't have to read another book on this topic. But even if you lose an election and you can't quite get to power, disinformation is dangerous to public safety and national security. It erodes the rule of law and destroys democracy. And since we are a pro-democracy podcast, let's talk about the democracy bit first. How is disinformation dangerous to democracy?

    Barbara McQuade: [00:07:05] Yeah, that's such a huge issue because, as you said in the introduction here, democracy depends on truth. It depends on an informed electorate. Liz Cheney, who lost her House seat because of her role on the January 6th committee investigating the January 6th attack, talked about how we cannot abandon truth and remain a free country. So truth is so critically important, but disinformation damages democracy in a number of ways, most directly in attacking our free elections, whether it's to influence the outcome for a particular candidate or to suppress the right to vote, you know, keeping people away from the polls with false claims, or simply undermining public confidence in fair elections, because that causes people to disengage. If people think that the elections are rigged, then they won't bother to show up at the polls. And that's something that's very damaging. Since World War II, it has been the foreign policy of the United States to promote democracies around the world, to lift up democracies, to engage in democracy building, because it is the belief of the United States government that when we have lots of democracies around the world, that will benefit the national security of the United States, because we will have fewer wars, we will have fewer famines, we will have fewer refugee crises. We will have more and better trade partners. So all of those things inure to the benefit of the United States. But one thing we're seeing now is the backsliding of democracy around the world. And in many ways, it is this attack on the 2020 election that has brought us here. One of the things that happened after the attack on the Capitol, you know, those horrible images of people storming the Capitol, engaging in hand-to-hand combat with Capitol Police officers, storming into the Senate chamber and trashing the place, was the reaction abroad. In Russia, a government official said that American democracy was limping on both feet and the United States was no longer the model for democracy around the world. In China, President Xi said January 6th was an example of how democracy doesn't work, and that we should abandon democracy as a form of government, and in fact, a strong central leader is what's necessary. Because, after all, the goal is not democracy, but prosperity. And when you have a strong leader, you can deliver on things, because people can't disagree on things, of course. Not to mention that in China, people are jailed for being journalists or protesting or being dissidents in other ways. And so it's really damaging to democracy. Robert Mueller's investigation of the 2016 election documented some of the tradecraft that was used by Russia to influence that election, and one of their goals was simply to sow division in society, but another was to influence the outcome of the election, and also to undermine confidence in democracy altogether. Some of the things they did were to create false personas online, looking like American political activists, when in fact it was, you know, some Russian operative in a hoodie in a boiler room somewhere near Moscow. But the names they adopted looked like they were ordinary Americans, with names like Heart of Texas or Tennessee GOP or United Muslims of America or Blacktivist, and they cultivated followers for many, many months leading up to the election. And then, as the election drew near, they started saying things that were intended to be very divisive, in some cases to push Donald Trump and to undermine Hillary Clinton. For example, Blacktivist, who was posing as a Black political activist and had amassed hundreds of thousands of followers, posted a message that said Hillary Clinton has never done anything for our community, suggesting the African American community, so we should send her a message by staying home from the polls on Election Day. Now, we'll never know how many people heeded that call to stay home on Election Day. But if even only a few voters in a swing state that was decided by a narrow margin did so, that could have had a dramatic impact on the outcome of the election. So all of these things, whether it's trying to influence the outcome, or trying to keep people away from the polls, or trying to undermine confidence in democracy altogether, all of these things are damaging democracy. And then one last thing I'll mention, Mila, and that is that in response to false claims that the 2020 election was stolen, we have now seen a number of states pass laws making it more difficult to vote: Georgia, Florida, Texas, Arizona, Wisconsin. All of those states have made it harder to vote, and when asked why they needed to pass this legislation, the answer was to prevent all the voter fraud. And so instead, it's going to have the effect of making it more difficult for people, particularly people of color, people of lower income, students, and likely Democratic Party voters, to get to the polls on Election Day. And, you know, democracy means that we should have one person and one vote. But when there's a thumb on the scale like this, some people aren't going to be able to vote at all.

    Mila Atmos: [00:12:14] Mhm. Yeah. It's an interesting feedback loop, right? Bad-faith actors were able to enact an outcome that then loops back into another action that produces further bad outcomes.

    Barbara McQuade: [00:12:27] Yeah. That feedback loop is something that is really interesting. It's sometimes referred to as "active measures." You know, it's part of the intelligence world tradecraft where I'm going to do something, and then when it gets reported, I'm just going to go back and look to that original thing as confirmation that it happened. There is a great commentator. She is a professor at Yale University and a former FBI counterintelligence officer. And she refers to this concept as "rinse and repeat." You know, the same way the shampoo bottle might say, apply, lather, rinse, and repeat. It's just a constant loop that reaffirms itself. And so this is a lot like the reason that Ted Cruz, the U.S. senator from Texas, gave for deciding to vote against certification of the election on January 6th. He said he was going to refuse to certify because many people believed there was fraud in the election, and so therefore he was going to refuse to certify the election. Then, of course, if the election is not certified, that causes more people to believe that there was fraud in the election. So it is this, you know, tautology that just goes in this constant feedback loop based on no evidence whatsoever.

    Mila Atmos: [00:13:43] Right. Right. It just perpetuates the Big Lie. Speaking of the Big Lie, the people who stormed the Capitol on January 6th, they did that precisely because they believed the Big Lie that the election was rigged and stolen. And in my mind, it feels like the ultimate example of how disinformation is dangerous to public safety and national security, but also the rule of law. It's clear, right, that disinformation isn't an inanimate object that just lies around on social media and has no effect. But actually people take action based on this disinformation. So there's also an element of the Second Amendment and militia movements here, which you referred to in the book and I had never thought about. Tell us more about how the Second Amendment fits into January 6th and the Big Lie.

    Barbara McQuade: [00:14:29] Yes. So, you know, there is a movement afoot among militia members and Second Amendment advocates claiming that the Second Amendment is not there to permit people to possess guns to protect themselves in their homes or for hunting or other sporting purposes, but that it is there so that they can overthrow the United States government. And they talk about, you know, the traditions in the United States of opposing tyranny and other kinds of things. This is a theory that was debunked by Congressman Jamie Raskin. And, you know, one of the things he points out is that this theory is a misunderstanding of two documents: the Declaration of Independence, which was a call for revolution and a separation from Great Britain, as opposed to the US Constitution, which is something completely different, a blueprint for governing. And the Second Amendment is included in the Constitution as this protection of rights. But the idea that the Second Amendment is there to allow citizens to overthrow our government is completely nonsensical and inconsistent with everything in the US Constitution. Of course, the Second Amendment itself talks about the need for a well-regulated militia. The idea of the Second Amendment was that we ought to have people who have access to arms so that if and when we are attacked by foreign adversaries, we have the ability to quickly call people into action to defend the country. That's the idea behind the Second Amendment. And then there are other parts of the Constitution that support that idea. You know, you can't read any one part of the Constitution in isolation. You have to kind of read them all together. And there are other parts of it that make it illegal to have an insurrection against the United States, and that say the president may call on people to rise up in defense of the country. The idea that insurrection is somehow permissible under the Constitution would render the Constitution null and void altogether, because it's there to protect the country, the states, and the people. And so this idea that these militia groups talk about, defending their Second Amendment rights and their ability to overthrow the government, is itself a lie and disinformation. But it's used, I think, by people selectively to get people to rise up, to support their cause, to support gun manufacturers, and to continue creating fear that there are other bad people out there who want to take what's ours, and that we need our guns to protect ourselves from either an overreaching government or from people of color, immigrants, Jewish people, people labeled as other, who want to come and take what rightfully belongs to white people.

    Mila Atmos: [00:17:22] Right, right. Well, the disinformation sphere is so thick. It's really scary for a lot of people, and I totally understand that. And all humans clearly are at risk of believing disinformation because of the tactics used to disseminate it, like you said, to manipulate our emotions. And the American people are particularly vulnerable because of the First Amendment, which protects the freedom of speech. Connect the dots for us here. How does the First Amendment create an environment where disinformation can become both prevalent and toxic?

    Barbara McQuade: [00:17:55] Yeah, this is such an interesting point. You know, the First Amendment is something that is cherished by all Americans, whether you're on the right or the left or anywhere in between. Everybody reveres the First Amendment, because that is what gives us the right to speak out against our government when we see something that we think is wrong or that we disagree with, or to advocate for policy changes that we prefer. So of course it is a given, I think, that the First Amendment is this cherished right. And yet what I see is people who are engaging in disinformation hiding behind the First Amendment in an effort to say, "you can't touch me, I can say anything I want, I can make up all kinds of lies, and I get to do that under the guise of the First Amendment. The First Amendment says Congress shall make no law. So therefore my rights under the First Amendment are absolute." And that's not quite right. Again, as we said, when you interpret the First Amendment, like any other provision, it has to be done against the entire document, which also talks about providing for a common defense and doing other things. One Supreme Court Justice once famously said that the Bill of Rights is not a suicide pact, and so courts have upheld laws that do restrict free speech, such as threats. You know, you can't communicate a threat, you can't commit perjury, you can't go into court and lie. Conspiracy, which is a crime, you
    know, that involves a plan to violate the law that's usually achieved through speaking words. Fraud. I might offer somebody something that's a deal that's too good to be true, because it's a fraudulent claim with an effort to defraud somebody out of money. So there are exceptions to this, but it makes it difficult to regulate things that are said on social media and in the public sphere. It's why there is such tension, I think, on college campuses from time to time because of these ideas of free speech, but also safe spaces. And so it's complicated. But it also, I think, is important to remember that it's not absolute.

    Mila Atmos: [00:19:55] We're taking a short break, and we'll be back with Barbara McQuade in a moment. But first, I want to share about a podcast I know you'll love called What Could Go Right.

    What Could Go Right: [00:20:06] Climate change, global conflicts, and an upcoming election. No wonder so many people feel like we're on the brink of disaster. Enter What Could Go Right. It's hosted by me, Emma Varvaloucas, and Progress Network founder Zachary Karabell. On What Could Go Right, we sit down with expert guests and discuss the world's most pressing issues without resorting to pessimism or despair. Instead, we look back at how far we've come and look forward at what it will take to achieve an even brighter future. Is progress on the way? We might not have all the answers, but on What Could Go Right, we ask the key questions. We hope you'll tune in to hear interviews with upcoming guests like writer Coleman Hughes, CNN host Fareed Zakaria, and economist Alison Schrager. If you're looking for a weekly dose of optimistic ideas from smart people, join us every Wednesday on What Could Go Right. It's available wherever you get your podcasts.

    Mila Atmos: [00:20:59] And now let's return to my conversation with Barbara McQuade.

    You have some really great ideas, and we'll go over those. But before I ask a question about how we could potentially regulate tech companies, I feel like we should try and figure out what everyday people can do in their daily lives. Like you said, there's so much apathy and cynicism. Like basically it's so confusing. People just throw in the towel and they follow fashion or celebrities or food, and they don't really follow news about politics, policy, the things that actually impact their daily lives. So with all that's going on, I think it's really easy to feel powerless and we can easily
    become unwitting dupes in spreading misinformation. So how do we protect ourselves as individuals? How can we discern the truth?

    Barbara McQuade: [00:21:51] Yeah, I'm so glad you asked that, because I think that, you know, the place to begin is that if each one of us individually can learn some skills to detect disinformation, then we can begin to help others. So there are a number of tips. I include them in the book. You know, the first is that when you see something online or read it somewhere, if it is something out of the ordinary, you should look for a second source. Something I failed to do in my Patrick Mahomes example. If I had simply looked to see whether that story was reported anywhere else, I would have realized it was not, and that should be a good clue that maybe this story is not accurate. I think another important thing to do is, when you see a news story, you should actually read the story and not rely solely on the headline. Headlines today are often designed to generate clickbait. If you click on the story, you'll see the ads, and that is half of the battle for the publishers. But sometimes the headline doesn't really resemble the article itself at all. It is designed to be a little more sensational, and you read the article and you say, boy, the article doesn't really even say what the headline said. You know, the headline might say something like "World to End Soon." And then, you know, the article itself says, "No," experts say, so it's just the opposite of what the headline might say. So it's really important to actually read a story and not assess it based on the headline alone. I think we also need to understand the difference between causation and correlation. You know, there are all these kinds of things that say X happened and then Y happened, therefore X caused Y. And that's not necessarily the case. It may be that X and Y are both true, but it's a coincidence. Or there are other factors that are causing that. For example, there's one popular study that says children who eat dinner with their parents are more likely to graduate from high school. And so you might conclude, well, just sitting down and eating dinner with your parents means you will graduate from high school. Well, maybe that helps, but I think it's more correlation than causation. Because if you're eating dinner with your parents every day, it's probably also the case you have a lot of other good things going for you. You have parents. That's a good start. You have a roof over your head, you're food secure. You're having a meal every night. Your parents are participating in your life, and that's probably important, too. So it's not just that one thing, it's all of those factors. So that's the difference between causation and correlation. And then one more I would share, which is the idea of statistical studies. Oftentimes there are studies that get published, and when you read about the sample size, you realize that some are more credible than others. And so you should look and see what was the sample size in this study that said, you know, if you eat food X it will cause this, or vaccines are leading to this result, or whatever it is. What was the sample size? Was it 2 million subjects, or was it two? Because that can make a big difference in the reliability of that study. So a lot of those kinds of techniques are things that we all can use to ensure that what we're reading is credible. I would also say one last thing, which is to use a credible source. So when it comes to voting, which is one of the things we were talking about today, make sure that you're getting your voting information from a source that is credible. There are all kinds of fake news outlets out there, and some organizations out of Russia have been creating newspaper sites that look real, like one called the "Miami Chronicle." It has all kinds of stories. It looks like it's from a real newspaper. And there's no such thing as the "Miami Chronicle." It's just made up, but it looks pretty good. When you're looking for things, go to the authoritative source. So if it's about voting information, your Secretary of State's website will be very good on voting information. Or the League of Women Voters, a nonpartisan organization dedicated to providing accurate information about voting to all citizens. So it's not just about finding a second source; the credibility of the source is also important.

    Mila Atmos: [00:25:41] Mhm. Yeah. This is all really good advice. But I will say I think that for a lot of Americans this is like a burden. This is one step too far. They don't have time or maybe they're not even interested. But I agree with you that we must make truth our national purpose. And to that end, you suggest a few ways to -- actually not a few -- you suggest a lot of ways to reduce disinformation from the demand side, which include conducting public service campaigns, promoting civic engagement, and teaching media literacy, which is essentially what you just talked about. So there are some countries that teach media literacy in schools. Which one of the countries that are doing this do you like best, and you think we should emulate here in the United States? How are they doing it?

    Barbara McQuade: [00:26:27] Yeah. You know, one country that comes to mind is Finland. And Finland has been facing this problem of disinformation for far longer than we have because of their proximity to Russia. And so Russia has been picking on them for a long time. And so one of the things they have done is to implement media literacy in their schools. And it's been very successful. Young people are raised now to read news with a skeptical eye, to understand some of these factors that I just discussed that could be tells that something is a fake. I think that would be a great model to replicate here in the United States by implementing media literacy in our schools. It's really just another form of critical thinking, but I think it's a recognition that there is deliberate disinformation out there online. And rather than throw up our hands and say, who knows what to believe, you know, we should look for the things that are credible and find the things that are not, so that we can discern between the two. But I don't think we should stop with just young people, because that will only solve the problem going forward. We have a lot of people today who are beyond school age who are just as likely to fall prey to disinformation as young people. And I'd like to see us offer media literacy and critical thinking skills for people at all levels. There are a lot of great organizations that put out continuing education. I've learned about all of these great lifelong learning organizations that many retirees participate in. There are civic groups like Rotary Clubs and Kiwanis Clubs and others that could do it. Public libraries are now engaged in a lot of programming for citizens that is free. So I think there are a lot of places where media literacy for adults could be offered that would also be helpful to building resilience against this information warfare.

    Mila Atmos: [00:28:11] Mhm. Well, the other part that you also suggest, of course, is to regulate tech companies or the media sphere in a time where the 24/7 cycle of breaking news inundates us with so, so much information. It's really, I think, imperative at this point to clean up the information ecosystem. And one of the suggestions you have is to amend Section 230. And I'll tell you that I have heard repeatedly that there is limited to no value in amending Section 230, but the suggestions that you have in the book are actually really practical, I think, and doable. Tell us more about the ideas that you share.

    Barbara McQuade: [00:28:52] Yeah. And so importantly, I'm saying, you know, in the words of Bill Clinton, I think he was talking about the welfare system or something, I don't remember: "mend it, don't end it." You know, Section 230 of the Communications Decency Act of 1996 gave immunity from legal liability to all internet service providers. And that includes the social media platforms. And so, you know, in 1996, no one was using social media. It even predated Myspace, which has now come and gone, and some of these other platforms. And the idea behind it was to promote creativity and foster innovation. I mean, if you read the language of that section, it sounds like something out of a children's book. It's like the internet is a wonderful world full of possibilities, and it all sounds great. And in fact, you know, it is in some ways. But I think what the lawmakers failed to foresee at the time was all of the ways that things can go awry. And so, you know, it's a little like raising a baby alligator in your bathtub. It might be really cute and adorable now, but at some point it grows into a man-eating predator. And the rules you use to take care of the baby alligator no longer apply when it's the man-eating predator. And maybe it's time to rethink some of those things. So I wouldn't remove the immunity for all purposes, but I would have a few exceptions, because what the immunity allows is for a social media platform to let anybody come in there and publish what they want to publish, and it protects the social media platforms from legal liability in case you or I put something defamatory on there about somebody. They can't possibly monitor the billions of messages that are posted there every day. It would put them out of business. But I would propose regulating only the things that they themselves are controlling, so not the content posted by others, but things like their algorithms, or the way they scrape our data, or the money they accept for ads, and requiring disclosure of who's paying for them. So with regard to algorithms, for example, we know from the Facebook whistleblower Frances Haugen, who testified before Congress, that the biggest problem on social media isn't the content but the algorithms. And that's what she said. She said that at Facebook, they had deliberately programmed the algorithms. And, you know, an algorithm is just a computer program that tells the platform what to do and how to sort information and other things. She said they deliberately programmed the algorithms to maximize content that would outrage people. They gave you new emojis so that you could rank things if you liked them. You know, you can give it a thumbs up or a smiley face. But they also started asking people to indicate that they didn't like something or that they hated something. And the higher the score of anger or outrage or dislike, the more likely it was to rise to the top of everyone else's feed. Because what they discovered is the more outraged people were, the longer they stayed on the platform, because they had to share it with people. Mila, did you see what this person wrote? It's awful. Or they had to respond to the person. I think I'm going to write a long screed about this, about how terrible it is. And the longer people stay online, the more Facebook is able to charge its advertisers. Outrage equals money for Facebook. And that was the big disclosure from Frances Haugen. And so why can't we regulate the algorithms? This isn't something that other people are posting online that the social media companies can't possibly monitor because of the volume. This is something they're creating for themselves. And so that's something I think could be regulated and overseen and, at the very least, disclosed, so that people know if they're being manipulated. I think the same is true with regard to the way social media companies scrape our private data and sell it to data brokers, who in turn sell it to commercial enterprises and to political consultants. And so, because they have all of this private data about us, they are able to micro-target us with ads that they know will push our buttons. You know, is it any surprise I got the Patrick Mahomes thing? Right? I'm a sports fan. I care deeply about issues of civil rights. So, you know, it's probably not a surprise that, based on my likes and shares and other things, they knew exactly how I would react to that message. And so that is one where I think there's actually some current progress. There are some members of Congress who are concerned about data privacy online and the way our data is collected and used against us. And the third thing I would propose, as I mentioned, Mila, is that when social media platforms accept money in exchange for ads, they ought to disclose who has paid for that ad. When it comes to political ads on television or radio, if there is a paid political ad, there's a requirement that there be a disclosure as to who paid for it. That's why you hear, like, I'm Donald Trump and I approve this ad, or I'm Joe Biden and I approve this ad. There is no similar counterpart online. And so it could be that an ad is paid for by a campaign or by a special interest or by a foreign actor. And I think that if we simply disclose that, that could help people to assess the credibility of the messenger of that message.

    Mila Atmos: [00:34:03] Yeah, for sure. I agree. We need to have transparency. I think it's so difficult to be like, where did this come from? You know, I was at a talk by a CIA analyst, now retired, and she said, if I see an image, the first thing I think is who is serving me this image and why are they serving me this image? Like, I don't really care exactly what it is. But first, what is the purpose of showing me this in this moment, you know, in what way am I being targeted? So of course, I agree with you that we need to build a culture of truth, and you argue towards the end of your book that liars should be denounced as traitors. So now, in the aftermath of January 6th and the persistent belief in the Big Lie, how do you think about the connection between patriotism and truth?

    Barbara McQuade: [00:34:54] Yeah, to me, that's the essence of all of it. Our country was founded on this idea of participatory democracy, that, you know, we all have to have information. James Madison said that democracy depends on an informed electorate. And so if people are out there disinforming us, they're trying to cheat, they're trying to manipulate us. They're trying to deceive us into voting for someone based on false information, or to prevent us from voting altogether, or to undermine public confidence in elections and in democracy as a form of government, and that is disloyal to the United States. I think we ought to really think about it that way. I dedicated my book to the 9,000 United States service members who died on the shores of Normandy, fighting for democracy. You know, many of them were 19 years old, 20 years old, 21 years old. These people died for democracy, fighting against fascism and Nazism. And then today, it seems that there are some people who can't even tell the truth to defend our democracy. There are those who fall for misinformation and disinformation, and there are others who simply go along with the con, and they're doing it because they want to achieve power and steal that power from the people. When Donald Trump refers to the January 6th attackers as hostages, I'm sure he knows better. He knows those people aren't being held as political prisoners. They've been charged with crimes, given due process, convicted and sentenced in court. And yet he calls them hostages. And now we're hearing others echo that phrase. Elise Stefanik, a member of Congress from New York. Marjorie Taylor Greene, a member of Congress from Georgia. They, too, refer to those attackers as hostages. And that is really an effort to just mislead people through language. I think that what they do is nothing short of traitorous to our country. I think to love America is to love the truth. And that, you know, we've always had disagreement, we might have controversy, but we have reached an agreement on how we resolve those disputes. And that is through the court system and at the ballot box. And if you don't like the outcome, you have the ability to change things through lobbying activity or running for office yourself. But what you can't do is use physical brute force to impose your will on the rest of us. That's vigilante violence. That's what happened on January 6th, when people went in and used brute force to try to stop the certification of the election, contrary to the will of the people. And so I think we need to have this national conversation about truth and how important it is in our society, and to label those who refuse to engage in truth as the traitors that they are.

    Mila Atmos: [00:37:34] Mhm. Hear, hear. So this is obviously an election year. And I feel like in this election cycle, pro-democracy politicians and candidates who are pro-truth really should be communicating better. If you were a political advisor, a media advisor, what would your advice be? Because I feel like the people who are pro-truth and pro-democracy are not being heard.

    Barbara McQuade: [00:37:59] Yeah, that's such a good point. You know, there's a phrase, I quote this in my book, and even the source of this phrase is the subject of some debate, and disinformation, as to who said it first. Some say Mark Twain. Some say Jonathan Swift. But it's something along the lines of "lies are halfway around the world while truth is still putting its boots on." People who tell whopper lies are more exciting and scintillating, and so people are more likely to gossip about those things, and they spread. And truth itself might be seen as dull or boring. But I think when truth is under attack, as it is now, defending it becomes of paramount importance. And I think people ought to understand that. I think that sometimes people tune it out. As you said, there are a lot of people who can't be bothered and want to focus more on things like celebrity gossip or other kinds of things, but I hope that my book will help sound the alarm about how important it is that we engage in truth and truth telling. I think if I were advising any particular candidate, I would suggest to them that they speak in much the same way Liz Cheney has spoken. Of course, I guess it didn't do her much good in Wyoming, but you hope that it will prevail in other segments of the country. This idea that our country was founded on the notion of democracy, on the notion that we are all created equal and that we should have an equal voice in our government. But when there are people who use disinformation to deceive us, they betray that notion of America. Even though it may be that the tribe that is espousing disinformation is one that shares your values today, it may not be the case tomorrow. And so rather than put up with disinformation and an attack on the truth, what we really should commit ourselves to is the process of truth telling instead of allegiance to tribe. Because I think that's one of the things that is really causing this moment: people have fallen for this idea that there's a threat to America, that certain groups hate America, that people are being replaced by Black and brown immigrants who are coming across our border, using that fear mongering that if you don't support the status quo, then you won't recognize our country anymore. I think people are falling for that set of lies. You know, it's important to remember that if you allow lies to become part of democracy, then you're going to have to tolerate those lies even when they're harming what you see as your best interest. So I think truth over tribe, I guess, would be the buzz phrase I would suggest to political candidates.

    Mila Atmos: [00:40:36] Well said. So here at Future Hindsight, we're always invested in bolstering our civic action toolkit. As an everyday citizen, what are two things I could be doing to actively defeat disinformation, to basically be in allegiance with truth over tribe?

    Barbara McQuade: [00:40:58] Two things. I think one thing that you can do is to get out of your bubble. You know, so many of us, I think, work from home, we spend time online, and all we ever experience are people who are like-minded, and they reinforce each other. What we need to do is to get outside of our comfort zone and get out in the real world and meet people. There, I think, we would be reminded that we have far more in common than we have differences. When people are in the workplace or places of worship or social activities or civic organizations, you meet people from all walks of life, and it's much harder to demonize people when you actually know them as individuals. And so I think that's a really important thing, just spending more time with people. The lack of that is one of the reasons that we have fallen prey to all of this. I think the other is, as this election year approaches, and we know that disinformation is coming in an effort to upend our democracy and our elections, I think we need to make a plan for getting accurate information for ourselves, for voting, and to be able to share that plan with other people. So whether it is, I'm going to get information from my Secretary of State's website or my county clerk's website, or my city clerk's website, or the League of Women Voters, whatever it is, having that plan so that if you start seeing things that you think sound a little squirrely, you know where to go for accurate information. And I think sharing that with our friends and family is important too.

    Mila Atmos: [00:42:25] Yeah. Good advice. So as we are rounding out our conversation here today, looking into the future, what makes you hopeful?

    Barbara McQuade: [00:42:35] One of the things that makes me hopeful, I think, is the students that I get to work with as a law professor. I spend every day with young people who are bright and curious and do care about truth, and who are willing to call it out when they see people engaging in disinformation and manipulation and other kinds of things. And so, although we're in a very difficult and dark period in American history right now, I am hopeful that it won't last. I don't think we will overcome it just by hoping it goes away. I think it will only go away by action, but I see young people committed to defending truth, to critical thinking, to process over outcome, and that's what gives me hope.

    Mila Atmos: [00:43:17] That is indeed very hopeful. Thank you so much for joining us on Future Hindsight, Barbara. It was really a pleasure to have you on the show.

    Barbara McQuade: [00:43:25] Thank you Mila, it was a great pleasure talking with you as well.

    Mila Atmos: [00:43:28] Barbara McQuade is a legal analyst for NBC News and MSNBC, co-host of the podcast #SistersinLaw, and a professor at the University of Michigan Law School. Her first book, Attack from Within: How Disinformation Is Sabotaging America, is out now.

    Mila Atmos: [00:43:51] Next week on Future Hindsight, we're joined by Daniel Alvarenga. He's a journalist who covers issues pertaining to immigration, racial equity, and Latinx cultures. He's also the host of the podcast Humo: Murder and Silence in El Salvador.

    Daniel Alvarenga: [00:44:08] LA is the home of gangs like MS-13 and Calle 18, 18th Street, and they started with Salvadoran refugee children who came into Los Angeles in the 1980s. And remember, these refugee children experienced war. Some of them might have been child soldiers. And so they entered an environment in Los Angeles that already had gang culture, that already had this antagonism between youth of color and the police, and to defend themselves from everything going on and from the lack of social services, because this is the Reagan era, many formed gangs to defend themselves.

    Mila Atmos: [00:44:43] That's next time on Future Hindsight.

    And before I go, first of all, thanks so much for listening. If you liked this episode, you'll love what we have in store. Be sure to hit that follow button on Apple Podcasts or the subscribe button on your favorite podcast app, so you'll catch all of our upcoming episodes. Thank you! Oh, and please leave us a rating and a review on Apple Podcasts. It seems like a small thing, but it can make a huge difference for an independent show like ours. It's the main way other people can find out about the show. We really appreciate your help. Thank you.

    This episode was produced by Zack Travis and me. Until next time, stay engaged.

    The Democracy Group: [00:45:38] This podcast is part of the Democracy Group.
