In this episode of "Democracy Nerd," host Jefferson Smith is joined by David Gilbert, a senior reporter for WIRED magazine specializing in disinformation and online extremism. As the 2024 presidential election looms, their conversation delves into the profound implications of disinformation and misinformation on democratic processes worldwide.
Gilbert explains the core differences between disinformation (deliberately false information spread to deceive) and misinformation (false information spread without harmful intent), and how both differ from propaganda. The discussion highlights the alarming inaction of social media platforms in addressing disinformation and misinformation campaigns. Despite the evident threat these campaigns pose, platforms are criticized for their lackluster response and insufficient guardrails.
Gilbert contrasts the capabilities of different state actors in executing disinformation campaigns, noting that Russia has been particularly adept compared to others like China. He discusses how these foreign influences further complicate the information landscape.
The conversation underscores the potential consequences of unchecked disinformation and misinformation, including the erosion of public trust in elections. Gilbert and Smith stress that the continuous onslaught of lies could lead to a scenario where the public loses its ability to discern truth from falsehood, threatening the very foundation of democracy, bringing to mind the line from the TV miniseries 'Chernobyl': "What is the cost of lies? It's not that we'll mistake them for the truth. The real danger is that if we hear enough lies, then we no longer recognize the truth at all."
This episode provides a thorough examination of the current state of disinformation and misinformation, particularly in the context of the upcoming 2024 election, serving as a wake-up call about the urgent need for robust measures to safeguard the integrity of democratic processes against the backdrop of rampant online falsehoods.
[00:00:00] All right. So we are here with David Gilbert to discuss the TV show Bodkin and do an in-depth analysis and behind-the-scenes of the new streaming television show in which his wife plays a major role.
[00:00:24] Big role. Yeah. Yeah.
[00:00:27] But actually, David Gilbert is a reporter at Wired who covers disinformation and online extremism and how these trends are impacting people's lives around the world with a special focus on the 2024 US presidential election.
[00:00:39] He worked at Vice News prior to being at Wired and David thank you so much for being with us.
[00:00:46] No problem. It's good to be here.
[00:00:49] The presidential election in the United States is less than a half year away, which means it is disinformation season. I don't know if that has always been the case but in the social media age it is certainly the case and with the advent of new technology, the amount of disinformation is more sophisticated, more widespread and more algorithmically tailored to each viewer and listener.
[00:01:11] From AI phone calls in New Hampshire, where a deepfake Biden told Democrats to save their vote, and onward.
[00:01:18] New technology isn't needed for disinformation campaigns that target the press and social media. Testimony in Trump's hush money case, or election interference case, detailed the catch-and-kill operation that stopped negative press about Donald Trump while, at the same time, the operation made up fictional negative stories about his opponent.
[00:01:37] So what are voters to do? To discuss this and more, we have David Gilbert.
[00:01:42] David, where do you think is the best place to start a discussion about misinformation?
[00:01:48] And in part I ask because the word is no longer new to people, and I do wonder: is anybody bored of it, or are people thirsty for understanding it? What's the area of misinformation that people are missing most?
[00:02:00] That's a really good question, because I was speaking to Renée DiResta recently from the Stanford Internet Observatory, who has a new book coming out called Invisible Rulers, and she discusses how algorithms and things and people you can't see are helping to define what people believe.
[00:02:23] But she said that she has an issue with the terms misinformation and disinformation, that they've kind of lost their meaning to a large extent, because everything now is disinformation or misinformation; you know, whether you're on the right or the left, if it's something you disagree with, it's misinformation or it's disinformation.
[00:02:43] And the terms are used interchangeably in a lot of senses when you know they do have very specific different meanings.
[00:02:51] And I think that's the big issue for me is that we're at a point now six months out from the 2024 election as you say where people just don't know what to believe.
[00:03:08] People are struggling to get timely information especially when it comes to breaking news and they want information fast but they want accurate information.
[00:03:18] That's incredibly hard to get now because of the way social media platforms have gone in terms of prioritizing engagement over everything else.
[00:03:28] So I think that's the big issue heading into the 2024 election is that the vast majority of people are struggling to get accurate balanced information.
[00:03:41] So they've either given up on news altogether which a lot of people have I think or they're just in their own versions of reality where they're either fed Trump's feed from truth social plus Fox News plus the Daily Wire and they've got their worldview from that.
[00:03:59] Or they're on the other side of the aisle where they're just consuming MSNBC and they're reading threads and they've got their worldview from that.
[00:04:09] And they're two very, very different worldviews. So we're going to election where the country is already divided and there's very, very little center ground where people can agree on.
[00:04:21] I think that's that to me is the big, big issue going into 2024.
[00:04:27] I want to get into definition of terms and even want to ask you to sort of call out the most frequent instances when people are misusing the term to just use a paintbrush of lies to describe something with which they disagree.
[00:04:42] I am reminded, as you ponder that for a moment, of what I think is the opening line of Chernobyl, since we started by talking about an arguably prestige TV series.
[00:04:54] I will not do it the justice that Jared Harris did, but the opening line was: what is the cost of lies? It's not that we'll mistake them for the truth.
[00:05:03] The real danger is that if we hear enough lies, then we no longer recognize the truth at all. And what can we do then?
[00:05:11] Definitionally, to start: differentiate misinformation from disinformation, and tell us when people misuse those terms.
[00:05:20] So I suppose misinformation is information that's just factually inaccurate and is being shared, not necessarily for any reason or for any cause, but just because people believe it to be true and they want to be the first to share it or first to post it on X or first on Facebook or Instagram.
[00:05:39] And that's that's a really, really powerful driving force for the spread of misinformation is people's desire to be first and to get the information out there that they believe is true.
[00:05:52] And you know, to get whatever kudos they get from being the first person to do it or being on top of the news. So that's misinformation.
[00:05:59] And, you know, the reasons for why it's misinformation are myriad. But, you know, mostly it's just because people don't take the time to vet sources or check if something is true.
[00:06:11] They'll just go and see something interpret whatever way they want and post it online.
[00:06:17] Disinformation, on the other hand, is information that is inaccurate and factually false, but, you know, probably has a grain of truth in it.
[00:06:27] But it is purposely spread either by an individual or typically in a coordinated campaign, either by a political party or a nation state.
[00:06:38] The most famous example obviously being Russia over the last few years.
[00:06:43] And it is spread with a particular purpose in mind, whether that's to swing an election or, in a lot of cases, simply to cause chaos and cause confusion and disrupt people's belief systems so that they don't know what's true and what's not true.
[00:07:03] And I think there's another one that we need to kind of look at here as well: propaganda. It's kind of an old school term that, you know, has been around for a long time.
[00:07:12] But it kind of falls between the two, where propaganda is something that campaigns, and we've seen this mostly from the Trump campaign, post and spread even though they know it to be false.
[00:07:26] But they're kind of packaging it in such a way that it is part of their policy agenda or something like that.
[00:07:36] So it's an effort by them to appear as if they are, you know, really statesmanlike or whatever in what they're saying.
[00:07:47] But what they're actually pushing is based on something that's not true, is based on disinformation.
[00:07:53] So I think it's important to understand the three of those are separate buckets. There's crossover obviously between them, but I think it's important to kind of make sure when you're saying something is disinformation or misinformation, you're saying it accurately.
[00:08:07] And again, you were saying: who is calling this out? Who are the ones who are painting everything as disinformation?
[00:08:15] Everyone is kind of, you know, their instinctive reaction to news that they don't agree with these days in a lot of cases is just to call it misinformation, call it disinformation.
[00:08:30] We've heard from Trump for years the term fake news that's still being thrown around, you know, the fake news media.
[00:08:39] And that's kind of where it came from is that everything that is published in the mainstream media can be dismissed as fake news because that is kind of coming from the top.
[00:08:51] That's coming from Trump. So, you know, if he's saying it, then it must be true.
[00:08:59] I think about this. I worry about this for myself.
[00:09:02] I worry about my own confirmation bias impacting my guesses about what is real and what is not real, and what sources are real and what sources are not real.
[00:09:14] How are you tracking disinformation when it rises to the level where it's not merely misinformation? One of the things I heard
[00:09:24] where you differentiate is that misinformation is something that's false and might be published because of a fast click or a desire to be the first clicker.
[00:09:31] Whereas disinformation is more of a coordinated campaign.
[00:09:35] And what advantage it has is there are enough grains of truth that people who are truth-conscious will still continue to spread it.
[00:09:43] And I think your focus is disinformation, which overlaps with propaganda.
[00:09:48] But I think your focus is disinformation. When do you decide that you need to start tracking it?
[00:09:53] When does it rise to the level or sort of move over to the lane where you say, yep, this is my jam.
[00:09:59] This is the thing I need to focus on and I need to uncover it.
[00:10:03] It varies, I guess.
[00:10:07] Like, for example, today, the Slovakian prime minister was shot in Europe, and virtually instantly my reaction is OK, let me go and see what the reaction is.
[00:10:22] And instantly, because he is a populist, far right leader in the kind of mold of Viktor Orban from Hungary.
[00:10:29] He is pro Kremlin and anti sending aid to Ukraine.
[00:10:33] Immediately, the reaction in Russian circles that I would be monitoring is that this was Ukraine.
[00:10:39] And you can see it immediately that everyone is singing from the same hymn sheets that Ukraine across the board is being blamed.
[00:10:48] And that's it's quite easy to see that it's quite coordinated.
[00:10:52] You can see that happening at the same time on X on Twitter.
[00:10:57] He was being blamed.
[00:10:59] The WHO, the World Health Organization, was blamed by anti-vax groups
[00:11:05] because he was critical of the WHO's COVID regime.
[00:11:11] So they were blaming the WHO.
[00:11:14] And that, again, was pretty widespread, pretty instant, pretty straight off.
[00:11:19] Then you go into kind of more conspiratorial channels where they're all conspiratorial.
[00:11:23] But you go into the kind of more extreme ones, for example QAnon, which I tracked for quite a long time, and you go to their channels and they're blaming everyone.
[00:11:31] They're blaming the global elite.
[00:11:33] They're blaming everyone from Biden to the liberals, to whoever they wanted.
[00:11:42] They can put their particular bogeyman on this.
[00:11:45] And because Robert Fico, who is the PM of Slovakia, was kind of part of this wave of populist leaders in Europe, then they believe that the liberals are out to get them.
[00:11:59] So there are, I suppose, instances like that, when it's breaking news, where you can just see that someone kind of flicks a switch and it spreads everywhere.
[00:12:13] In a lot of other cases.
[00:12:15] Oh, go ahead.
[00:12:16] In a lot of other cases when it's kind of a more, it's not linked to very specific breaking news, but it's kind of a general murmur effectively.
[00:12:29] You just see references to certain things pop up across the different channels that you're monitoring and you might not see it day after day, but you'll see it one day and the next day.
[00:12:39] And the tempo of it increases and you can kind of, you kind of tune into it and you see it more and more.
[00:12:45] And that's kind of, we'll say with the current conspiracy in the US about migrants, a flood of migrants coming in to vote in the elections for Biden.
[00:12:54] We kind of saw that that's been kind of bubbling away in the background for quite a while.
[00:13:00] And then suddenly it just took over and it was everywhere, especially in right wing media.
[00:13:06] So it's very hard to say kind of how you can spot coordinated disinformation.
[00:13:14] It's just a matter of kind of basically spending a lot of time in these channels every day just to take the temperature of them and just to see what trends are happening.
[00:13:25] I want to get in on that: how it gets tagged, how the tagging can be believable, and how the identification or description of something as disinformation isn't then manipulated and itself called fake news. And, as you say, how can the center hold?
[00:13:40] And I don't mean the political center. I mean the truthful center. I mean the accuracy center.
[00:13:44] I am reminded, as you were talking, of an interview we did a couple years back with, was it Miles Armaly at the University of Mississippi?
[00:13:54] He did a study, Why Me? The Role of Perceived Victimhood, about a growing dynamic in the social media era, not only media but social media. I used to coach basketball. Like, in basketball,
[00:14:08] You draw a foul, right? Somebody touches you a little bit and you make it real clear to the ref that they touched you a little bit. Sometimes you make it look like they touch you a little more.
[00:14:16] Or if you trip and it's your own fault, you know, somebody's not quite as ethical a player or just wants to win and who says anything about ethics in that context?
[00:14:24] They just point at somebody and say, oh, that's the person at fault. That's the reason I tripped.
[00:14:27] Even if you didn't get a little bit fouled, you pretend you got fouled when there is such a thing.
[00:14:32] And that in the modern social media context, drawing a foul saying somebody else did it or when something happens saying, yeah, we got fouled seems to have particular rhetorical advantage because it then adds moral authority and social permission.
[00:14:46] It marshals people to your cause. Oh look, that person got harmed. We're with them.
[00:14:50] And the counterpunch feels often that it does have more moral authority and social permission.
[00:14:57] Russian misinformation, excuse me, Russian disinformation seems to be a particular focus of yours and should be a focus of the world's.
[00:15:05] What's the methodology by which you know what it is?
[00:15:09] And I'll tell you that I have thought about this a bunch, more than is healthy, and not as smartly as you have.
[00:15:14] And what I did was any time I saw somebody who I knew confirmed from news reports was a Russian communications tool.
[00:15:22] And I identified several of them. I put them on a list that I see on what used to be Twitter.
[00:15:28] I use it a little bit less now, and I just have a little list that's called Russian Voices.
[00:15:33] And then when I would see somebody who would retweet a bunch of their stuff, sometimes I would add it to the list.
[00:15:39] When my scientific brain tried to analyze that methodology, and that isn't most of my brain, to be clear, there is a risk of confirmation bias on my part.
[00:15:48] I say, oh, this is something I disagree with.
[00:15:52] Therefore, I want to call it. I want to call it Russian disinformation.
[00:15:55] What's your methodology? So you already said I got to dig into it.
[00:15:59] There's no easy answer. I got to pay a lot of attention to these channels.
[00:16:02] How do you start not only identifying because at some point you start gathering a sense.
[00:16:07] How do you prove that sense? How do you show that a thing really is being spread as disinformation?
[00:16:13] Not only that the thing is not factual, but in fact it is part of a coordinated campaign and the people sharing it are agents of a coordinated campaign.
[00:16:21] Yeah, it can be difficult, and it can be very easy to just go with your gut and say this is coordinated and kind of write something up and just go.
[00:16:32] And I think it's very tempting to do that a lot of the time because it's a lot less work.
[00:16:39] So I suppose, first, to understand what a Russian disinformation campaign is, and to be clear, there are
[00:16:45] disinformation campaigns happening every single day, every week of the year, every month of the year, around the clock.
[00:16:52] It's just happening. They're targeting countries all across the globe, and it's not just the US and it's not just Russia.
[00:17:00] Obviously, pretty much every nation state does it; most are just not as good at it as Russia.
[00:17:07] So a Russian disinformation campaign typically involves three separate aspects.
[00:17:13] One is a kind of official part of it where you will see the story run on Russian state media or it will be tweeted out by officials from the Kremlin.
[00:17:26] The second part is Telegram channels, which are hugely influential in Russia, Russian-language Telegram channels to be specific in a lot of cases.
[00:17:36] And they have millions of subscribers, and they're typically all pro-Kremlin.
[00:17:45] A lot of them would have gained prominence after the Ukraine invasion, because they went and embedded with the Russian military.
[00:17:54] And they kind of became military bloggers and they were on the front lines and they were able to spread Russian propaganda that way to people in Russia.
[00:18:01] So they're another factor. And then the third factor is the coordinated inauthentic campaigns that we see on social media, whether that's Facebook and Twitter, which is kind of where they typically run these campaigns.
[00:18:15] Say the third one again, say the third one again.
[00:18:17] An inauthentic kind of campaign run on social media, typically Facebook and X.
[00:18:24] So these are what we kind of think of as Russian disinformation campaigns where bots on Facebook and on Twitter will just spam out the particular narrative that they're sharing.
[00:18:37] So to take one recent example and look at how I went through it, I don't know if that's helpful, but just to explain how it happened.
[00:18:48] I suppose the first thing that would happen is you get a sense that something is happening because there is a big event.
[00:19:02] So for example, the college campus protests in the US. That is just from experience.
[00:19:10] I know that would be a prime example for Russia to take advantage of because it is an event that is already dividing people in the US.
[00:19:19] So that is perfectly aligned with what Russia wants to do. They just want to divide the country more and more and more.
[00:19:26] So what I did was I looked at the Russian state media and the Twitter and Facebook accounts of the Kremlin officials who post on those.
[00:19:39] And I could see that they were pushing narratives that were kind of both pro-Israel and pro-Palestine at the same time.
[00:19:47] But the main narrative at the background of all of this was that this shows how weak a country the US is because freedom of speech is meant to be one of the cornerstones of the US Constitution.
[00:20:00] But look, they're shutting down all these students. So it was kind of a way of mocking the US and mocking the hegemony of the American government.
[00:20:09] And it's kind of typical Russian playbook stuff. So then what happens is I need to see if the same narrative is being pushed at the same time in Telegram channels.
[00:20:19] Now, I don't speak Russian, but I have sources who do speak Russian and who monitor hundreds of these telegram channels.
[00:20:29] So I went and I spoke to them and they were able to see that this narrative began to be pushed pretty much uniformly at the same time across all these channels.
[00:20:39] And it got huge numbers of interactions. So that would tell the people running the campaign that this is something that is gaining traction.
[00:20:48] So let's roll it out with our social media bots. So then like 24 hours, 48 hours later, we see one of the groups, one of the networks of bots.
[00:21:00] It's known as Doppelganger and has been operating for years.
[00:21:04] They began to promote the idea of the American government being weak and not being able to deal properly with the protests that are happening, you know, on campuses in their own country.
[00:21:21] So that's kind of how it plays out over the space of maybe 48 to 72 hours. They kind of test something out in the state media channels.
[00:21:32] They see if it'll gain traction under telegram channels and then the inauthentic bot accounts on Twitter and on Facebook would push that out to the wider world.
[00:21:41] They'll use English language accounts, they'll target American users and they'll spread that misinformation and disinformation across the globe as quickly as they can.
[00:21:53] Lots of campaigns don't necessarily work that well, but it doesn't really matter because it costs the Kremlin nothing to do this.
[00:22:02] They're so used to doing it that it's almost automated at this point. The platforms can seemingly do nothing to stop them because they've been operating for years, and while they have taken down massive numbers of these accounts, there are still loads of them up there.
[00:22:19] And if one of these campaigns suddenly takes off, then they've won. That's all they need.
[00:22:27] And as for the tempo of these campaigns in the last six months, we've been able to see that campaigns attacking the US have been picking up slowly but surely.
[00:22:38] And now, six months out, this is kind of the start of the season, as you said, for disinformation, where we're going to see, or at least expect to see, a ramp-up of Russia trying to really influence the outcome of the 2024 election.
[00:22:55] Why is Russia better at it? You said nation states do it, but Russia does it particularly well. Why do they do it better? How does that manifest itself? Is it just because they've been doing it longer, or that they put more money into it?
[00:23:04] How or why or in what ways are they better?
[00:23:07] They put a lot of money into it initially.
[00:23:10] And the Kremlin troll factory in St. Petersburg was way ahead of its time in terms of what it was doing, and the platforms just did not have a clue.
[00:23:23] So Russia had two, three years runway before the platforms were aware of what they were doing. So they were able to really figure out what worked on these platforms, what didn't.
[00:23:34] We saw in 2016 they were able to create these accounts that pretended to be, you know, the Tennessee GOP account that was famously quoted multiple times in media articles as kind of a bellwether of GOP sentiment, and it was a Russian bot.
[00:23:49] So they were able to get ahead. They also are really, really good at understanding the context of US politics. They get what makes people angry in the US. They get what topics they should target.
[00:24:05] For example, in the wake of the George Floyd murder, the protests that we saw there, they were able to not only insert themselves into the argument online.
[00:24:18] They were able to help organize or try and organize protests on the ground from both sides. So all they're doing is trying to sow chaos.
[00:24:27] And other countries, for example, China, and I wrote about this recently. China has been running disinformation campaigns for seven years and has had absolutely no success.
[00:24:37] Their campaign is called Spamouflage Dragon. And it's incredible, according to the researchers who follow this; it's huge in scale.
[00:24:48] It's hundreds of thousands, millions of accounts across all platforms, and researchers who are tracking this campaign are so paranoid at this point that they believe there's another campaign happening that they're not seeing,
[00:25:01] and that Spamouflage Dragon is designed just to take up researchers' time, because they're the only ones who are looking at it.
[00:25:08] No real people are looking at these. And the reason is China does not understand the US context, because it is operating in an internet environment that is completely alien to the open internet.
[00:25:21] What works in China in terms of government disinformation campaigns or control, whatever you want to call it, does not work on the open internet.
[00:25:29] Yes, you can flood WeChat with whatever messages you want and everyone will see it because you control that. You don't control Twitter or Facebook.
[00:25:37] And they just don't have that type of context of how it works. The very most recent campaign that they did has shown very small signs that they seem to be changing their tactics and that may work.
[00:25:51] We'll have to wait and see. But Russia is just so far ahead on these things. One researcher said to me the Chinese campaign is like Russia's was 10 years ago, and they've been going for seven years.
[00:26:04] So Russia is just very, very good at this. They understand how to engage people and make people angry. And the internet is just a perfect vehicle for that.
[00:26:16] Any example you can give of the way a Russian trollbot farm would do it? How they do it at the St. Petersburg troll factory? I've seen outside pictures of it. I don't know if anybody's been inside.
[00:26:27] I don't know if a journalist has ever been inside of it.
[00:26:29] Yeah, Adrian Chen was in there. He was the first journalist who broke the story.
[00:26:34] Adrian Chen was an intern of mine.
[00:26:36] Oh really? That was like a long time ago.
[00:26:41] The St. Petersburg troll factory does it better. Can you give an example, either an object example, either a narrative or just a tactical example of the way that they would get it right and smart?
[00:26:55] And by right I mean wrong and inaccurate but effective and persuasive. Whereas the Chinese operation might miss the beat or might not get the right note.
[00:27:08] Yeah, there's two parts to that. The Chinese one is wrong for so many reasons. Their grammar is wrong. They talk about a Republican but they're actually a Democrat candidate. They just get very basic things wrong.
[00:27:22] So that rules them out. What Russia does is, and it's constantly changing. And the most recent example, not the protest but the one before that where they were kind of, remember the crisis on the Texas border earlier?
[00:27:36] This year and there was a convoy going down there and all that stuff. So Russia ran a campaign around that as well. But rather than just, you know, spamming out these messages on these accounts that have, you know, a number of followers and haven't really done anything.
[00:27:53] What they do is they respond to other people's tweets. They get involved in conversations. They look as if they are engaged in real world conversations with real world people about a real issue.
[00:28:09] And it's so much more difficult for Twitter to identify those accounts because they're acting as if they are real people. And that is one tactic. And in a lot of cases, they're not real people. They've just been kind of automated to do this, to reply to people's tweets.
[00:28:31] So let's say someone has tweeted something that seems to be gaining traction. You'll all of a sudden see a Russian bot responding to that. And they are experts in understanding how the Twitter, the X, algorithm works in terms of who it's promoting.
[00:28:51] Like for a while there, if your post got more replies than retweets or shares, it was then promoted higher up. If your post had video, it got promoted higher. So they're aware of all these tactics, and they're able to train their systems to do that, to look and operate under the radar.
[00:29:15] And they don't need huge amounts of engagement. They just need some people to see it. And if one of those tweets gets shared by someone with a massive number of followers, then that's a success.
[00:29:28] That's a success.
[00:29:32] It is fascinating to me. I want to lobby you on something. Maybe something like this already exists, but I'll tell you what I have wished existed. What I wished existed was a top 10 list. It wouldn't have to be top 10; it could be top 3, it could be top 50.
[00:29:43] But a list that identified confirmed disinformation trolls and bots, had a legitimate way of confirming that, and then just listed, every day or every hour or every week, you pick the time frame,
[00:30:01] and maybe this exists, and if it exists, just tell me where to follow it, not the top 10 places, because that might help them, people might go follow them, that's not what we're asking for,
[00:30:12] but just their topics, right? Here are the top 10 things. It reminds me of a sort of internet-age version of the pants-on-fire thing that won the prize.
[00:30:26] Here's, just so you know, the top 10 things, could be the top 5 things, that the confirmed Russian bots, or you pick the nationality, or transnational, are sharing this week. Ranked, right? Ranked with frequency.
[00:30:41] I think it'd get a huge following. I would follow it all the time, and I think you'd then have more folks who would do it. And I talked to a buddy of mine who sold his company to Twitter.
[00:30:49] He said, that's a good idea, but I'm busy. And there was somebody called Bot Sentinel that was purportedly doing something that made me think a little bit of this, but it wasn't doing that. So that's my idea for you. If it exists, I'll follow it. If it doesn't exist, somebody should do it. And maybe you.
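As a rough illustration of the ranked digest Smith is imagining here, the sketch below shows how such a weekly tally could be computed, assuming you already have a vetted list of confirmed troll or bot accounts and a coarse topic label attached to each of their posts. The account names, field names, and labeling step are hypothetical placeholders, not an existing tool or dataset; as Gilbert notes next, the hard part is the vetting and the politics around it, not the counting.

```python
from collections import Counter
from datetime import datetime, timedelta, timezone

# Hypothetical input: posts already collected from a vetted list of confirmed
# troll/bot accounts, each with a coarse topic label. Nothing here calls a real API.
posts = [
    {"account": "bot_001", "topic": "US border crisis", "time": datetime(2024, 5, 14, tzinfo=timezone.utc)},
    {"account": "bot_417", "topic": "US border crisis", "time": datetime(2024, 5, 15, tzinfo=timezone.utc)},
    {"account": "bot_090", "topic": "campus protests", "time": datetime(2024, 5, 15, tzinfo=timezone.utc)},
    # ... thousands more
]

def top_topics(posts, window_days=7, now=None, n=10):
    """Rank the topics that confirmed accounts pushed most inside the time window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=window_days)
    counts = Counter(p["topic"] for p in posts if p["time"] >= cutoff)
    return counts.most_common(n)

# Example: the week ending May 16, 2024.
ranking = top_topics(posts, now=datetime(2024, 5, 16, tzinfo=timezone.utc))
for rank, (topic, freq) in enumerate(ranking, start=1):
    print(f"{rank}. {topic} ({freq} posts this week)")
```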
[00:31:04] I don't think it exists. I can see a huge amount of problems with it. I think it would cause probably more problems than it would solve because it would just automatically be weaponized.
[00:31:22] Yeah, they would just say the things on it are manipulated, and it would be weaponized by the people who have a reason to weaponize every single thing that's on the internet these days anyway.
[00:31:33] And so if it is all, you know, Trump stuff or all Biden stuff, then one side or the other will take that and say, Oh look, they're just trying to pretend that everything about us is fake. You know, I just.
[00:31:49] I will say it's hard to get anything to be believed.
[00:31:54] And I would love to log on every morning and see that someone has done the work for me and I can see what the five most pushed topics are. To be honest, if you just go to the trending tab on X, that's not too far away.
[00:32:10] That's fantastic.
[00:32:15] That's the line of the day.
[00:32:17] That's fantastic.
[00:32:20] When you dig in, how do you know? So when you're looking at one of the sources from China, how do they give themselves away?
[00:32:28] What are the tools you use? Beyond just your sense, can you say more about how you identify which sources are fake versus which sources are real? Again, not just which facts are fake and which facts are real, but which sources.
[00:32:42] You mean the accounts themselves.
[00:32:44] Yeah.
[00:32:45] Well, it's not too difficult, but I suppose, mainly, I will typically fall back on researchers who are deep inside network maps of all these people to confirm anything that I may have.
[00:33:01] So the researchers are the ones who I lean on, at companies across the globe, who are doing this 24/7.
[00:33:09] For the Chinese ones, it's quite easy: they use generative AI profile imagery. So you can still tell if something is gen AI because it just doesn't look right, and that person has six fingers on their hand, so you know maybe it's not real.
[00:33:28] You look at their interactions. Even if an account has been around for a long time and has posted a lot of messages, you can typically see that they get reused. A lot of these bots are not made by the Chinese government; they will be rented
[00:33:45] from bot farms. The bot farm won't sell them, but they will rent them out to whoever is using them on that day. This happens on Facebook as well, massively, with Russian campaigns. So that's how some researchers are able to spot them: again, generative AI imagery, but they
[00:34:04] will spam for like 24 hours about Cuba, and then they'll stop. And then like a month later, for 24 hours, they will spam stuff about Ukraine. And that's a way of knowing that they've just been hired out to push a certain narrative for a specific
[00:34:25] amount of time. On Twitter it's similar: you'll see it in their history. One of the very quick ways that I use to identify campaigns like this, and I uncovered one actually in relation to the Kate Middleton conspiracies that came out, which we
[00:34:47] tracked back to India, which is interesting, is this: there was a line, and it just didn't read naturally. The post seemed a bit stilted, or they were trying to get certain keywords in or something. So I just copied it from the tweet it was on and put it into the Twitter search bar.
[00:35:07] And it was just hundreds and hundreds and hundreds of identical tweets. And that's a very common tactic: instead of writing individual tweets for every single one of the thousands of bots they have, these guys won't do that. They will just use the same tweet over and over again on the same accounts and then spam it out.
[00:35:27] And that's how, at least at a relatively low level, you spot them. I guess the big issue now is we're getting into the era of generative AI. Just to be clear, I don't think we're there yet. I don't think it's this massive threat that everyone seems to think it is.
[00:35:47] But down the line, generative AI, what it's good at is iterating and coming up with unique lines. So you could feed it a line and say come up with 1000 different versions of that. And then you feed that automatically. And that's all automated. That doesn't have to be done manually.
[00:36:03] And then you're in trouble. That's going to make it exponentially harder to track these things. But as I said, there hasn't been evidence of that happening yet. It will happen. It just depends how well it's done, you know, whether those groups will be able to be tracked or not.
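For readers who want a concrete picture of the copy-and-paste spotting trick Gilbert describes above (pasting a stilted line into the search bar and finding hundreds of identical tweets), here is a minimal sketch of the same idea applied to a dump of posts you have already collected. The normalization rules, the threshold, and the load_posts helper are illustrative assumptions, not a production detector or a real platform API.

```python
import re
from collections import defaultdict

def normalize(text: str) -> str:
    """Lowercase, strip URLs and @handles, and collapse whitespace so near-identical copies match."""
    text = re.sub(r"https?://\S+|@\w+", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def find_copy_paste_clusters(posts, min_accounts=25):
    """posts: iterable of (account_id, text) pairs. Returns texts reused across many distinct accounts."""
    clusters = defaultdict(set)
    for account, text in posts:
        clusters[normalize(text)].add(account)
    return {t: accts for t, accts in clusters.items() if len(accts) >= min_accounts}

# Hypothetical usage with an already-collected dump of posts:
# suspicious = find_copy_paste_clusters(load_posts("dump.jsonl"))
# for text, accounts in sorted(suspicious.items(), key=lambda kv: -len(kv[1])):
#     print(len(accounts), "accounts posted:", text[:80])
```

The same grouping idea extends to the bursty rent-a-bot pattern he mentions, where an account spams one topic for 24 hours, goes quiet, and later spams an unrelated one: there you bucket an account's posts by day and topic and look for abrupt switches.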
[00:36:26] You said something about one of the risks being that any effort, you didn't say any effort, but I'm saying any effort, that aims to categorically describe and label and sound the alarm about disinformation itself carries with it the risk of being described as disinformation by the people who are being criticized or by the people who disagree with that message.
[00:36:54] And then you said, well, I'd like to have the list of the top five trending disinformation stories. And you said, well, you basically just look at the trending topics on Twitter, on X.
[00:37:06] You laugh, but that wasn't a joke.
[00:37:08] I am not laughing because of its inaccuracy. I'm laughing instead of crying and shivering in the corner from fear and sadness over the state of democracy.
[00:37:19] My guess is, and then you said if it's more Biden or more Trump, then it'll get criticized by one side or the other. And I want to be careful. So my inclination, but you've looked at this more closely, is that if I were going to look at Russian disinformation, I would guess that it would be more pro-Trump.
[00:37:37] And that it's not half and half, not 60-40, not 70-30. You tell me. First of all, am I wrong in that estimation, from your research? And if you wanted to give it a ratio, 60-40, 90-10, 90-9-1, whatever, what would it be?
[00:37:54] No, I think you're wrong. If I was to describe Russian disinformation right now, it's anti-American.
[00:38:11] It's kind of pointing and laughing at the US and all the problems that the country has in terms of how divided it is, and as you said, the state of democracy and the claims that are being made.
[00:38:26] And that's ultimately a part of how that happened is because of what Russia did around 2016. That was their moment to go in and do things under the radar without being spotted and cause chaos.
[00:38:45] And in 2020, everyone thought that they would be back and that they would be doing the same thing. And they weren't. They didn't come back and kind of boost Trump or attack Biden.
[00:38:55] They just, they didn't just sit back. They were obviously doing things in the background, but it was more to drive a wedge through the division that had already kind of been created.
[00:39:05] And I think in Trump, they saw the thing that could help undermine American democracy, which is their ultimate goal. It's not to get Trump into the White House.
[00:39:16] It's not to get Biden into the White House, obviously, but it's just to create absolute chaos. And they saw Trump as the person who would do that.
[00:39:26] And, you know, boy were they right. And I think right now, well, yes, there is some pro-Trump stuff. There is some vaguely pro-Biden stuff.
[00:39:40] It's mostly, look how disastrous everything is going over in the U.S. So around the campus stuff, there was pro-Israel, there was pro-Palestine stuff, but it was all with an underlying narrative of the U.S. democracy is broken.
[00:39:59] Dig into that a little bit. So Israel-Hamas, again from afar, not looking up close, not spending the time in the channels that you spend. I do now have, well, I've almost become a conspiracist myself.
[00:40:11] If I see a social media cause célèbre, if I see something that's spreading a bunch, I now have a hypothesis, not a conclusion, but a hypothesis that I don't spend the time to rigorously examine.
[00:40:25] I now have a hypothesis that that is something that is being pushed by, not, I didn't say originated by necessarily, but I wouldn't rule it out necessarily.
[00:40:33] I wouldn't say pushed only by, or even primarily by. But if I see something that is becoming a cause célèbre, a negative cause célèbre, a wedge cause célèbre within the United States, then my hunch is, no, not even a hunch.
[00:40:47] Let me just call it hypothesis. My curiosity is, oh, I wonder if or opening hypothesis, I suspect that Russian bots help push that stuff.
[00:40:55] Yeah, that's a good hypothesis.
[00:40:58] All right.
[00:41:00] Every single incident, the Baltimore Bridge collapse, the Texas border crisis, these protests, anything that can be used to highlight deficiencies within the US democratic system is going to be jumped on by bots.
[00:41:22] As I said, they virtually never originate it.
[00:41:32] The issues will not originate with Russia. It's very hard now for Russia to gin up a conspiracy or a narrative that is unique,
[00:41:45] put it out there in the world, and have it take hold in the US.
[00:41:49] A lot of people believe that that's what happens, but that's not the case.
[00:41:54] What they do, and what they're experts at, is, now that the US is on fire in their view, when things explode, they'll jump on it.
[00:42:05] And that's where they are making their gains now. It's not in coming up with things; you know, they didn't come up with the idea that people supporting Israel and people supporting Palestine should face off on campuses across the US.
[00:42:23] That's kind of something that happened in the US. And then they saw it dividing the country. Okay, let's go.
[00:42:31] So what I hear you saying is, quoting Billy Joel, they didn't start the fire. And switching metaphors...
[00:42:40] They did start the fire to an extent in 2016 by helping Trump get there. But this is all the long tail of him coming to office and the destruction that it's caused to US society as a result.
[00:42:52] His divisive rhetoric, his hateful rhetoric in a lot of cases has caused the country to just split apart in a lot of cases.
[00:43:04] And there's very, very little middle ground where families are in a lot of cases no longer talking to each other.
[00:43:10] So I think Russia is overestimated and at the same time underestimated in what it's doing in relation to disinformation.
[00:43:21] It's overestimated because I think it is not the seed of these issues that are plaguing the US right now and underestimated because it's not clear what it's doing, but it is doing something.
[00:43:38] It's very hard to go, look, this is what they did, because they're not doing flashy big things like, say, for example, the Biden robocall that we saw.
[00:43:51] The AI generated robocall. Immediately a lot of people instinctively went, I bet you that was Russia.
[00:43:58] And it turns out it wasn't Russia. But I guarantee you Russian bots were pushing narratives around that saying, look, the US doesn't know what's real and what's fake because that is something that is now happening organically in the US, unfortunately.
[00:44:15] And Russia just is sitting back and kind of going, okay, let's just wait for tomorrow because something else will happen tomorrow and we'll use our bots and we'll use our understanding of social media to just make people angrier.
[00:44:28] Is it an overstatement to say or is it accurate to say that the United States is under information attack every day? Psychological attack, psychological attack, psychological information attack every day.
[00:44:41] Yeah, I think that's a fair thing to say.
[00:44:44] The other metaphor.
[00:44:47] Just the caveat that it's from outside the US but also from inside the US.
[00:44:54] Yeah, but this isn't just Russia doing it. This is whatever way the media and the social media system has developed.
[00:45:05] There's attacks coming from everywhere. You know, the attacks on the LGBTQ community in the US are coming from within not from Russia.
[00:45:14] Sure.
[00:45:15] School boards attacks are coming from within not from Russia. Russia is taking advantage of all these things, but it's not necessarily at the forefront of it.
[00:45:27] Couple analogies come to my mind as you were speaking. One is a guy I went to high school with. His name was Manny Bignago, and we called him the instigator, because if somebody did a small thing, he would say, did you see what he told us about you?
[00:45:39] Did you see what he said? You're not going to take that, are you? And at a Friday or Saturday night party, he would try to see if he could get two people to get into it.
[00:45:47] So they're playing the role of Manny Bignago. Sorry, Manny. The other role it seems like they're playing: I worked as a cattle hand years ago, and most of my summer was miserable, digging post holes to plant fence.
[00:46:03] And to start a hole, you would use a digging bar, right, sometimes called a pry bar, sometimes called a pencil point, and you'd throw it at the ground.
[00:46:12] Sometimes you'd hit it just straight on the ground. But what you're hoping to find is just a little, little bit of a crack, because if you've got a little bit of a crack,
[00:46:19] you could get way deeper, right? You could pry that thing open. You'd get a good start on the hole you had to dig.
[00:46:26] With the post hole digger, it was way better to get a good start on it. So what I hear you saying is they look for those little cracks, sometimes medium-sized cracks, sometimes pretty big cracks. And if it's a little tiny crack, they take out a digging bar, they take out a pencil point.
[00:46:39] And if it's a pretty decent size crack, maybe they drop in some dynamite.
[00:46:44] Yeah, I think there that is a perfect analogy of what's happening.
[00:46:48] All right, that's helpful. The cracks are there.
[00:46:51] They're just taking advantage of that. What are social media companies doing about it now?
[00:46:58] Nothing. Oh, sorry. You want me to expand? Okay. No, no. I appreciated your answer. I appreciated the candor there.
[00:47:12] They're absolutely doing nothing. X is just consistently updating its product every single day to make the situation worse.
[00:47:21] For instance, I think the biggest thing is when Musk took over, whatever it is, like 18 months ago now. The big change is, Twitter was always great for breaking news.
[00:47:42] You'd always go there. The news organizations would be at the top, the most trusted people.
[00:47:49] So what Musk did was he changed the whole way that it worked by getting rid of verified accounts and making people pay for verified accounts.
[00:47:59] But then he's told people, oh, well, you can get your money back if you just get engagement.
[00:48:05] So what that did was that drove people to just post whatever got the most engagement.
[00:48:13] And so what you'll see now when you go on breaking news, like for example, as I mentioned, today the Slovak Prime Minister Robert Fico got shot.
[00:48:21] Go on Twitter because you know what else are you going to do?
[00:48:25] Went on Twitter and like the first 10, 15 are all blue checkmark accounts.
[00:48:32] Lots of them with OSINT in their name, all posting videos because videos are all that gets engagements or gets promoted on Twitter these days.
[00:48:42] And none of them with any information that was accurate or fact checked.
[00:48:48] And they're getting tens, hundreds of thousands of views and engagement.
[00:48:54] And they'll all make money because that's what that's the way Twitter set up.
[00:49:02] So if you're trying to share information on Twitter instead of going, what's the most accurate, most verified information and balanced information I get, that's not what you do.
[00:49:14] What you do is what Trump used to do on Twitter: you post the most outrageous thing. And, you know, I was looking today at a Secretary of State candidate in Missouri.
[00:49:28] Someone Gomez, I can't remember her first name.
[00:49:30] She works in real estate
[00:49:32] and she's running for Secretary of State, and she posted a video of herself in a bulletproof vest running through a predominantly LGBTQ community telling her voters not to be gay and weak and then saying stay f***ing strong at the end.
[00:49:56] So it was like a Temick Denik clip of her holding a massive rifle, obviously.
[00:50:00] So that was her post and then she tagged the two Tate brothers who are wanted for sex trafficking and alleged rape in Romania.
[00:50:10] So she tagged them.
[00:50:12] So that was one tweet and it got 2 million views and she's running to be the top election official in her state in a deeply red state.
[00:50:24] And that's the type of content that you're going to see more and more on Twitter ahead of the election because that's the stuff that gets engagement.
[00:50:34] And there's no trust and safety team there anymore.
[00:50:38] They don't respond to media.
[00:50:40] If you send them a press request, now what you get is you get we're busy try again later.
[00:50:46] And it's just an automated thing that it sends out.
[00:50:49] So that's Twitter.
[00:50:50] Facebook, they just rolled back a lot of their protections for elections.
[00:50:55] They don't really care anymore it seems.
[00:50:57] They're not doing news anymore so they say they're not doing news.
[00:51:00] So like they're kind of checking out.
[00:51:02] They don't really want to get involved.
[00:51:04] You know for years they were doing these war rooms and they were really looking as if they gave crap about what they were doing.
[00:51:12] They didn't really because they were never able to really fix the problems and they realized that now.
[00:51:18] So they've just gone, they've held their hands up and go you know what?
[00:51:21] Just do what you want.
[00:51:23] And it's terrifying because there are massive elections happening in India right now.
[00:51:30] There's European elections happening over here in a month's time.
[00:51:33] There are elections happening across the globe and obviously the US elections in November.
[00:51:37] And the stuff is running unchecked on these accounts, on these platforms and no one seems to care.
[00:51:46] TikTok, they do take action; like, for example, for the Palestine-Israel stuff that was spreading on their platform.
[00:51:58] What they do is instead of filtering out the disinformation or the hate speech, they just block everything.
[00:52:05] Which is another terrible idea because that you know silences voices that should be heard.
[00:52:10] So they're not much better.
[00:52:12] And it's just a free fall from what I can see at the moment, and there doesn't seem to be anything these platforms really want to do about it.
[00:52:21] So who will watch the Watchmen?
[00:52:24] What are we supposed to do?
[00:52:26] I assume that you answered part of my question already, which is why not?
[00:52:31] What has changed?
[00:52:32] Why did they give it a shot in the first place and then stop?
[00:52:35] My mind first went to, well, they were worried they were going to get regulated.
[00:52:38] And then they realized that they had spent enough money on lobbying, and/or allowed other things to distract the American voter.
[00:52:46] Or if they allowed for elections to be won by folks who kind of didn't care about it, then they didn't have to do anything about it.
[00:52:52] What you said was, yeah, another reason or maybe the first reason is because they tried and they couldn't really fix it anyway.
[00:52:58] What should be done?
[00:52:59] What should we do?
[00:53:00] What should you do?
[00:53:01] What should government do, etc.?
[00:53:02] What should be done?
[00:53:03] I don't know.
[00:53:05] I genuinely don't know.
[00:53:07] The thing I'll come back to again and again is that it needs to start with education and it needs to start with kids and kids need to be taught about being digitally literate in terms of looking at content and critically deciding is it trustworthy or is it not trustworthy?
[00:53:29] I think that's the only way that I can see: you give people the power to understand what's happening, rather than just trying to fix it for them.
[00:53:41] We've tried and failed now for eight years, since the 2016 election, to make it better, and it's worse by a significant margin.
[00:53:50] Congress is going to do absolutely nothing because, as you said, Facebook and everyone else is spending so many millions or billions on lobbying efforts that anything that does come out of Congress is going to be so toothless that it just won't make a difference.
[00:54:05] And lawmakers, some of them do seem genuinely that they want to make a difference, but they're stuck in a system that just won't allow it to happen.
[00:54:14] Like we've seen some situations where individual states are trying to impose laws, but typically those laws are more repressive rather than trying to fix any issue with disinformation.
[00:54:25] And as I said, the companies I don't think are really going to do it because I think ultimately even if they may want to make it better, they just can't because they cannot figure this stuff out.
[00:54:38] Facebook has spent money on it. They have lots of money to spend on it, and they just don't because I don't think they can figure out how to make their platform safer but at the same time continue to make tons of money.
[00:54:53] And so I think ultimately it just is down to individuals or schools to provide education for kids on how to navigate the world now because this is the reality, but this is what we're living in.
[00:55:10] It's not going to change overnight at least. So I think by giving people the tools or the understanding of how the internet works, how people are trying to influence them, how people are trying to make them believe certain things and understanding how they can navigate that world by checking sources,
[00:55:30] by verifying things, by fact checking. I think that's the only way but maybe that's just kind of a pipe dream as well because that will require government intervention in the US at least.
[00:55:43] It's happening in some European countries. It's happening here in Ireland to a little extent. It's happening in the US in a patchwork way, but I think it needs to be part of the core curriculum, because it's such a central
[00:56:00] part of kids' lives now. They want to be online from very early ages, and I think if they don't have the tools to understand what's happening and who's talking to them and what they're saying, then it's hugely dangerous.
[00:56:16] The high school I graduated from just banned phones. When school starts this fall, it'll be the first time phones aren't allowed during the day. Say again?
[00:56:26] High school.
[00:56:27] Yeah.
[00:56:29] That's yeah. I think that's probably a good step.
[00:56:35] I don't think it's because of disinformation. I think it's because teachers weren't able to teach a lesson plan without somebody texting their friends.
[00:56:41] Yeah, that's fine but it doesn't solve the problem of once they get out of school they're still no less informed about what they should believe and what they shouldn't believe.
[00:56:51] TikTok. Accurate to say, or in what ways accurate, in what ways unfair to say, that it's a security threat in the United States?
[00:57:04] I don't know. There seems to be a lot of belief among certain people that there is a security threat there, but having spoken recently to a pretty senior DOJ official on background, they were absolutely adamant that giving up people's
[00:57:22] data that is then controlled by the Chinese government is a security risk. And that's it. That's the bottom line.
[00:57:31] People seem to be looking for something else, some other danger that's there, some other threat, some other bug in the code or something, but I don't think that's there.
[00:57:44] I think ultimately the US government will lose their battle with TikTok. I could be wrong, but I think they will.
[00:57:54] And by lose you mean they won't be able to force a sale to...
[00:57:58] Yeah, I don't think they will. A sale may happen anyway, but they're talking about a sale without the algorithm, and that's like selling a car without an engine. It's kind of pointless really.
[00:58:10] But I'm sure someone from Trumpworld will probably buy it for a lot of money because why not?
[00:58:18] But I think, yeah, I'm not that much more concerned about giving my information to TikTok than I am about giving my information to Mark Zuckerberg.
[00:58:32] He isn't exactly someone who has taken my information and protected it and not tried to sell me all manner of things on my Facebook feed or on my Instagram feed.
[00:58:47] Everyone is trying to leverage it for different reasons. Is China trying to create a database of US citizens? I don't know.
[00:58:59] But a lot of lawmakers believe that it seems.
[00:59:02] Are there any social media engines that have been emerging that you find more trustworthy or that give you any help?
[00:59:11] I like Bluesky. I can't seem to stop using Twitter.
[00:59:22] And I just don't have the headspace to use another social media platform regularly but I do like the kind of more relaxed or...
[00:59:35] It's not relaxed either.
[00:59:38] It's just a different vibe over there and it's more interesting. You see things and hear things that you don't see or hear on Twitter.
[00:59:46] And it's nice kind of change of pace. It's not going to ever become mainstream. I don't think.
[00:59:55] How come? Because it arrived later...
[00:59:58] Yeah, I think it will get its own little corner. And to be honest, Twitter is, to an extent, that as well, because it doesn't have a huge user base.
[01:00:07] It's got a couple of hundred million users, and that's falling, and it's like...
[01:00:11] That's relatively niche when you look at TikTok and Facebook, and Threads now as well, and Instagram and WhatsApp, which are all owned by Facebook.
[01:00:21] But yeah, I don't see any platform where I go, yeah, that's a better way of doing things.
[01:00:29] All right. Hypotheses that I have or at least things I root for, maybe both, are growth in pro-democracy media engines, more purpose driven media engines, even other ownership models, even including nonprofit models, cooperative models.
[01:00:47] More communications based on genuinely trusted relationships, with trust based not only on whether you like the content of a given message, but on whether you actually know the human being.
[01:00:57] Those things are opening hypotheses, but I'm mostly beginning with a state of flummox and trying to get to a state of more positive hope or...
[01:01:08] Yeah, I'm definitely more positive or upbeat about some of the independent media stuff that I've seen that's trying to tackle this.
[01:01:23] There's obviously missteps and some of it just won't work. And all the big money seems to be going in the wrong places such as The Messenger.
[01:01:32] And there's lots of places where money could go that would be better spent. I think local media is just such an important thing that doesn't get any money, and it's just been gutted and decimated everywhere, in the US, in the UK, here in Ireland as well.
[01:01:52] And it's such a pity that people just don't value that type of information. But then there's other stuff that isn't as positive, I guess.
[01:02:09] But yeah, it's a very tough environment right now to be upbeat or to be positive about.
[01:02:26] I'll go back and then forward, then we'll wrap. So going back, thinking about you said, well, Russia had a big head start.
[01:02:32] A friend of mine who is a civilian worker in the defense arm of the government says...
[01:02:44] And I don't know exactly what his job is. I know what his business card says, but I don't know exactly what his job is.
[01:02:51] And he says little, as much of his talent is an ability to say very little. So when he says anything, I find it very interesting.
[01:02:57] And he'll say something like, well, that's not that new. He'll say it's not that new. And he doesn't mean that social media isn't new. He doesn't mean that what happened around the 2016 election didn't matter.
[01:03:06] He didn't mean that. He means that Russia working on interfering with American culture and doing political organizing within the United States isn't new.
[01:03:15] Again, I'm putting words into his mouth or I'm inventing things I imagine he might say when he said not everything's new.
[01:03:22] Or if you want a popular manifestation of it, I watched the TV show The Americans. Okay, I really like that.
[01:03:28] And that this idea of finding those little cracks and then putting a pry bar in it, right?
[01:03:33] Or if you find a bigger crack, drop a piece of dynamite in it. That's been... That was happening during Vietnam War protests.
[01:03:40] That was happening during the Civil Rights Movement. And that is certainly happening now. Social media does make it different.
[01:03:46] It does mean that there isn't a gatekeeper. There isn't a filter. There isn't a Walter Cronkite. There's not an editorial board.
[01:03:51] There's not somebody saying, wait, is this true first? Or is it just something that will spread because somebody points and clicks?
[01:03:58] Is there anything else about the dynamic that we haven't covered or anything that we need to?
[01:04:05] Or looking forward, what do you pay attention to? And I don't mean to the bad. What do you read that you believe?
[01:04:11] What do you root for? What do you share? And I don't mean you already answered. Well, I'm not saying what's giving you hope.
[01:04:16] I'm just saying what do you rely upon?
[01:04:21] I rely on... I suppose I rely a lot in this area on researchers, on people who have no skin in the game effectively, who are...
[01:04:34] And especially the ones who don't give you the quote. It's the opposite of what a journalist wants, but it's the ones who will...
[01:04:45] Like, I really appreciate and I really rely on the researchers who, when I message them and say, have you seen this? This looks bad.
[01:04:54] Do you know what's happening? They'll go, I don't know.
[01:05:00] And the ones who will say, I don't know are the ones you need to rely on because they're the ones who are being honest.
[01:05:06] It's the ones who will come back to you with, like, two paragraphs of perfectly quotable lines about whatever you've sent them within a matter of five minutes
[01:05:17] that you have to worry about, because they didn't go and do any real research, or they didn't really look.
[01:05:24] And there are so many unbelievably great researchers out there who are so deep and into these things and so well versed in what's happening and can give you so much context about...
[01:05:38] Like that context you were saying.
[01:05:40] Like, I don't know if you know him, Thomas Rid, who's a professor and who's written a book called Active Measures.
[01:05:46] And it talks about Russian pamphleteering back in the early 20th century.
[01:05:53] And you know, it's over a century old: Russia pamphleteering in New York, putting up pamphlets with anti-Semitic messages a century ago.
[01:06:02] So this stuff has been around for a long time and he knows that context and he will talk about that context and he won't hype what's happening now.
[01:06:10] He just wrote a piece along with a couple of other researchers about how we shouldn't overhype the disinformation threat.
[01:06:17] Yes, we should take it seriously, but we should not be overhyping the threat of disinformation.
[01:06:22] And you can see it a lot from certain researchers and certain organizations who want headlines and who want to be quoted in the New York Times or the Washington Post.
[01:06:36] And they're, you know, happy to give the perfect little soundbite that can make your headline pop or whatever.
[01:06:45] But I am more reliant on the people who will say, I don't know.
[01:06:51] Let me look at this or no, this isn't a story.
[01:06:54] Well, yes, that's important.
[01:06:57] Or I wouldn't suggest it. There was a story recently in relation to Telegram where I asked three researchers who I respect and rely on, and they all came back to me and said,
[01:07:09] No, don't. This isn't the story. Don't do it.
[01:07:12] And you know, that kind of thing is invaluable to me.
[01:07:15] But again, that's me and I built up those relationships. That's not something that's available to the public.
[01:07:20] Unfortunately. It's a discipline, though, and you just said something that's the flip of something you said earlier.
[01:07:27] Before, you said that too much rewards the people who will think and move fast.
[01:07:32] And what you just said is we have to reward and amplify the people who move slow, the people who read the thing, take a breath, look around, check.
[01:07:44] Think. Yeah.
[01:07:46] And then move and say when they really know.
[01:07:49] And the people who move fast, the people who are first, they might get more retweets because they're fast. People rely on being fast because they're Woj at ESPN.com.
[01:07:57] Everybody wants to follow them so they can get the instant news before it's news, on who's getting traded where, who's getting signed where. Everybody wants to be Woj.
[01:08:05] And what you just said is no, no, you should be, well, you should be David Gilbert.
[01:08:11] You should be somebody who takes a breath, as you have in answering my questions: I don't know.
[01:08:16] And then you've gone on to say things which I appreciate, but not rewarding only the person who is certain quickly, but the person who is uncertain quickly and who builds certainty by earning it.
[01:08:26] Yeah, but that's not how the world works.
[01:08:29] Yeah, I hear that. I have no retort. Yeah, I'll use your line. I don't know.
[01:08:36] I don't know. But I do know that I really appreciate the conversation. Anything else? And when you said that, I'd give you more credit than that; I'd give us, I'd give the conversation more credit than that.
[01:08:49] That you said we've got to teach young people who are engaging with this stuff, who are digitally native, who get really good at testing stuff out.
[01:08:58] And so what I think you're saying is for anybody listening and for anybody who has anybody who might listen to them.
[01:09:05] Think like David Gilbert rather than like Woj. Just think slow. Yeah, but just before all the kids pile on me.
[01:09:14] Like, a lot of them are already supremely expert at it, way better than my uncle.
[01:09:21] My uncle used to forward stuff to me all the time. It was just full of crap, and I appreciated it because he would say, Jeff, what do you think about this?
[01:09:28] He wouldn't say, wow, like he wouldn't use it to inflame me.
[01:09:31] We had very different politics, and he would ask me what was going on and give me a chance to research, and I realized what bonkers stuff was being shared on the listservs of which he was a member.
[01:09:40] No, no, no. I trust my nephew's generation much more than my uncle's generation.
[01:09:43] Yeah. But it's just that they all need to have a basic skill set so that they can just be more...
[01:09:53] And it's just about being more open to certain ideas and not just shutting things off.
[01:09:58] And we can all we can all learn from that, I think, because, you know, we all have our biases.
[01:10:04] Anything I screwed up on, anything I should have asked you and didn't, anything you were champing at the bit to talk about that I failed to give you the opportunity to?
[01:10:11] I suppose the thing is, a lot of people are wondering, you know: yeah, there's going to be disinformation.
[01:10:19] Yeah, there's going to be Russian campaigns and all of that.
[01:10:23] So what? You know, this has been around for a long time.
[01:10:27] Is it really going to make any difference?
[01:10:29] And, you know, maybe it won't. But what concerns me the most ahead of the election in the US in particular is that for the last four years there's been a massive cohort of people online who believe with an absolute religious vigor that the election was stolen from Trump.
[01:10:51] And they believe it no matter what. They are led by the likes of Mike Lindell and Mike Flynn and Patrick Byrne.
[01:10:59] And they are kind of marching towards 2024 with the firm belief that that election will be rigged as well.
[01:11:07] And those election denial groups are now commingling with more extreme groups, such as this group called the Constitutional Sheriffs.
[01:11:19] I don't know if you've heard of them, but they're a group of actual sheriffs who believe that they are the ultimate power in the county they're in charge of, with no federal or state laws above them.
[01:11:32] They can decide.
[01:11:34] And a number of them, I was at a conference.
[01:11:37] I studied that line of thinking back in college.
[01:11:40] Yeah, the posse.
[01:11:41] There's been an argument going for a while.
[01:11:42] Yeah, the posse comitatus stuff.
[01:11:44] So they've kind of emerged out of that.
[01:11:47] And I was at a conference in Las Vegas with a number of them recently, and Lindell and Byrne and Michael Flynn were all there as well.
[01:11:56] And so on one hand, they were talking about elections being stolen and what we need to do and watch the polls.
[01:12:02] And you know, you need to be alert.
[01:12:03] You need to do all this stuff.
[01:12:05] And on the other hand, these sheriffs are talking about building posses, actual posses.
[01:12:09] One sheriff there, from a county of 5,000 people,
[01:12:12] already has a posse of 150 people.
[01:12:16] And he was giving out a guidebook for other sheriffs on how to build your posse effectively.
[01:12:22] And I have it.
[01:12:23] It's fascinating.
[01:12:24] And so what they are saying is that on Election Day, you, the sheriff, you need to be on top of these voting machines.
[01:12:35] You need to be watching what's happening.
[01:12:37] There's a sheriff in Michigan called Dar Leaf who, after the last election, tried to seize voting machines in Barry County.
[01:12:45] He's still investigating the 2020 election.
[01:12:48] He told me they're going to be very soon winding up the investigation.
[01:12:54] Those guys, along with the election denial movement, concern me greatly because that's not going to be happening online.
[01:13:04] That's going to be happening at polling places.
[01:13:06] And all it takes is one county in one swing state where a sheriff decides, because he's been told by some electioneer that something nefarious has happened,
[01:13:17] which we all saw around 2020.
[01:13:19] Everyone saw ballots being thrown here or, you know, suitcases of ballots coming out of.
[01:13:24] All that stuff will happen again.
[01:13:26] But this time around the sheriffs in this group are going to be hyper aware of it because they've been fed this disinformation for four years.
[01:13:34] What are they going to do?
[01:13:35] Are they just going to go, let's wait and see?
[01:13:38] I'm going to take a step back.
[01:13:40] I'm going to verify this.
[01:13:41] Or are they going to be the people who are going to go, I need to charge in there now and grab the machines with my guns?
[01:13:46] You know, blazing.
[01:13:47] And in a number of cases, I believe it's going to be the latter, unfortunately.
[01:13:52] But it just means that all the disinformation and the people who are saying, you know, we've seen it all before.
[01:14:01] It's fine.
[01:14:02] All it takes is one person in one county in one swing state to have a massive, massive impact on what happens with this election.
[01:14:13] And once that happens, who knows what dominoes may fall and what might happen after that.
[01:14:19] So I just think people need to be aware that these things are happening in the background.
[01:14:25] They're not front page news, but they are happening.
[01:14:28] And there are lots of militia groups who are aligning with sheriffs across the country.
[01:14:33] And they are also embedded with the election denial telegram channels and Facebook groups that we've seen.
[01:14:38] So it's all kind of co-mingling and it's just getting a little bit worrying to me.
[01:14:44] And I think people just need to be aware that this is happening because I think the more people are aware of what's happening out there and the more informed they are, the better it is for everyone.
[01:14:55] And to be clear, the word new itself is problematic. Just because something has happened on a small scale before...
[01:15:03] Now it's happening at a large scale.
[01:15:05] I don't know whether to call it new or more important, but at some point the difference in degree becomes a difference in kind.
[01:15:12] And it's why I link this stuff to democracy, the truth of democracy: ultimately, if there are growing anti-small-D democratic operations abroad and at home, with information tools and the motive, means, and opportunity to try to undermine American and global democracy, then that's really important.
[01:15:33] And yeah, there have been anti-small-D democratic movements before.
[01:15:39] And in fact, before World War II, democracy was even harder to come by.
[01:15:39] And before the French and American revolutions, it was much more rare than it is now.
[01:15:43] So we can't take it for granted.
[01:15:44] So I take none of this for granted, which is why I'm so grateful for your time.
[01:15:48] Yeah, glad to be here.
[01:15:51] David Gilbert, senior reporter for WIRED magazine.
[01:15:54] I've been a Wired subscriber.
[01:15:57] Wired still prints things, yes?
[01:16:00] Yeah, it does.
[01:16:01] Congratulations.
[01:16:02] There was actually a bit of disinformation going around the office that the print edition was finishing recently, but that was quickly cleared up.
[01:16:11] So no, it's still a print magazine, still coming out.
[01:16:14] Hopefully that was April Fools' Day prank humor.
[01:16:17] David Gilbert, thank you so much and thanks for being a Democracy Nerd.
[01:16:21] No problem. Thanks for having me.
[01:16:22] Cheers.
[01:16:32] This has been a program brought to you by Kat Buckley at kbuckleygraphics.com.
[01:16:35] I am Jefferson Smith.
[01:16:37] Thank you so much for listening.
[01:16:38] You can rate and review, hope you will, and follow Democracy Nerd on Facebook, Twitter, and YouTube.
[01:16:43] Past episodes of the show, Democracy Nerd, can be found online at democracynerd.us.
[01:16:49] Go America.
[01:16:51] Thank you.
[01:16:52] Thank you, Democracy.