
Guest Editorial: Breaking the Cycle of Fake News
by Richard A. Lovett

Fake news isn’t new. Silly headlines in supermarket tabloids, for example, have been around for years. My favorite? “Hillary Clinton Adopts Alien Baby. Secret Service Building Special Nursery in the White House!”1

The exclamation point is all most people need to realize this story isn’t to be taken seriously. But sadly, that’s not always the case. Nazi Germany used misinformation to stir up hate against Jewish people. American newspapers used it to stoke fervor for the Spanish-American War. Remember the Maine? Spain had nothing to gain from sneak-attacking an American battleship in Havana Harbor and a lot to lose, as the ensuing war proved, but that wasn’t how the story was told at the time, and the sensationalism that seized on it soon went down in history as “yellow journalism.”2

Why yellow? I’d wondered that for years, but it turns out that one of the top New York papers was running a comic strip featuring a yellow-clad kid spouting melodrama. Not to be outdone, a rival paper created a copycat strip, and the dueling “yellow kids” eventually gave us the term “yellow journalism.” Today we call it clickbait, but it’s the same idea.

The day I started this editorial, I woke to news about how a family of four had been shot up while trying to cross a Russian checkpoint in Ukraine. What caught my attention was what a Russian officer said when he eventually intervened. “I’m sorry,” he told a woman who’d sustained a dozen bullet wounds to her legs, “but you cannot imagine how many cars like this we have full of Nazis who are trying to bomb us.” The Russian soldiers, the wounded woman later said, really were terrified. “[They] truly believe that everyone around is a Nazi,” she told NBC News.

In a police state, protecting yourself from this type of disinformation is difficult. In the West, our problem isn’t so much that as it is the free market, in which anyone with access to the Internet can try to stir the rest of us into endless frenzies of “Remember the Maine” fury.

But if you think it’s only the people you disagree with who are doing this, think again. Victoria Parker, a doctoral researcher at Wilfrid Laurier University, Waterloo, Ontario, has found that most Americans, liberal or conservative, aren’t really all that far apart on many fundamental issues. The problem, she says, is a false polarization in which each side envisions the other in terms of the worst types of hard-liners they’ve heard about in the news. “Our data suggest that many people are walking around with an exaggerated mental representation of what other Americans stand for,” she wrote in a December 27, 2021, article in The Atlantic.3 Furthermore, she said, her data show that such misperceptions appear to come from watching too much partisan news, whichever side of the divide it comes from. The more that people do this, she wrote, the more they believe in a “caricatured version of the other side.”

At one level, this is good news. We aren’t really as different as we think we are. In fact, I’ve seen studies showing that liberals and conservatives actually share many of the same values, but simply rank them in a different order. But that’s a topic for a future editorial.

For the moment, my own view is that the solution starts with as many of us as possible trying to break out of those caricatured versions of people on the opposite side: identifying friends we disagree with and, instead of blocking them out of our social circles, as so many want to do, inviting them in and listening to what they have to say. Listening to them, not to what we think they’re going to say. Listening, not arguing. Because maybe we aren’t as far apart as we think we are. Because nobody, including ourselves, is right all the time.

Recently, a friend (and fellow Analog writer) noted that someone had offered up this challenge on his Facebook page: “Name something you believe that might not be true.” What a great question. It’s much of what science, and science fiction, are about. What if I’m wrong? What if everyone is wrong? Many of the greatest breakthroughs have come from some variant of those questions.

Unfortunately, there are powerful forces keeping us from asking them. Computer algorithms encourage us to click ever more often on links that reinforce the ones we’ve just read. It’s as if we’re not just being exposed to two competing “yellow kids” but to a whole army of them, all marching in the same direction.

Help may be coming, though, from scientists studying the way disinformation and the resulting unjustified polarization spread. One of these is Rebecca Goldberg, a behavioral science researcher at Google.

One of the more important things to realize, she said at the 2022 meeting of the American Association for the Advancement of Science (AAAS), is that basic methods of spreading disinformation haven’t changed, even as the technology has evolved. For example, there are strong parallels between the types of claims made against Covid-19 vaccines and those made 65 years ago against polio vaccines. “They occur over and over again with every new vaccine,” she said.

She and other researchers have concluded that there isn’t much that can be done to suppress conspiracy theories and other types of disinformation at their sources. If you’re one of those people who truly believe that the Apollo landings were faked or that jet contrails are “chemtrails” spewing noxious chemicals overhead, there’s not much anyone can do to dissuade you. A better approach, she said, is to “pre-bunk” disinformation by training people in advance on how to recognize it.

She compares this to an inoculation, like those many in my generation got to ward off polio or smallpox (or are now getting to protect against pneumonia or shingles). “Inoculation takes a medical metaphor and applies it to propaganda,” she said. “It attempts to build psychological resistance to an attack trying to persuade you [of something wrong], just like your body builds physical resistance with a medical inoculation.”

In a test, her team created a set of short videos, each focused on a common disinformation technique. One explained fearmongering. Another discussed ad hominem attacks (personal slurs unrelated to the underlying issue). Others depicted scapegoating, false dichotomies, and other logical fallacies. They then recruited 5,500 volunteers, some of whom were shown one of the “inoculation” videos, some of whom weren’t. All were then shown hypothetical tweets, some of which used disinformation methods addressed in the videos, some of which didn’t. Afterward, they were asked to rate the tweets’ trustworthiness and how likely they’d be to share them with friends.

The results were promising. Those who’d seen the “inoculating” videos were substantially more cautious, regardless of their political orientation. “People don’t like to be fooled,” says Daniel Rogers, executive director of the Global Disinformation Index and an adjunct faculty member at New York University.

Other research has shown that this can literally be a matter of life and death. David Yanagizawa-Drott, an economist at the University of Zurich, Switzerland, found a rare opportunity to test this during the initial stages of the Covid-19 pandemic, in January-March of 2020, when nobody was really sure how dangerous the virus would ultimately prove to be.

Two of the most popular shows on Fox News, he realized, were those hosted by Tucker Carlson and Sean Hannity. Viewers of the two shows were demographically and politically very similar, but Carlson and Hannity saw Covid very differently. Carlson was concerned about it early on, often noting that it was on the rise and potentially a serious problem. Hannity was slower to react, creating a ninety-day window in which viewers of the two shows received significantly different information. “Carlson early on sounded the alarm, whereas Hannity [initially] downplayed the severity of the virus,” Yanagizawa-Drott told the AAAS meeting.

The result was “a neat natural experiment,” because it allowed Yanagizawa-Drott’s team to use Nielsen ratings to compare the progress of the virus in counties where one host or the other was more popular.

In counties where Carlson was more popular, they found that people changed their behavior earlier on, with a resulting reduction in both cases and deaths. In counties where Hannity was more popular, they found sharper rises in both cases and deaths. The bottom line: this stuff isn’t just about politics. It really, truly matters.

Not that the basic finding was all that surprising. Yanagizawa-Drott’s Ph.D. research grew out of studying the Rwandan genocide. Specifically, he looked at the impact of radio broadcasts spewing out hate toward the minority Tutsis, comparing the level of violence in villages where these broadcasts could be heard to that in villages where they were blocked by intervening mountains. Sure enough, Tutsis in villages within range of the disinformation-spewing antennas saw more violence than those in villages unable to hear them.

Unfortunately, combatting online disinformation isn’t as simple as figuring out whose village is within range of a radio transmitter, or who watches which opinion show on TV.

In an attempt to figure out whose computer is within (metaphorical) range of the equivalent of an anti-Tutsi radio station, Neil Johnson of George Washington University, D.C., has constructed an enormous map of 3 billion online social network users, focusing on understanding how they are connected, particularly regarding the hot-button issue of Covid-19 vaccines.

He found that these users fell into three basic groups: the pro-vaccine community, the anti-vaccine community, and a previously overlooked “huge mainstream” of people looking for anything from parenting advice to pet care, organic foods, or dietary supplements.

The first of these, he found, largely talks to itself: a depressing discovery for scientists hoping to spread their findings beyond the scientific community. The dedicated anti-vaccine community gets a lot of attention in the press, but is similar in that it, too, talks mainly to itself.

The ones who matter, Johnson argues, are the third group. They aren’t normally all that interested in vaccines, masks, or other polarizing topics, but his research shows that they have much closer online links to the conspiracy-theory world than to the scientific world. Not because they are anti-science, but because that is how their social media interconnections are currently structuring themselves. “They are discussing pets, or babies, or their kids,” he says, “[but] they get entangled.”

What that means, Johnson says, is that combatting disinformation doesn’t involve either confronting the spreaders of it or calling on their opponents to become ever more vocal, because neither group is talking directly to anyone whose mind isn’t already made up. Rather, the goal should be looking for those incidentally caught up in the fray. “That is where misinformation is leaking into the mainstream,” he says.

Or, in Goldberg’s terms, that’s where it’s most effective to inoculate against it—something her group is working on by testing their detecting-misinformation videos as ads. If that works, maybe you’ll soon be getting online ads offering to show you (for free) how to detect fearmongering, false dichotomies, and other logical fallacies we all should have been taught about in high school.

Will any of this work? Time will tell. But in an increasingly disinformation-filled world, I think we have three main defenses. One is Goldberg’s “detecting misinformation techniques” ads. Another is listening politely to friends with opposing views (especially under a mutually agreed-on time limit). But for readers of this magazine, the biggest answer might be realizing that reading fiction creates empathy, one of the best ways to inoculate yourself against Parker’s false polarization.

If you can empathize with Spock, Worf, or a Kzinti warcat, you are already far ahead of the curve. Not because Mr. Spock’s logical mind is in and of itself going to save you from malicious internet memes, but because the time we spend living in the heads of science-fictional characters much different from ourselves might be another way of inoculating ourselves against the barrage of mis- and disinformation to which we are daily exposed. Not to mention that it’s just plain fun.

 

Endnotes:

   1 From the Weekly World News, sometime in 1993.

   2 The best guess today is that the Maine sank as the result of a coal-bunker fire that spread to a munitions magazine.

   3 Victoria A. Parker, et al., “The Ties that Blind: Misperceptions of the Opponent Fringe and the Miscalibration of Political Contempt,” https://psyarxiv.com/cr23g.

 

Copyright © 2022 Richard A. Lovett
