Fact or Fake? The role of knowledge neglect in misinformation

May 15, 2020, 11:34 AM

By Lisa Fazio

At this point we’ve all heard about the problems posed by misinformation in our society. From accusations that we’re living in a post-truth world, to deep fakes that can make former President Obama say “Killmonger was right,” to bogus coronavirus cures—the past five years have catapulted the problem of misinformation into the public discourse.

But it is easy to feel removed from the problem. We often think, “I’m a smart person. If I see something that’s fake, I’ll realize it and not believe it. Misinformation is something that only affects other people—people who aren’t as smart as I am or who have poor critical thinking skills or who vote for a different political party.”

However, it’s not that simple. Everyone can be fooled by misinformation. Psychological research demonstrates that noticing errors in what we read is often difficult, and that those errors can affect our later beliefs, even when we know they’re wrong.

So, why can’t smart people just realize when something’s false and then not believe it? The first challenge lies in actually noticing the misinformation. As a society we function on the assumption that the vast majority of information we encounter will be true. Thus, when we encounter a new piece of information—a piece of gossip from a neighbor, a cool science fact on Reddit, a post on social media—we often don’t search through our entire knowledge base to ask whether the information fits with what we already know. Instead, we simply accept it as “good enough.”

Let me give a few examples that illustrate this problem. Try to answer the following questions:
• How many animals did Moses take on the ark?
• Which museum houses Michelangelo’s portrait of the enigmatically smiling Mona Lisa?
• What phrase followed “To be or not to be” in Macbeth’s famous soliloquy?

If you’re like many readers, you may have answered the questions without noticing the errors within. It was of course Noah, not Moses, who took the animals on the ark; Leonardo da Vinci painted the Mona Lisa; and that soliloquy is from Hamlet, not Macbeth. Yet in a recent study, Duke undergraduates failed to notice the error in 41 percent of similarly distorted questions.

The students had the correct knowledge in their head—they were only included in the study, for example, if they could later indicate correctly that Noah took the animals on the ark—but they failed to use that knowledge in the moment.

We call these lapses knowledge neglect. We have relevant knowledge stored in our heads, but we fail to use it to spot incorrect information.

Knowledge neglect also affects our beliefs about which information is true. More than 100 research studies during the past 40 years have all found the same thing: Repetition increases belief. And while our prior knowledge helps us decide which statements are true or false, it does not protect us from this effect of repetition.

For example, even people who know that Einstein proposed the theory of relativity are more likely to say that “Newton is the last name of the man who proposed the theory of relativity” is true after reading it twice than when seeing it for the first time. In fact, in a recent study, Vanderbilt undergraduates were more likely to believe statements that contradicted kindergarten-level science facts, such as “The part of the plant that grows underground is the stem,” when the statements were repeated. More troubling, repetition also increases belief in false political headlines and politicians’ falsehoods.

What can we do to avoid knowledge neglect and avoid spreading misinformation? One of the most effective techniques is simply to pause and think about whether a piece of information is true or false. That is, rather than going with a gut feeling, stop and think about your relevant prior knowledge.

Asking people to slow down to deliberate about the accuracy of political news headlines greatly improves their ability to identify false news stories, even false stories about the novel coronavirus. Similarly, in our own work, we’ve found that asking people to explain how they know a headline is true or false reduces people’s intention to share false news stories on social media.

While the problem of misinformation can seem overwhelming, it’s important to remember that all of us as individuals can play a role in improving the quality of information we see. We control what spreads on social media and can be responsible for what we share and publicize. We all have a responsibility to promote accurate information online.

Don’t share stories without reading them first, and double-check information when it feels too good to be true. But mostly, take a breath and pause. Rely on your brain, not your gut.

Lisa Fazio is an assistant professor of psychology and human development at Peabody College. She holds a Ph.D. from Duke, and her research areas include cognition, cognitive neuroscience, and developmental science.

For more information, watch this Ask an Expert video with Lisa Fazio.
