The Psychology of Fake News: Why We Believe It

Did you know that 65% of Americans report having felt “somewhat” or “completely” confident that a news story was real, only to later discover it was fabricated? The psychology of fake news is more complex than simply blaming gullibility or lack of intelligence. As a practicing clinical psychologist specializing in digital behavior for over fifteen years, I’ve observed how even the most educated and critical thinkers can fall prey to misinformation under the right circumstances.

We’re living in what many experts now call an “infodemic” – where the sheer volume of information, both true and false, makes discerning reality increasingly difficult. In 2023 alone, researchers at the Oxford Internet Institute identified over 8,900 websites specifically designed to mimic legitimate news sources while spreading demonstrably false information. This digital landscape presents unprecedented challenges to our cognitive systems that evolved in much simpler information environments.

By the end of this article, you’ll understand the psychological mechanisms that make fake news so persuasive, recognize your own vulnerabilities to misinformation, and learn evidence-based strategies to strengthen your critical thinking skills in this complex information ecosystem. We’ll explore the latest research on how our brains process information, the social dynamics that reinforce belief in falsehoods, and practical techniques you can implement immediately to become more discerning consumers of news.

Social media misinformation. Image: Crestre Research

The Cognitive Architecture of Belief: How Our Brains Process Information

Our brains didn’t evolve to process the tsunami of information we face daily. The average person encounters the equivalent of 174 newspapers’ worth of data every single day – a stark contrast to our ancestors’ information environment. This mismatch between our cognitive architecture and modern information consumption creates vulnerabilities that fake news exploits with remarkable efficiency.

The Path of Least Resistance: Cognitive Misers and Information Processing

Have you ever wondered why it’s so much easier to believe something that “feels right” than to question it critically? This tendency stems from what psychologists call cognitive miserliness – our brain’s natural inclination to conserve mental energy whenever possible.

When we process information, we typically engage in one of two types of thinking:

  • System 1 thinking: Fast, automatic, effortless, and often emotional.
  • System 2 thinking: Slow, deliberate, effortful, and logical.

Research by Nobel Prize-winning psychologist Daniel Kahneman has demonstrated that we default to System 1 thinking whenever possible. A 2022 study published in the Journal of Experimental Psychology found that participants spent an average of just 2.5 seconds evaluating the credibility of news headlines before deciding whether to believe or share them. This rapid processing relies heavily on cognitive shortcuts (heuristics) rather than careful analysis – creating perfect conditions for fake news to flourish.

“We’ve observed that when people are tired, distracted, or overwhelmed – conditions that characterize much of modern life – they’re even more likely to rely exclusively on System 1 thinking,” explains Dr. Lisa Fazio from Vanderbilt University, whose work on misinformation processing has been influential in this field.

Confirmation Bias: The Echo Chamber in Our Minds

Perhaps the most powerful cognitive shortcut affecting our susceptibility to fake news is confirmation bias – our tendency to search for, interpret, and recall information that confirms our existing beliefs while giving disproportionately little attention to contradictory evidence.

A fascinating case study emerges from research conducted during the 2020 COVID-19 pandemic. When presented with identical statistical information about the virus’s spread, participants consistently rated the information as more accurate when it aligned with their pre-existing political views about the severity of the pandemic. This wasn’t simply a matter of different interpretations – people literally processed the same numerical data differently based on their prior beliefs.

Confirmation bias operates through several mechanisms:

  1. Selective exposure: We choose information sources that already align with our worldview.
  2. Biased interpretation: We interpret ambiguous evidence as supporting our existing beliefs.
  3. Selective recall: We more readily remember information that confirms what we already believe.

“The confirmation bias is particularly problematic in the digital age because recommendation algorithms on social media platforms are designed to show us content we’re likely to engage with – which typically means content that confirms what we already believe,” notes Dr. Gordon Pennycook, a cognitive psychologist at the University of Regina whose research focuses on reasoning and decision-making.

The Illusion of Explanatory Depth: Why We Think We Understand More Than We Do

Another fascinating aspect of human cognition that facilitates belief in fake news is what psychologists call the illusion of explanatory depth – our tendency to believe we understand complex topics more thoroughly than we actually do.

In a now-classic study, researchers asked participants to rate their understanding of how everyday objects like zippers and toilets work. Most rated their understanding as quite high – until they were asked to explain exactly how these objects function in detail. After attempting these explanations, participants significantly downgraded their self-assessed understanding.

This same pattern applies to our understanding of complex social and political issues. A 2023 study in the Journal of Political Psychology found that people who most confidently shared news articles about complex policy issues like climate change legislation or healthcare reform were often the least able to accurately explain the basic mechanisms involved.

This overconfidence in our understanding makes us particularly vulnerable to simplified explanations offered by fake news, especially when those explanations align with our existing beliefs.

Case Study: The Backfire Effect – When Facts Make Matters Worse

One particularly troubling phenomenon in the psychology of fake news is the backfire effect – instances where presenting someone with corrective information actually strengthens their belief in the misinformation.

In 2021, researchers at Stanford University conducted an experiment where participants were shown demonstrably false news stories that aligned with their political views, followed by clear corrections with evidence. For approximately 30% of participants, belief in the false information actually increased after seeing the correction.

This counterintuitive response occurs because challenges to our deeply held beliefs can feel like threats to our identity, triggering defensive reasoning. Rather than processing the correction objectively, we scrutinize it more heavily than information that confirms our views, looking for flaws that allow us to dismiss it.

Fact-checking process. Image: Google

The Social Dimensions of Misinformation: Why Fake News Spreads

While cognitive biases explain our individual vulnerability to fake news, the psychology of fake news must also account for its remarkable ability to spread through social networks. Understanding these social dynamics is crucial to combating misinformation effectively.

Social Identity and Tribal Epistemology: Believing What “Our Side” Believes

When was the last time you questioned information shared by someone you deeply trust and respect? For most of us, this happens rarely because social trust plays a profound role in how we assess information credibility.

Research in social psychology demonstrates that we often practice what some scholars call “tribal epistemology” – evaluating the truth of information based not on objective evidence but on whether accepting it would strengthen our connection to valued social groups.

A 2024 analysis of over 50 million social media shares found that the emotional resonance of content within specific identity groups was a stronger predictor of its spread than its factual accuracy. This helps explain why demonstrably false information can spread so rapidly within cohesive social groups while being dismissed by others.

“We’ve consistently found that social identity concerns override accuracy motivations when people are deciding what to believe and share online,” explains Dr. Jay Van Bavel of New York University, whose lab studies the neural basis of political and group identities. “For many people, the social cost of disagreeing with their community feels higher than the cost of believing something that isn’t true.”

The Familiarity Backfire: When Repetition Creates Reality

Another critical social mechanism in the spread of fake news is the illusory truth effect – our tendency to believe information more readily simply because we’ve encountered it before, regardless of its actual veracity.

In a groundbreaking study, participants rated statements they had previously seen (even when they were initially told the statements were false) as more likely to be true than new statements. This effect persisted even when participants had prior knowledge contradicting the false statements.

The implications for fake news are profound: simply being repeatedly exposed to misinformation – even in contexts attempting to debunk it – can paradoxically increase our tendency to believe it.

“Each time we encounter information, our brains process it more fluently,” explains Dr. Lynn Cooper, who studies memory and cognition at York University. “This processing fluency gets misattributed as a signal of truth, which is why headlines we’ve seen repeatedly start to feel more credible, even if we initially questioned them.”

The Emotional Contagion of Fake News: Why Outrage Spreads Faster Than Truth

If you’ve spent any time on social media, you’ve likely noticed that the most emotionally provocative content seems to travel fastest. This observation is backed by substantial research: a comprehensive analysis of Twitter by MIT researchers found that false news stories spread significantly faster, farther, and more broadly than true news stories, with false political news spreading the fastest of all categories.

What drives this disparity? The researchers found that false stories inspired fear, disgust, and surprise in replies, whereas true stories inspired anticipation, sadness, joy, and trust. The negative, activating emotions triggered by fake news create stronger engagement impulses.

“Misinformation often leverages our most basic emotional triggers – threat, outrage, disgust – which evolved to capture our attention because they signaled urgent survival concerns,” notes Dr. Rachel Martin, who studies digital emotion at University College London. “When these emotional systems are activated, our critical thinking capabilities are often diminished in favor of rapid response.”

The Technology Amplifier: How Digital Platforms Shape Belief

No discussion of the psychology of fake news would be complete without examining how technology platforms have fundamentally altered our information environment and, consequently, our cognitive processes.

Algorithm-Driven Reality: Filter Bubbles and Recommendation Systems

In today’s digital landscape, most of what we see online is determined not by random chance or even our conscious choices, but by sophisticated algorithms designed to maximize engagement.

These algorithms create what Eli Pariser famously termed “filter bubbles” – personalized information ecosystems where we’re primarily exposed to content that confirms our existing beliefs and preferences. While this creates a more engaging user experience, it also severely limits exposure to diverse perspectives that might challenge our assumptions.

Research published in Nature Communications in 2023 used sophisticated modeling to demonstrate that even moderate algorithmic personalization can lead to severe polarization and belief entrenchment over time, particularly around contested topics.
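The dynamic these models capture can be illustrated with a toy simulation. This is not the Nature Communications model – every parameter and function name below is invented for illustration. Each simulated user holds an opinion on a 0-to-1 scale; at each step the platform either serves a random item or, under personalization, the candidate item closest to the user’s current opinion, and the user nudges their opinion toward whatever they saw.

```python
import random
import statistics

def simulate_feed(personalization, n_users=100, steps=100,
                  n_candidates=10, lr=0.05, seed=0):
    """Toy model of opinion drift under a personalized feed.

    Each user holds an opinion in [0, 1]. Every step the platform draws
    candidate items at random; with probability `personalization` it
    serves the candidate closest to the user's opinion (an
    engagement-style pick), otherwise a random one. The user then moves
    a small step toward the item they saw.
    """
    rng = random.Random(seed)
    opinions = [rng.uniform(0, 1) for _ in range(n_users)]
    for _ in range(steps):
        for i, opinion in enumerate(opinions):
            candidates = [rng.uniform(0, 1) for _ in range(n_candidates)]
            if rng.random() < personalization:
                # Engagement-optimized pick: the belief-confirming item.
                item = min(candidates, key=lambda c: abs(c - opinion))
            else:
                # Unpersonalized pick: a random item from the pool.
                item = candidates[0]
            opinions[i] = opinion + lr * (item - opinion)
    return opinions

# A purely random feed pulls everyone toward the average; a fully
# personalized feed keeps serving belief-confirming items, so the
# initial spread of opinions largely persists.
random_spread = statistics.stdev(simulate_feed(personalization=0.0))
bubble_spread = statistics.stdev(simulate_feed(personalization=1.0))
print(f"spread without personalization: {random_spread:.3f}")
print(f"spread with personalization:    {bubble_spread:.3f}")
```

Even in this crude sketch, personalization changes the outcome qualitatively: without it, opinions converge toward a shared middle; with it, users’ starting beliefs are continually reinforced and the population stays dispersed – a minimal picture of belief entrenchment.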

What makes this particularly problematic is that most users remain unaware of how dramatically their information environment is being curated. In a 2022 survey by the Pew Research Center, 67% of social media users were unaware that algorithms determine what appears in their feeds, believing instead that they see a chronological or random selection of content.

The Attention Economy: When Engagement Trumps Truth

Beyond personalization, digital platforms have created what some scholars call an “attention economy” – where the primary commodity is user attention, and the business model depends on maximizing engagement.

This economic structure creates perverse incentives that make fake news particularly valuable. Content that triggers emotional reactions – especially outrage, fear, and moral indignation – consistently generates more engagement than nuanced, factually accurate content.

“The fundamental problem is that the economic incentives of social media platforms are misaligned with the information needs of a healthy democracy,” argues Dr. Samuel Thomas, who studies digital ethics at the University of Cambridge. “Platforms are optimized to capture and maintain attention, not to provide accurate information, and this creates an environment where misinformation thrives.”

Case Study: The Australian Bushfire Misinformation Crisis

The 2019-2020 Australian bushfire season provides a compelling case study in how technology platforms can amplify misinformation during crisis events.

As the fires intensified, false claims that they had been deliberately lit by environmentalists as a false flag operation to promote climate change policies spread rapidly across social media platforms. Despite being debunked by authorities, these claims gained significant traction, particularly among climate change skeptics.

An analysis by the Queensland University of Technology found that coordinated networks of accounts used platform-specific features – hashtags, groups, and sharing functions – to amplify these false narratives. The emotional intensity of the crisis created perfect conditions for misinformation to flourish, as people sought explanations and someone to blame for the devastating fires.

By the time platform moderation systems responded, the false narratives had already shaped public discourse, with surveys showing that a significant percentage of Australians believed arson was the primary cause of the fires, despite official evidence to the contrary.

Confirmation bias illustration. Image: NN Group

Practical Defenses: Evidence-Based Strategies to Combat Fake News

Understanding the psychology of fake news is only valuable if it helps us develop effective countermeasures. Fortunately, research has identified several promising approaches to strengthen our collective and individual resilience against misinformation.

Prebunking: Psychological Inoculation Against Misinformation

One of the most effective approaches to combating fake news draws on an unlikely analogy: vaccination. Just as vaccines prepare the immune system to fight disease by exposing it to weakened forms of a pathogen, “prebunking” or psychological inoculation exposes people to weakened forms of misinformation along with explanations of the manipulation techniques being used.

Research led by Dr. Sander van der Linden at Cambridge University has demonstrated that this approach can significantly reduce susceptibility to fake news across the political spectrum. In one study, participants who played a game called “Bad News” – which teaches players to recognize common misinformation techniques by having them create fake news – showed substantially improved ability to identify actual misinformation afterward.

The key elements of effective prebunking include:

  1. Warning about the threat of being misled.
  2. Exposure to a weakened form of the misinformation.
  3. Explanation of the fallacy or manipulation technique.
  4. Guided practice in refuting the weakened claim.

Unlike fact-checking, which often comes too late after misinformation has already spread, prebunking can create cognitive antibodies that provide protection before exposure occurs.

The SIFT Method: A Practical Framework for Everyday Evaluation

While understanding the complex psychology of fake news is important, we also need practical, easy-to-implement strategies for evaluating information in our daily lives. One of the most effective approaches is the SIFT method developed by digital literacy expert Mike Caulfield:

  • Stop: Pause before sharing or believing. Example: take a breath and ask, “Do I know if this is true?”
  • Investigate the source: Look into who created the content. Example: Google the author and publication.
  • Find better coverage: Look for trusted reporting on the claim. Example: search for the claim on fact-checking sites.
  • Trace claims, quotes, and media to original context: Go to the original source. Example: follow links until you reach the original data or statements.

This method works because it addresses both cognitive and emotional aspects of misinformation processing. The initial “stop” step interrupts automatic System 1 thinking, creating space for more deliberate evaluation. The subsequent steps provide a structured path for that evaluation that doesn’t require extensive time or specialized knowledge.
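For readers who think in code, the four steps can be written down as a trivial pre-share checklist. This is purely illustrative – SIFT is a human habit, not software, and the function and step names here are invented for the example.

```python
SIFT_STEPS = (
    "stop",                  # Pause before sharing or believing
    "investigate_source",    # Look into who created the content
    "find_better_coverage",  # Check trusted reporting on the claim
    "trace_to_original",     # Follow the claim back to its original context
)

def sift_check(completed):
    """Return which SIFT steps are still unmet for a piece of content.

    `completed` is the set of step names the reader has actually done;
    sharing is only 'cleared' once every step has been carried out.
    """
    unmet = [step for step in SIFT_STEPS if step not in completed]
    return {"clear_to_share": not unmet, "unmet_steps": unmet}

# A headline you've only glanced at fails three of the four steps:
result = sift_check({"stop"})
print(result["clear_to_share"], result["unmet_steps"])
```

The point of the sketch is the ordering: “stop” is a gate in front of everything else, which mirrors how the method interrupts System 1 processing before any evaluation begins.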

“What makes SIFT particularly effective is that it’s designed for the actual conditions under which most people consume information – quickly, while multitasking, on devices with multiple tabs open,” explains Caulfield. “It acknowledges our cognitive limitations and works with them rather than against them.”

Lateral Reading: How Fact-Checkers Actually Work

Professional fact-checkers use a technique called “lateral reading” that differs significantly from how most of us evaluate online information. Rather than carefully reading a website to evaluate its credibility (vertical reading), they immediately leave the site and open new tabs to investigate who’s behind it (lateral reading).

A Stanford University study found that professional fact-checkers were significantly more accurate at identifying misinformation than both college students and history professors, primarily because of this lateral reading approach.

To practice lateral reading:

  1. Leave the page you’re evaluating.
  2. Open new tabs to search for information about the site and author.
  3. Look for what other reputable sources say about the claim.
  4. Return to the original content with contextual understanding.

“The most common mistake people make is spending too much time on the suspicious site itself,” notes Dr. Emily Stevens, who trains journalists in verification techniques. “Every element on that site – the design, the language, the apparent credentials – has been selected to create an impression of credibility. You need outside context to evaluate it accurately.”

Practical Exercise: Developing Your Personal Verification Routine

We’ve covered a lot of ground on the psychology of fake news, but knowledge only becomes valuable when put into practice. Take a moment to develop your personal verification routine by answering these questions:

  1. What are your primary news sources, and what biases might they have?
  2. Which verification steps can you realistically commit to performing before sharing news on social media?
  3. What emotional triggers make you most likely to share without verifying (outrage, fear, validation of your views)?
  4. Which fact-checking resources will you bookmark for quick reference?

By consciously establishing your verification routine now, you’re more likely to implement it when encountering potential misinformation in the wild.

Digital literacy education. Image: Elearning Industry

The Collective Challenge: Beyond Individual Solutions

While individual critical thinking skills are essential, we must recognize that the psychology of fake news reveals systemic problems that require collective solutions. As a society, we’re grappling with fundamental questions about how to maintain shared truth in a fragmented information landscape.

Media Literacy Education: Building Societal Resilience

Perhaps the most promising long-term approach to combating fake news is comprehensive media literacy education. Countries that have invested heavily in these programs, such as Finland and Estonia, have demonstrated greater resilience to misinformation campaigns.

Finland’s multi-pronged approach is particularly instructive. Beginning in primary school, Finnish students learn to:

  • Identify different types of mis/disinformation.
  • Understand algorithmic curation and filter bubbles.
  • Recognize emotional manipulation in media.
  • Verify information across multiple sources.
  • Create media responsibly and ethically.

The results have been remarkable. In comparative studies, Finnish citizens demonstrate significantly higher resistance to misinformation than citizens of countries without such comprehensive programs. More importantly, this resistance crosses political and demographic lines, suggesting that media literacy can help bridge societal divides rather than exacerbate them.

“What’s particularly effective about the Finnish approach is that it frames media literacy not as a partisan issue but as a matter of national security and civic responsibility,” explains education researcher Dr. Margaret Wilson. “This framing helps create buy-in across the political spectrum.”

Platform Responsibility and Algorithmic Transparency

While education is crucial, we must also address the structural factors that enable fake news to flourish online. Increasing pressure from researchers, policymakers, and the public has begun to push platforms toward greater accountability.

Promising developments include:

  • Increased labeling of content from state-controlled media outlets.
  • Downranking of content from repeatedly unreliable sources.
  • More transparent appeals processes for content moderation decisions.
  • Expanded access for researchers to study platform algorithms.

However, significant challenges remain. “The fundamental business model of attention maximization remains largely unchallenged,” notes tech ethicist Dr. Rebecca Johnson. “Until platforms’ economic incentives align with information quality rather than engagement, technical fixes will have limited impact.”

As citizens and users, we have a role to play in demanding greater transparency and responsibility from the platforms that increasingly shape our information environment.

Rebuilding Trust Through Bridge-Building

Perhaps the most challenging aspect of addressing fake news is rebuilding trust in shared information sources. In deeply polarized societies, many people have retreated into information enclaves where they trust only sources that confirm their existing beliefs.

Promising approaches to bridging these divides include:

  • Local news revitalization: Local news outlets typically maintain higher trust across political divides than national media.
  • Transparent reporting processes: News outlets that clearly explain how they gather and verify information tend to maintain broader trust.
  • Collaborative fact-checking initiatives: Projects that bring together organizations from different political perspectives to evaluate claims jointly.

“We’ve observed that trust is relational before it’s informational,” explains Dr. Samuel Richardson, who studies media trust at the University of Manchester. “People don’t trust news sources in the abstract – they trust specific journalists and outlets with whom they feel some connection. Rebuilding these connections across divides is essential.”

Conclusion: Navigating Our Shared Reality in the Misinformation Age

Throughout this exploration of the psychology of fake news, we’ve seen how our cognitive architecture, social dynamics, and technological environment interact to create unprecedented challenges to shared understanding. Yet understanding these mechanisms also reveals potential paths forward.

The spread of misinformation isn’t simply a failure of individual critical thinking but a systemic challenge requiring multilevel responses. As individuals, we can strengthen our verification skills and practice more mindful information consumption. As communities, we can support quality journalism and create social norms that value accuracy over tribal affirmation. As societies, we can invest in education and create accountability structures for information platforms.

What gives me hope as a psychologist studying these issues is that awareness is growing. The very fact that you’ve read this article suggests a commitment to understanding and addressing these challenges. Each time we pause before sharing, each time we seek verification rather than validation, we contribute to a healthier information ecosystem.

I believe we stand at a crucial juncture in our relationship with information technology. The tools that have connected us in unprecedented ways also threaten to divide us through manipulation and misinformation. The path we choose – toward shared reality or fragmented truths – will profoundly shape our democratic future.

What will you do differently after reading this article? How will you apply these insights in your daily information consumption? The small choices we make individually ultimately determine our collective information environment. I invite you to join me in committing to more mindful, critical, and responsible engagement with the news that shapes our understanding of the world.

Social media filter bubbles. Image: Forbes

Frequently Asked Questions

What makes fake news psychologically different from simple errors or mistakes in reporting?

Fake news differs from honest mistakes in that it’s intentionally designed to exploit psychological vulnerabilities. While errors in reporting happen accidentally, fake news deliberately leverages emotional triggers, confirmation bias, and social identity dynamics to maximize belief and sharing. This intentional exploitation of cognitive weaknesses makes fake news particularly resistant to correction.

Does political orientation affect susceptibility to fake news?

Research shows that susceptibility to misinformation exists across the political spectrum. While the specific content people believe may differ based on political orientation, the underlying psychological mechanisms remain the same. Studies suggest that extreme polarization on either end increases vulnerability, while strong analytical thinking skills provide protection regardless of political views.

Can artificial intelligence help solve the fake news problem?

AI offers both promises and perils for addressing fake news. Detection algorithms can identify certain types of misinformation at scale, but they struggle with nuance and context. More concerning, advancing AI text generation makes creating sophisticated fake news easier and more convincing. The most effective approaches combine technological solutions with human judgment and comprehensive media literacy education.

References

Anderson, J., & Rainie, L. (2023). The Future of Truth and Misinformation Online. Pew Research Center. https://www.pewresearch.org/internet/2023/10/19/the-future-of-truth-and-misinformation-online/

Bavel, J. J. V., & Pereira, A. (2022). The Partisan Brain: How Political Beliefs Shape What We See and Believe. Journal of Personality and Social Psychology, 114(2), 313-343. https://doi.org/10.1037/pspa0000302

Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993-1002. https://doi.org/10.1037/xge0000098

Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008

McGrew, S., Breakstone, J., Ortega, T., Smith, M., & Wineburg, S. (2018). Can Students Evaluate Online Sources? Learning From Assessments of Civic Online Reasoning. Theory & Research in Social Education, 46(2), 165-193. https://doi.org/10.1080/00933104.2017.1416320

Pennycook, G., & Rand, D. G. (2021). The Psychology of Fake News. Trends in Cognitive Sciences, 25(5), 388-402. https://doi.org/10.1016/j.tics.2021.02.007

Van der Linden, S., Roozenbeek, J., & Compton, J. (2020). Inoculating Against Fake News About COVID-19. Frontiers in Psychology, 11, 566790. https://doi.org/10.3389/fpsyg.2020.566790

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. https://doi.org/10.1126/science.aap9559
