
Disinformation vs. Misinformation: A Psychological Primer

Produced for @empowervmedia
Edited & fact-checked by @jorgebscomm

[Image: a rolled-up newspaper with a smartphone and tablet on top, both displaying digital news sites, contrasting traditional print media with modern digital platforms.]
Emotions such as fear and outrage make us more susceptible to false news. (📷:universityofcambridge)

Before diving into the psychology, we must define our terms. Misinformation is misleading or false information that is spread, regardless of intent; it covers everything from honest mistakes to urban legends. Disinformation, in contrast, is false information created and shared deliberately to mislead or manipulate. In practice, experts often treat disinformation as a subset of misinformation, because the originator’s intent can be hard to establish. For example, a viral hoax about a miracle cure might be pure misinformation (someone misunderstood a fact), while an organised campaign spreading false health scares to sow panic would be disinformation. In either case, the core problem is the same: the information is false and can harm people and society.

'Maria Ressa on the difference between misinformation and disinformation' ▶️0m28s

Why the Brain Falls for False Information

At a basic level, humans are not perfectly rational computers. We use shortcuts (heuristics) and have biases that make us susceptible to false information. One major bias is confirmation bias, where we naturally pay more attention to information that confirms what we already believe. In practice, this means people tend to seek, interpret, and remember evidence in ways that favour their preexisting views. As a result, we may ignore or downplay facts that conflict with our opinions. This selective thinking serves to reduce mental discomfort: our minds unconsciously try to avoid cognitive dissonance (the uneasy feeling when we hold two conflicting ideas). In other words, we are “blind” to contradictory information in order to ease that inner tension.

[Infographic: “Confirmation Bias”, defining the bias as the tendency to seek information that confirms existing beliefs, noting Peter Wason’s 1960 experiment that coined the term, and discussing its impact on societal views and the need to consider alternative explanations.]
(📷:academy4sc)

Another related tendency is motivated reasoning, where we scrutinise disagreeable ideas more harshly than agreeable ones. We may even perceive connections between unrelated events (illusory correlation) to fit our narrative. For example, someone who believes vaccines are dangerous might mentally link an unrelated illness to a vaccine shot, simply because it happened around the same time. When false ideas support a person’s attitudes, these biases make people resist corrections. Put simply: once our mind latches onto a belief, our biases lock in, making it hard to let go.

Memory plays a role too. We build mental models or stories to make sense of events. If misinformation fits into that story, our brain can treat it as real. Even after a later correction, the false detail may remain in memory because it made the original story coherent. This is known as the continued influence effect: debunked information still “lingers” in our thinking. For example, people who heard a false news story about a fire might continue to believe untrue details even after learning the report was wrong, simply because those details helped explain the event. Corrections often fail to erase the false version; in fact, repeating the myth (even to debunk it) can make it feel more familiar and therefore truer.

Our brains also prefer “fluent” information. In plain terms, the more often we see or hear a claim, the easier it feels to process, and the more believable it seems. Repetition builds a sense of familiarity. Unfortunately, misinformation tends to be shared widely and repeated, which exploits this familiarity shortcut. Meanwhile, carefully worded factual corrections can be harder to digest and may even provoke scepticism. In short, effortless thinking often wins over careful analysis, a tendency psychologists call “cognitive miserliness”.

Social Forces

False information doesn’t just catch individuals; it spreads through social networks and communities. People naturally group with those who share their beliefs. When like-minded people reinforce each other’s views, group polarisation occurs: after discussion, the group’s consensus becomes more extreme than the positions its members started with. In echo chambers or filter bubbles (often created by social media algorithms), people mostly encounter information that agrees with their group. This isolates them from counter-evidence and strengthens shared biases. In these spaces, dissenting facts feel like threats, so open debate is discouraged. This is the classic “groupthink” dynamic, where groups prioritise unity over truth.

Social identity also matters: once we tie our beliefs to “who we are” (politically, culturally, or socially), admitting we were wrong feels like admitting a part of our group identity was wrong. For example, if someone strongly identifies with a political party, they may reject any news that makes their party look bad, even if it’s true. Surveys show that almost half of people worldwide say social media or familiar influencers are major sources of misinformation, reflecting how falsehoods can spread in communities. When people see others in their group endorsing a false claim, social pressure makes it harder to back down (we want to stay loyal to our group). In this way, the social context amplifies cognitive biases.

Emotional Triggers

Emotions give misinformation extra power. Humans have an innate negativity bias: we pay more attention to threats and bad news than to neutral or positive news. In fact, studies find that news and social media posts with negative or fear-inducing language are shared far more often than positive ones. One recent analysis of millions of social media posts showed that people were nearly twice as likely to share a news link if it had a negative tone. The reason is evolutionary: our brains evolved to scan for danger, so an alarming headline catches our eye. This means disinformation often uses scary or angry language on purpose.

Strong emotions grab our attention. Disinformation exploits this by pairing false claims with vivid, frightening visuals or stories. Research shows that emotionally charged language (words that trigger anger, fear, or disgust) helps a message “grab” the brain and stick there. Negative headlines get more clicks, and fear or outrage makes us want to share immediately, before we think critically. Thus, misinformation that scares us can override our better judgement. Scientists call it “inattention to accuracy” (the inattention-based account): our heart rate and emotions peak, but our fact-checking goes down.

Emotions can also colour how we evaluate facts. If a false story makes us angry or anxious, we may mentally justify believing it, because it fits how we feel. Conversely, when factual corrections try to calm us with logic, we may resist if we still feel threatened. The combined effect of negativity bias and strong emotions is to cement false beliefs. One review notes that personal beliefs and attitudes are often shaped more by emotional state and media tone than by cold facts. This helps explain why sensational false news can seem more “true” than dry corrections.

Information Overload and Attention

We live in an attention economy, with a deluge of news 24/7. This environment favours the catchy over the accurate. People are bombarded by hundreds of headlines daily, so it’s natural to skim or click the most striking ones. Fast, emotional content wins the race for our limited attention. In such an environment, selective exposure takes over: people tend to watch or read only what seems immediately relevant or agreeable. This amplifies all the biases we’ve discussed: we latch onto familiar, confirming ideas and scroll past anything complex or challenging.

Importantly, the mental effort required to sort fact from fiction is high. Deep analysis is slow and exhausting, so our brains often default to shortcuts. This “cognitive laziness” means we might remember a catchy headline but forget to check its source. Over time, repeated exposure to the same false claim across different channels makes us feel we’ve “heard it everywhere”, which falsely signals credibility. Studies show that once a false story spreads widely (even among experts), it can stick: repeated exposure and emotional attention make people remember it later as if it were true, a phenomenon studied under the “continued influence” and “illusory truth” effects.

Strategies and Solutions

While the psychological hurdles are many, researchers and communicators have identified ways to fight back. One effective approach is media literacy education. By teaching people about cognitive biases and fact-checking, we make news consumers more aware of the pitfalls. For instance, knowing about confirmation bias and the continued influence effect can prompt readers to pause and verify before sharing. Schools and community programmes that encourage people to question sensational claims and cross-check sources have shown promise.

Another strategy is inoculation (or prebunking). This works like a vaccine for the mind: before exposure to actual false news, people are given a small “dose” of misinformation tactics, making them more resistant later. For example, a short warning video might explain how a deepfake works, so that when viewers later encounter a doctored video, they are more sceptical. Studies suggest inoculation can significantly reduce the persuasive power of misinformation.

Fact-checking and credible journalism are also vital. Surveys find that many people rely on trusted news sources or fact-check sites to verify dubious claims. One global poll showed 38% of people will double-check a claim with a “news source I trust”, and 25% use dedicated fact-checkers. This suggests that strengthening independent media and fact-checking organisations can help. Journalists can contribute by providing context and calling out manipulative tactics (e.g. explaining why a viral claim is false) in clear, non-technical language.

Finally, some technology firms and platforms are exploring solutions: algorithm tweaks to reduce echo chambers, warning labels for questionable content, and redirects to reliable information. These interventions can help break the cycle of repeated exposure. Still, experts agree that no single fix is perfect. Scholars emphasise a multi-pronged response: better education, responsible tech design, and sustained journalistic rigour are all needed. Over time, each small improvement (more scepticism, a well-placed fact-check, a diverse news feed) can help inoculate society against the flood of falsehoods.

[Illustration: a man in a suit and surgical mask sprays disinfectant at an overflowing pile of newspapers marked “FAKE NEWS”, symbolising efforts to combat the spread of misinformation.]
Researchers and communicators have identified ways to fight back against disinformation and misinformation. (📷:BBC)

Disinformation and misinformation are not just “problems on the internet”. They tap into deep psychological tendencies. Our biases for simplicity, consistency, and emotion give false information a natural advantage. We may feel “smart” for reading news all day, but in reality our brains often default to the easiest narratives. Acknowledging this isn’t pessimistic; it’s empowering. By understanding how our minds work (why we share an angry headline without thinking, or stick to our party line despite evidence), we can begin to correct course. In a connected world, vigilance is key. A doubting question (“Is this really true?”) is one of our best tools. Because the truth is often complicated while lies come wrapped in emotion, audiences must do extra work. But with clear awareness of how confirmation bias, cognitive dissonance, groupthink, and fear affect us, ordinary people can become their own first line of defence.
