It’s hard to wrap your head around at first. How can anyone really believe the kinds of things we see spreading online? The conspiracy theories, the blatant falsehoods, the twisted narratives. It’s easy to assume people must be misinformed, manipulated, or just not thinking clearly.
But what if the truth is more complicated and more dangerous?
What if these lies don’t just fool people... what if they become people? What if they shape identity, not just opinion?
This is the uncomfortable reality we’re facing. Disinformation today isn’t just about making people believe false things, it’s about forging a sense of belonging. It’s about anchoring people in a worldview that feels safe, righteous, and morally unshakable. It gives people villains to blame, heroes to worship, and meaning to hold onto.
And once that belief becomes who someone is, trying to "correct" it doesn’t just fall flat, it feels like an attack. A challenge to the story they tell themselves about the world, their tribe, and their place in it.
If you’ve ever wondered why people cling so tightly to ideas that seem so clearly false, this post is for you. We’re going to unpack how disinformation fuses with identity and, more importantly, how we can start to reach people who’ve been pulled into the fog.
Humans are wired for connection. Long before we needed to be right, we needed to belong. And that’s the foundation disinformation exploits, not reason, but identity.
When people adopt false beliefs, it’s often not because they evaluated the facts and came to the wrong conclusion. It’s because those beliefs say something about who they are and where they belong. A false narrative about election fraud, for example, isn’t just a belief, it becomes a badge of loyalty. A test of group membership. A declaration of which side you're on in a divided world.
This is what researchers call identity fusion: the point where a belief becomes inseparable from the self. It means challenging the idea feels like challenging the person. And once someone is in that space, even the most carefully crafted fact-check won’t break through. In fact, it might backfire.
Disinformation thrives in this terrain. It binds people together around shared fears, grievances, and enemies. It offers clarity and moral simplicity in a chaotic world. And once someone is anchored in that identity, they’re not just defending a position, they’re defending their tribe, their values, and their sense of self.
So, if you’re wondering why facts don’t seem to work, this is why. The disinformation machine isn’t playing a game of logic, it’s playing a game of loyalty.
Most people, when confronted with someone who believes a wild conspiracy or a flat-out lie, default to the same instinct: just show them the truth. Send the article. Link the fact-check. Drop the stats.
<aside> 💡
It rarely works. And often, it makes things worse.
</aside>
That’s because most of us approach these moments with what psychologists call a “soldier mindset”: arming ourselves with facts, ready to defeat the “enemy” argument. But if the person you’re talking to is acting out of fear, loyalty, or deep emotional investment, you’re not in a battle of facts. You’re in a relationship. And they’ll fight to defend it.
So what are the most common mistakes?
Flooding them with information. Facts alone don’t shift beliefs rooted in identity. People filter facts through their feelings and group loyalty. If the facts threaten their worldview, their brain doesn’t update, it goes to war.