Warped Media: When Truth Competes With Identity
Are you finding it harder than ever to decide what’s real and what isn’t?
Depending on which TV channel you watch or which social platform you scroll through, the same story can sound completely different. Facts blur into opinions. Commentary disguises itself as news. And somewhere along the way, confidence replaces accuracy.
I started noticing this shift a few years ago, when social platforms began using algorithms to determine what people see and, just as importantly, what they don't. Some voices were amplified. Others were buried. In certain cases, people were blocked entirely for expressing views that didn't align with those in control.
Yes, these platforms are privately owned, and their owners have the right to decide how they operate. But for years, many of us believed social media existed as a space for open exchange, a modern public square where ideas could be shared, challenged, and refined. That belief no longer feels accurate.
I’ve always valued free speech. I don’t need to agree with every opinion to believe it deserves to be heard. I also believe people should be free to practice their faith openly, without fear of discrimination or harm. The ability to hear many perspectives, think critically, and arrive at our own conclusions is essential, not optional.
What I don’t support is the intentional spread of false information, especially when it shapes decisions that impact real lives.
And that’s where the tension lives.
The problem isn’t just biased media. It’s us.
Even when we suspect a source may be inaccurate, we’re far more likely to support it if it sounds like us, if it echoes our beliefs, reinforces our identity, or confirms what we already think is right. Truth becomes secondary to alignment.
That’s the hardest part to confront.
We want to be right. We want our views affirmed. And when a voice tells us what we want to hear, even if it bends facts to do it, resisting that voice feels almost impossible. In many ways, media no longer informs us; it validates us.
So what’s the solution?
I don’t have a perfect one. But I do know this: if we truly care about truth, we have to be willing to reject information that is false, even when it agrees with us. Especially when it agrees with us.
The question we should be asking isn’t, “Does this support my beliefs?”
It should be, “Is this true?”
Until we’re willing to choose truth over affirmation, no algorithm, platform, or policy will fix what’s broken.