The New Accusation: “This Must Be AI”
- Helen Yang
- 3 days ago
- 3 min read
OPINION PIECE DISCLAIMER
The following article is an opinion. That means it's not a court ruling, a holy text, or your mom’s advice. If it upsets you, don’t worry, your rage isn’t legally binding. Side effects may include thinking critically, questioning reality, or yelling into a void. Proceed accordingly.
SwampWire is not responsible for hurt feelings, moral panic, or any sudden desire to fact-check.

There was a time when people encountered an article they disagreed with and their instinct was to argue. They would write rebuttals, leave long comments, or even craft essays of their own. Today, that instinct is fading. Instead of grappling with ideas, people reach for a shortcut: they declare the writing “AI-generated” and move on.
At first glance, it sounds harmless, almost funny. But the psychology behind this habit is worth unpacking, because it reveals more about readers than about the writing itself. When someone labels a piece of work “AI-generated,” what they are really saying is, "I don’t want to deal with this." It is a shield against discomfort. Psychologists call that discomfort cognitive dissonance: the unease we feel when our beliefs are challenged by something persuasive. In the past, people worked through that unease by debating or refining their own perspective. Now, instead of wrestling with the tension, they dismiss the entire conversation by questioning the authenticity of the text. If it’s not human, it doesn’t deserve engagement.
This reflex also speaks to projection. Most people already consume AI-written text daily without realizing it: emails drafted by autocomplete, shopping ads generated by algorithms, even short news blurbs repackaged by machine learning tools. When they encounter writing that feels polished, too smooth, or simply out of step with their worldview, they project suspicion onto it. The result is a kind of paranoid literacy, an assumption that clarity or confidence in language must mean a machine is speaking.
Emerging news outlets are hit hardest by this suspicion. Without the brand recognition of legacy media, they are easy targets for accusations of inauthenticity. Instead of evaluating the strength of the reporting, readers dismiss it out of hand: This can’t be real journalism; it’s just AI spam. In that sense, “AI-generated” has become the new “fake news,” a phrase that lets readers avoid the work of engaging with ideas.
There is also a deeper emotional layer at play: the fear of being duped. Nobody wants to feel tricked, especially in an era where misinformation spreads quickly. Dismissing something as AI-generated provides cover. If the story turns out to be wrong or controversial, the reader can preserve their pride by saying, "I never believed it anyway; it wasn’t even human." It is less about evaluating truth and more about protecting ego.
The irony is that this habit, meant to spare the reader discomfort, actually ends the conversation. When people refuse to engage with content because they’ve decided it must be artificial, they close the door on dialogue. The exchange of ideas, the messy, uncomfortable, essential part of human discourse, gets replaced by a shrug and a dismissal.
In the end, calling something “AI-generated” has little to do with detecting machine output and everything to do with managing discomfort. It is a psychological defense mechanism, a way of sidestepping persuasion, vulnerability, and even curiosity. But if every disagreement is brushed aside under the assumption of artificial authorship, then the risk is not just to journalism or creative writing; it is to our ability to converse, to challenge one another, and to grow from friction.
We are entering a strange phase of the information age where suspicion is the default and dialogue is the casualty. The tragedy is that much of the work being casually dismissed as robotic is still very human, still thoughtful, and still worth debating. But debate requires vulnerability, and vulnerability is harder than denial. It is easier to wave away a voice than to hear it.