Debunking Debunking [Trenchant Edges]
Estimated reading time: 14 minutes, 22 seconds. Contains 2875 words
Welcome back to the Trenchant Edges, a newsletter about outbreaks of weird bullshit in the culture.
I’m Stephen, your beloved host, and guide in our confusing world.
Since our last chat I, uh, kind of went back to college. I’m now within a year of getting a BS in “General studies”, a Frankenstein degree made of behavioral health & psychology minors
Things are a bit of a mess, but this will probably give me more time to work on this since I won’t have to grind for rent as much.
I think I’ve cleared out the weird headers from this email, but I’m tired and I’m not going to check so if you get hit up for money before this let me know.
Anyway, I’m just going to send this out so let’s get started. Typos or wonky sentences are added as a courtesy, free of charge.
Notes On The Culture of Debunking
I cut my teeth on arguing about religion online in the early 2000s. Yes, I was one of those New Atheists.
But we grow and inevitably look back on our past with some… skepticism about our choices. This is normal and healthy and good.
What I really mean is that I was very invested in the concept of Debunking. I thought of myself as a nascent hard-nosed realist, reading big-boy books by war criminals like Henry Kissinger and their cheerleaders like Ayn Rand.
So as a good materialist, while I was interested in paranormal shit, I also applauded the usual suspects like James Randi and names I will not dignify here. I was into UFOs, conspiracies, and all that shit. Which brings with it a certain interest in claims of superhuman abilities.
But I wanted, uh, superpowers that worked. Like, what’s the point in feigning psychic powers? The best you can hope for is conning yourself and others and that’s no way to live.
This brings us to debunking: I’ve always figured that if something is true, it can take as much scrutiny as you can muster. If something collapses when you look at it the wrong way, it’s probably not real.
Now that’s… not entirely true. Many real things are simply fragile or highly contextual. This is true in literal, physical terms with high atomic number elements that only last a fraction of a second and a bunch of weird quantum effects.
But it works more often than not.
And this is the heart of debunking, right? Applying some scrutiny to upend a claim.
That’s the *stated* intent. But the actual social reality of debunking is more complicated.
Information is meaningless outside of a context, and the context for debunking is gatekeeping the boundaries between ingroup and outgroup.
We’re going to go considerably deeper into this later, but beliefs are about more than just mapping the world, they’re about coordinating people into and across groups.
They’re how we signal who we are to other people who are similar.
And debunking, though most often named in atheist/skeptic, anti-misinformation, or anti-conspiracy contexts, is a pretty basic function of maintaining social group cohesion in social spaces where multiple social realities are in play.
We can see this perhaps most clearly in the climate change advocates vs climate deniers “debate”, where both groups accuse each other of doing bad science and of deferring to authorities instead of data, and try to debunk each other’s points. But it happens all over the place.
Much of the right-wing anti-woke campaign is an attempt to roll back social change by debunking the ideas that they don’t like.
Debunking works in two ways: If accepted it reinforces the narrative of the debunker and degrades the narrative they’re pushing against.
What this means in practice is debunking is as much about trading social validation as it is anything else. So partisans on this or that side will pass around debunks to armor themselves against the arguments made by the other side.
It conditions people into an us vs them mindset that extends beyond the single point. It makes people look at the entire side as unreliable and believing gibberish.
We’re all susceptible to this kind of thinking. Here’s a moment I had recently:
Anyway, I posted this without noticing that it was a Reddit post copied to Twitter, so he wasn’t just sharing the Nazi-ass quote from the Holocaust denier, but the joke at the end.
Does that mean it’s OK? Not really. But I went after him for only the quote when that’s not all he posted.
It’s a real premium content moment for Elon: His far-right fans get the signal, and most of the criticism against him isn’t well-aimed so it’s easy to dismiss.
And it’s not like Elon hasn’t been openly supporting fascists like Ron Desantis, so *shrug*
Point is, debunking isn’t just about facts and isn’t always or really even usually done in good faith.
This is how a ton of the “political polarization” has happened in the US and it’s very beneficial for the system to have a third of the population fighting with another third while the rest are just trying to get by.
But that’s a different topic.
At its most basic, debunking is just showing someone that some claim isn’t true.
Be that exposing holes in its logical structure or demonstrating a different effect, it’s a handy tool in one’s arsenal.
But there are several layers of problems here.
The biggest issue is what I think of as knowledge asymmetry.
The problem with that asymmetry is that nonbelievers are poorly placed to understand the stuff a belief system is made of, because they can only observe believers’ actions.
(Yes, this is a reference to the post on solipsism)
If I say something like, “This is an evil religion because of x, y, and z,” then even if that statement is provably true, no believer could accept it.
No matter how obvious or true the proofs are.
Why? Because believers build up years of accumulated layers of interconnected psychic structures around their beliefs.
It’s not simply a logical structure you can bowl over with some words.
It’s every nice or cruel thing someone related to those beliefs ever said or did. It’s support in a crisis or the feeling of satisfaction when you showed up to help.
This is why proofs that God doesn’t exist don’t really convince anyone.
Believers exist in a social network and, if they’re in a relatively healthy one, there’s going to be so much more than simple logical content tied up in that belief.
The smell of baked goods, the passion in the choir’s singing, a thousand thousand other sense details create a complex emotional geography.
Imagine thinking you can leverage that out with a bit of prose mocking logical inconsistencies.
Doesn’t make a lot of sense.
Doesn’t really matter what the nature of the belief is, as long as it’s embedded in a human social network it’ll end up meaning a lot more than merely the logical content of the belief written out.
And merely being ineffective isn’t even the main reason to reject this kind of behavior. See, people take attacks on their beliefs as attacks on themselves. Because the self is primarily made up of beliefs and attachments.
This is also much of the reason why being provided with “correct facts” often causes people to double down on their inaccurate beliefs.
There’s a mess of research on this subject and the findings are pretty consistent. Contradicting people, even when they’re objectively wrong, usually causes them to double down.
Direct persuasion usually raises people’s hackles; they get defensive, and the most interesting thing about defensiveness is that you usually can’t recognize yourself doing it without practice.
Even with practice, it’s easy to get caught up in the moment.
So, where does that leave us?
The Ground of Persuasion:
It’s really important to start with a couple of basic facts:
Wrongness is pervasive.
Wrongness is mostly trivial.
Some wrongness is extremely important some of the time.
Wrongness predates information technology.
Information technology amplifies the spread of all information, but unevenly.
What the fuck do I mean by all this? Simple, most of the time we’re inaccurate about most things. But relatively little information actually matters that much.
Memory is highly fallible, and there’s considerable evidence that remembering something actually modifies it. So memories tend to drift over time.
But even potentially deadly mistaken information is only a problem if it comes up. Confuse a mushroom that gets you high with one that kills you immediately? Not a problem unless you’re eating the relevant mushrooms.
So even the majority of really important information is only important when it’s relevant.
Thus, through the regular action of our minds, we generate incorrect information and rarely have direct need to update it.
Thus does “misinformation” thrive and persist.
As far as I can tell this isn’t a fixable error and goes back as far as we have stories.
Now, media from the written word down to TikTok makes this problem worse by unevenly amplifying information and extending the range, through time and space, over which it can be accessed.
Even true information is often contextual and time-bound.
Let’s take an old example: Hammurabi’s Code of Laws, preserved on a diorite stela nearly 4,000 years old. If you were living in Babylon in about 1740 BCE, you would have every right to consider it an accurate, factual thing.
But if you break a wall to rob a place today, you’re probably not going to be sealed inside a wall as punishment.
Information is time and space bound. It decays if not renewed.
Media lets fewer people’s voices reach an ever-greater number of people.
But once you have a system where one person can reach billions, you’ve got a situation where you need to figure out who will listen to what.
Initially, it was whoever could afford the big machines and the organizations to run them (broadcast media), but now it’s whoever pays the platform holders who own those machines enough (social/platform media).
The whole of culture is becoming the cultic milieu.
This is Terence McKenna’s Balkanization of Epistemology. The Post-Truth World of Alternative Facts.
We need to understand the terrain of persuasion before we embark on it.
This is the world we live in.
Though, personally, I think we’re more Pre-Truth than Post-Truth, as the gatekeeper-focused broadcast media did plenty of lying.
So that’s the external conditions.
What about a person’s internal state?
Well, we can only really generalize here, aside from noting that there’s widespread and growing stress on individuals.
People are, frankly, more media savvy and media literate than ever.
Early advocates of media literacy thought they just needed to teach people to avoid bad information to escape the problem. But they underestimated the fundamental delusional nature of the human mind.
In college, I read the book Making Up the Mind: How the Brain Creates Our Mental World by Chris Frith, and it makes a rigorous case for what I’m hinting at here: our experience of the world is a kind of hallucination, a best approximation of the world outside the mind based on currently accessible information, with processing structures lagging a bit behind the real world.
Let’s take an example. In the novel Jurassic Park, the dinosaur park island is covered by a motion sensor network with 92% coverage. This is extremely good, one imagines.
But that relatively small 8% gap turns out to be a critical flaw, introducing errors and lost information, because they didn’t bother to put motion sensors near the river, where the noise would make them useless.
All well and good until dinosaurs leave their enclosures and tourists attempt to use the river to get back to the visitor’s center.
The mind is similarly mostly reliable. And most of its failures don’t matter. This leads us to overestimate its reliability which can lead to unpleasant consequences.
Usually not dinosaur mauling, thankfully, but often equally unpleasant issues.
The point is errors accumulate and unless you’ve found an edge to work against, it can be very hard to notice.
Empiricism (trying things out and seeing what happens) is such an edge, but it’s limited by speed of execution and expense. To get really reliable results takes a long time and by that point, those results could be irrelevant.
Instead Of Debunking
I once heard Travis View of the QAnon Anonymous podcast describe debunking as a party trick. A bit of rhetorical flourish to make the claims it’s targeting seem absurd.
That’s not to say it’s invalid or useless. It’s just not going to move someone deep in belief out of it. They’ll rationalize it, motivated reasoning trumping any inconvenient facts.
It’s easy to think that only people we’d call delusional think that way, but that’s just how humans think. Beliefs are tools, mostly for social organization: locating our ingroup and dividing the outgroup from them.
So what can you do if you see someone heading towards dangerous beliefs?
Well, you probably can’t stop them, so that’s important to understand up front. People change their beliefs for emotional reasons that are compelling to them, no matter how little those decisions make sense to you.
We’re rarely in charge of our beliefs or how they change. Attachments fray or we find a flaw we can’t ignore and we start moving.
Suppressing cognitive dissonance is only a patch.
So with that in mind, my first suggestion is… maybe don’t focus on changing their mind. For one thing, they’re likely to have much more domain-specific knowledge than you do even if it’s wrong.
And the second is that entering a conversation with the intention to persuade is kind of dismissive of their agency and runs a high risk of burning trust with them.
Better is focusing on understanding what they believe, why, and most importantly what they’re trying to do or solve with their new beliefs.
From there you’re in a much better position to talk with them and see if the direction they’re headed is actually useful to the goal they want.
Often people pick extreme beliefs because they want to solve a problem those beliefs don’t actually address.
But understand the process is likely to be slow and not work.
Usually, extreme shifts in belief start with a change in who the person trusts, and they probably have good reasons for that change, since distrust usually starts with a recognition of hypocrisy or betrayal. You almost certainly can’t repair that trust.
Like, plenty of people started supporting Donald Trump when they saw the media treating him like a shit person (not noticing how much they helped him).
Few of those people, even those who no longer support Trump, feel the trust they did in the media before 2016.
A ton of the reason the last decade has been so politically divisive is that people on different sides stopped trusting each other, then started interpreting the world differently, and pretty soon it’s like they live on different planets.
And trying to travel to someone else’s planet is often pretty hazardous.
Again, beliefs are about social groups as much as anything else. Affinity vs aversion. So if you are pushing towards what they’re moving away from… well, you can’t always make a connection or keep one open.
But what if you want to do this on a bigger scale? If you’re “creating content”, you’ve got a more complex issue, since your audience will shift as people’s affinity with you usually mirrors their affinity with your views and presentation style.
I’ve found the least bad way to do this is to avoid judgment, hammering talking points, or demonizing. I say least bad because I haven’t seen more than a handful of people really change their vector from my work. Maybe 10 or 20 in the last 5 years.
I’ve seen far more already start to shift their PoV and then pick up breadcrumbs I’ve left behind.
Your mileage will vary.
My overall point here is that beliefs are more relational between people than accurate maps of reality. Most are “accurate enough” by which I mean “not very accurate at all.”
But belief systems mostly collapse from internal structural failure rather than from social pressure. And that’s good: we don’t actually want people believing stuff because they’ll face social punishment for saying they disagree.
Compliance isn’t understanding.
As I think I said earlier, debunking is more a performance towards an ingroup to reassure them they’re on the right side than an attempt at communication across group lines.
But it’s not enough. It will not take us past our current degrading feedback loops.
Alas, the only alternative I know is slow, trust-based conversations that can only really take place between two people acting in good faith. They’re frustrating, time-consuming, and if you describe them in terms of “effectiveness” they sound like they don’t work.
Media-tion, literally mediated by media, creates competing swarms of individuals alienated by the fact that their organization is some mechanized third party and not the other people in their life.
But without media we can’t really organize at the scale of billions of people, and the world is now too complex to forgo that organization without risking many of those lives. Yet the way we’re currently organizing dooms many other lifeforms on this planet.
Anyway, we’ll be seeing you soon. Things have been busy, but not *that* busy.
See you soon.
Without your sacrifice of time and money, this newsletter wouldn’t be possible. If you dig it, pass it back.