Welcome to the Trenchant Edges, a weekday newsletter where we explore the detritus of fringe culture and see if any of it is still worth understanding.
Estimated reading time: 4 minutes, 31 seconds. Contains 904 words
I’m Stephen, your host, and today we’re noodling on one of the better-known McLuhan quotes, because the newsletter I tried to write rests on a lot of military theory that I’m way out of practice with. I know, you’re surprised I used to be a war nerd.
But you can’t really talk about Vlad Surkov without at least mentioning John Boyd and noted piece of shit William Lind.
Now, this idea wasn’t invented by McLuhan and has a complicated history we’re going to ignore. You can read about it here.
The principle here is that content becomes expected and then invisible. Part of context.
Manipulating this is the point of propaganda, especially appeals to common sense (which often means popular biases). If it’s simply known that X is Y, then no discussion on the subject is required.
This has immediate utility for truths: If X actually is Y, then nobody has to haggle over fundamental assumptions or metaphysics. Saves a ton of time.
But if X isn’t Y, or if Y has some subtle or counterintuitive implications, or, more commonly, if phrasing X as Y hides some abusive power relationship, there may be excellent reasons to criticize it.
And anyone who wants to have that conversation will have to fight uphill even to start it, let alone arrive at any conclusion.
I first understood this when, in the wake of 9/11, I opposed George W. Bush’s build-up to the Iraq war. I fancied myself an intellectual at the time and had read a smattering of the easier books on politics and warfare, including Machiavelli’s The Prince.
And I’d watched enough History/Hitler Channel to have read Hermann Göring’s quote on bringing the people into a war fever.
“Of course the people don’t want war. But after all, it’s the leaders of the country who determine the policy, and it’s always a simple matter to drag the people along whether it’s a democracy, a fascist dictatorship, or a parliament, or a communist dictatorship. Voice or no voice, the people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked, and denounce the pacifists for lack of patriotism, and exposing the country to greater danger.”
All this felt very familiar in early 2003. As it does today.
There’s a concept in politics that tries to operationalize this, called the Overton Window, wherein popular ideas crowd out discussion of unpopular ideas, which are always trying to claw their way into the mainstream.
Much propaganda is about moving the Overton Window so your more extreme positions become popular.
If you then overlay Noam Chomsky and Edward S. Herman’s propaganda model from Manufacturing Consent, you’ll see that the Overton Window is less a neutral arbiter of popularity or acceptability and more the result of a vigorous cultural immune response that keeps out dangerous ideas.
You can also mix in the basic concept of relativity to see why so many people are able to accept statements like “Joe Biden is a communist” and “Donald Trump is a centrist Republican.” Outside the circles of those paid to believe such things, the reason these claims aren’t laughed at is that they’re weighted by perspective.
So if you’re to the right of Donald Trump and you’re not really looking closely, Joe Biden, Bernie Sanders, and, say, Fred Hampton all pretty much look the same.
Whereas if you’re far to Joe Biden’s left, it becomes hard to make out the real differences between him and Donald Trump.
All of this feeds into context.
If you get all your information from sources who share biases, those biases will simply feel natural to you.
Sidebar: This kind of epistemic blind spot is much of what’s meant by the phrase “Check your privilege” and by privilege theory. It’s a shorthand for, “Hey, your experience probably hides or devalues a lot of important information on this topic, so you’re not really informed enough to have the opinion you’re stating.”
So, much of cultural conditioning is about creating these networks of prior assumptions. As we said before, they’re important for allowing someone to function within a cultural network, but they also blind a person to some information.
As distinguished war criminal Donald Rumsfeld once put it:
“There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns, the ones we don’t know we don’t know.”
The real world is complex enough that you can easily stumble through all of these boxes on a single issue, let alone big topics.
Generally, we need some kind of edge to push against what we know, to check it against the real world. But, frustratingly, the tool you choose determines much of your results, whether it’s a microscope or the idea of materialism.
For example, if I were to look at some batch of cells under a microscope, I’d be confused and frustrated that the slides in front of me don’t use the same pigments as the example slides. And I didn’t think of studying multiple different images of the same thing to correct for this until just now, so I’m probably going to fail another bio practical.
OK, that was a little too specific.
But my point is still simple: How you look determines what you see.
The least bad way out of this is something I picked up from Literary Analysis: Interpretive lenses.
And that’s about a wrap for today. We’ll be seeing you tomorrow for the McKenna Invisible Landscape comparison.
Won’t that be fun?
-SF