Operant Conditioning and the Cybernetic Hacking of the Human Mind

Society has never more literally been in your head.

Welcome Back to Trenchant Edges, where we take deep dives into fringe culture for fun and prophet.

Estimated reading time: 4 minutes, 7 seconds. Contains 826 words

Last night I left my phone in a friend’s car, so I’ve been taking this rare opportunity to explore my relationship with my technology. Not loving how I feel.

A computer is a kind of philosopher’s stone, where one’s intention melds with the deep mysteries of the universe to create whatever effect one wills.

Naturally, it’s highly addictive.

Western culture has operated mainly on a false assumption of the division of mind and body (and perhaps spirit) into separate and irreconcilable spheres since Descartes. Materialist science and neuroscience have been working on untangling this delusion from the materialism side for the last century.

Personally, I’ve found the opposite approach more beneficial. But that kind of Occultist thinking comes with some weird strings and maybe more complications than it’s worth. So I want to explore today with limited metaphysics, but with the most basic organs of consciousness.

Disclaimer: I cut a bunch of stuff here so if this next section doesn’t quite make sense, roll with me.

How B.F. Skinner Ruined the World

One of the theories on how to create stable neural connections is called conditioning. Simply by consistently pairing two unrelated things enough times, you can make a person or animal link them together. I’ll skip the usual talk about Pavlov and his, uh, maybe-not-super-ethical experiments with dogs and bells here.

The important guy for this is B.F. Skinner, America’s preeminent Behaviorist. Skinner discovered the mechanics of how many kinds of addiction work: if an action has a clear relationship between cause and effect, you get mostly responsible behavior. But if a cause has a sufficiently messy range of effects, you can generate addictions on demand. It’s called intermittent reinforcement.

This is something abusive romantic partners have always known: the wider the range of emotions someone feels and the less they can predict what they’ll get, the harder it will be to break out of the bad relationship.
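The difference between the two schedules is easy to see in simulation. Here’s a minimal sketch (the function name and numbers are mine, purely illustrative, not anything from Skinner’s actual experiments): a “fixed” lever pays out on every Nth pull, while a “variable” lever pays out with probability 1/N on each pull. Over many pulls the total payouts are about the same; the variable schedule is the one that hooks you, because no single pull tells you anything about the next.

```python
import random

def pull_lever(schedule, pulls=10_000, reward_every=5, seed=42):
    """Simulate a Skinner-box lever under two reinforcement schedules.

    'fixed'    -> reward arrives on every Nth pull (fully predictable).
    'variable' -> each pull rewards with probability 1/N (unpredictable);
                  this is the variable-ratio schedule behind slot machines.
    Both deliver roughly the same number of rewards overall.
    """
    rng = random.Random(seed)
    rewards = 0
    for pull in range(1, pulls + 1):
        if schedule == "fixed":
            rewards += pull % reward_every == 0
        else:
            rewards += rng.random() < 1 / reward_every
    return rewards

fixed = pull_lever("fixed")        # exactly 2000 rewards in 10,000 pulls
variable = pull_lever("variable")  # close to 2000, but never on a schedule
print(fixed, variable)
```

Same average payout, wildly different psychological grip: the fixed lever is a vending machine you can take or leave, while the variable one keeps you pulling because the next reward could always be one pull away.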

Did I just call BF Skinner and his entire intellectual tradition abusers? Yup.

And their techniques form the basis for the kinds of computers that now suffuse life in our modern dystopia.

The Skinner box is a kind of torture device where all stimulation is controlled by the experimenter. I call it torture because I’ve read too many Army and CIA interrogation manuals and know that total environmental control is often considered a prerequisite for “effective” interrogations.

Skinner used this box to refine his ideas on operant conditioning. But it was his successors, particularly in the wake of the success of Google’s AdWords advertising platform, who really turned it into a horrorshow. Yup, I’m rambling about The Age of Surveillance Capitalism again.

We talk a lot about the cyborgification of humanity in this newsletter: how our tools remake us as we remake them. And what we’ve seen over the last 20 years is the most intense period of cyborgification in history so far.

These themes, oddly enough, are explored most effectively in a pair of video game rock operas. Take an hour and a half and listen to them. I suggest starting with act 2 because it’s less dependent on knowing the source material, a famous action game from the 1980s.

Incidentally, one of the videos of it is the source of maybe the greatest YouTube comment ever: “Jesus loves you and died for your sins. Protoman hates you and died for your sins twice.”

Trust me, it’ll make sense.

So, back to Google and friends. They quickly realized that the more data they had to train their algorithms on, the better their predictions and the more profitable their ads would be.

This created an imperative to collect as much data as possible. Soon they realized that every design element in each of their products created different effects: you could change people’s experience of information by changing its presentation.

Now, this is an old bit of wisdom. But paired with surveillance IT, every app and every website is now its own Skinner box, each a highly motivated structure to funnel anyone who uses it toward certain ends.

A business might point you toward the buy button, Facebook points you toward staying on the page ever longer, and Google points you toward more Google services.

So every new phone sold creates a new mess of engagement with Skinner boxes, and the technology measures responses at increasingly granular levels.

Shoshana Zuboff described all this as an attack on the Will to Will: our ability to even want an end goal enough to decide to decide about it, as increasingly effective behaviorist capture renders our lives into data for optimal advertising.

And it tracks well with the sense of self-disorientation many people report on social media.

Now, while this is kind of inevitable given the current social forces in play, we’ll explore Skinner Box Self-Defense tomorrow.