Social media is destroying your free will

3 Sep | Tom Hodgkinson

Tom Hodgkinson and John Harris quiz the unusual tech reformer at an Idler event and discuss how Silicon Valley became the new priesthood

In 1517, Luther wrote a document called The 95 Theses which argued that the practice of indulgences – or selling access to heaven to poor people for cash sums – was wrong. The story goes that he nailed these theses to the door of the church. Most scholars reckon that didn’t actually happen, but it’s a nice image.

In the same way, tech reformer Jaron Lanier has nailed his Ten Arguments for Deleting Your Social Media Accounts Right Now to the door of Google. Well, he couldn’t possibly nail them on to the door, as it is made of glass. Maybe he could Blu Tack them. But his aim is the same: to get the digital priesthood off their perch and effect a revolution. Let’s just hope the Digital Reformation turns out better than the original Reformation, which led to the banning of merry-making and fun stuff like Christmas in the 17th century.

We met Jaron Lanier at an Idler event at the Marx Memorial Library in Clerkenwell, where we were joined by Guardian journalist John Harris, author of a series of attacks on Amazon, Facebook and other Silicon Valley giants.

TOM HODGKINSON: Let us look at your theses, Jaron. Number one says: You are losing your free will.

JARON LANIER: What I propose is that, at the very least, free will has to involve some degree of creativity and unpredictability in how you respond to the world. That your future might be more than your past. That you might grow, you might change, you might discover. Now, the thing that counters that is when your reactions to the world are locked into a pattern that, by design, is making you more predictable – for the benefit of someone else. This was a technique developed by a branch of science called behaviourism, starting in the 19th century. The famous gurus might be Ivan Pavlov and BF Skinner. And, in that case, instead of attempting to train a horse by whispering in its ear, or to teach a child by holding the child’s hand, you take a very nerdy approach. You log very carefully everything the subject does, whether human or animal, and then you very carefully use the data that you gather about what the subject does to change something about the experience, often with punishments or rewards, like electric shocks or candy. Then you work on algorithms of when you give these changes and experiences, with what timing and so forth, until you find a formula that will cause behaviour change, or behaviour pattern change, in a predictable manner. The scientists, by the way, sometimes had a sort of perverse or cruel bent to them, and achieved celebrity through their arrogance and creepiness. The thing is, in order to find yourself being manipulated in this way, you have to be in a rather special circumstance.

TH: You’d have to be behind a glass screen?

JL: Behind a glass screen – or in a special cult. There have been many examples of near-universal surveillance: the Stasi achieved it in the former East German state, the North Koreans achieve it now. But the coupling of near-total surveillance with these behaviourist feedback techniques has never been achieved before. This is something entirely new in the world and distinct from any previous advertising, or policing, or statecraft. It’s important to understand that it’s not generally dramatic. It is once in a while, when an election is thrown, such as the Trump election. But, generally it’s very slow-moving, a little like climate change, where someone says, “It’s just another storm,” and there is no way to prove that any particular individual has been changed. But the thing is, it has a cumulative effect, sort of like compound interest. The types of effects that are easiest to bring about are to make people cranky and paranoid and irritable, and so, very gradually, the whole society has become more so. So, what I think is a very reasonable thing to say is that, when you come under the influence of a behaviourist regime, you’re losing your free will, because your actions become predictable.

JOHN HARRIS: I was very, very reluctant to go on Twitter. I just couldn’t understand it. It made clever people sound trite at best, and stupid at worst. But then I was told at work to go on it – it was a professional obligation, because under your by-line they were going to run your Twitter handle. So, I did, and I got addicted to it. Just lately I went on holiday for a week and found I’d lost the ability to read books. I mean, I sort of tried. There is now this mountain of books around the so-called “tech lash”, and as you’ve said, Jaron is among the most eloquent, and certainly the most witty. There was one I read, I probably read 9% of it, but it was the right 9%. It was talking about Skinner’s behaviourist psychology, and the notion of variable rewards. Now, one of Skinner’s great insights, if you want to call it that, was that, initially, the rat pulled the lever and got a food pellet, and that made the rat quite keen on pulling the lever. The next experiment was to abandon the correlation between pulling the lever and the food pellets, so they came out randomly. And lo and behold, the rat became even more frenzied about getting the food, because it didn’t know when it was going to get the food pellet, it just pulled the lever like crazy. All of these addictive platforms do the same thing – they all have this central facet of variable rewards. So, when you post on Twitter, you might get six likes and four replies, or nine retweets, or it might go viral and be the greatest thing that’s happened all day. I was getting dopamine rushes, definitely. If something went viral, if I called Donald Trump an idiot, or something similarly Oscar Wildean, it was going off the scale, right?

TH: So, what we’re saying is, the medium is the message, in the sense that these things control your behaviour. Did the inventors of this idea, and more importantly perhaps the investors, know all this at the beginning?

JL: There were many warnings in advance. We could mention EM Forster’s short story from 1909, ‘The Machine Stops’, which approximately warns of this. More technically, we could mention Norbert Wiener’s The Human Use of Human Beings, written in the late 1940s. We could mention many others. I had written a great many things in more modern idiom, starting around ’92, warning about all this stuff, pretty precisely.

I think back to the last century and the rise of fascist parties in Europe, and I feel it’s unlikely they would have achieved the degree of evil they did had they not leveraged what was then the new technology of the time: cinema, mass access to radio and then television. The Nazis pioneered television broadcast. The analogy isn’t perfect, but at that time, the worst people were able to get the most mileage out of the latest media technology, when it was the most novel and the most potent.

TH: The social media companies sold themselves by doing the opposite. They said to people: “You now have the power to broadcast.”

JL: So did the Nazis. The Fascists were these distributed groups, the same as the Soviets. You know, everybody who centralises power in the worst way always sells some form of power-sharing with everybody. There’s no exception to that. That early rhetoric of “We’re all going to share” is one of those things that really bothered me, because it was so similar to the early rhetoric of some of the worst movements, and it really struck me that we might be falling into the same old traps.

A longer version of this piece appears in Idler 62.